7. Abundant materials

This page contains Chapter 7 from
Sustainable Superabundance: A universal transhumanist manifesto for the 2020s and beyond


Note: The text of this chapter of the Manifesto is in draft and is presently undergoing regular revision.

To make comments or suggest changes in the text, please use this shared document.


One key task that lies ahead is the development and refinement of technologies capable of providing everyone with sufficient material goods for a life of sustainable superabundance.

Central to this task is the area of technology known as nanotechnology. Nanotechnology has particularly far-reaching implications – including new methods of manufacturing, new methods of repair, and new methods of recycling. These methods will boost the vitality and resilience, not only of individual humans, but of the material infrastructure within which we all operate. As a result, we’ll all be better protected. We’ll no longer need to worry about shortages, or about materials corroding, warping, or disintegrating. Thanks to nanotechnology, we’ll have plenty for all our needs.

Approaching nanotechnology

Nanotechnology is the deliberate, systematic, mechanical manipulation of matter at the nanoscale, that is, at dimensions of around one to a hundred nanometres. A nanometre (nm) is a billionth of a metre, that is, a millionth of a millimetre. For comparison, a human red blood cell is about 8,000 nm in diameter. A small bacterium is around 200 nm wide, whilst a small virus is around 30 nm. An individual amino acid is just under one nanometre in width, and a water molecule is around a quarter of a nanometre. Accordingly, nanotechnology operates at the scale of individual molecules. In particular, nanotechnology creates and utilises a rich set of nanoscale levers, shafts, conveyor belts, gears, pulleys, motors, and more.
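
To get a feel for these ratios, here is a minimal sketch in plain Python (no special libraries; the sizes are simply the approximate figures quoted above):

```python
# Approximate sizes quoted above, in nanometres
sizes_nm = {
    "red blood cell": 8000,
    "small bacterium": 200,
    "small virus": 30,
    "amino acid": 1,
    "water molecule": 0.25,
}

# How many of each object, laid end to end, would span one red blood cell?
cell = sizes_nm["red blood cell"]
for name, size in sizes_nm.items():
    print(f"{name}: about {cell / size:,.0f} across one red blood cell")
```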

One type of nanotechnology has been taking place inside biological cells for billions of years. In this “natural nanotechnology”, a marvellous dance of chemical reactions reliably assembles many different proteins, molecule by molecule, following codes stored in DNA and RNA. The vision of “synthetic nanotechnology” is that specially designed nanofactories will be able, in a broadly similar way, to utilise atomically precise engineering to construct numerous kinds of new material products, molecule by molecule. But whereas natural nanotechnology involves processes that emerged through blind evolution, synthetic nanotechnology will involve processes intelligently designed by human scientists. These scientists take inspiration from biological templates, but they aim for results that far transcend those of nature.

The revolutionary potential of nanotechnology was popularised by Eric Drexler in his 1986 book “Engines of Creation: The Coming Era of Nanotechnology”. That book fired the imaginations of readers around the world. Since that time, however, progress with many of the ideas Drexler envisioned has proven disappointingly slow.

Transhumanists anticipate that the long period in which progress has been disappointingly slow can soon give way to a period of much swifter accomplishment. However, there is nothing inevitable about such a transition. It is the responsibility of transhumanists to make the case for greater funding for the field, so that the many remarkable potential benefits of nanotechnology will be realised more quickly, accelerating the attainment of the era of sustainable superabundance.

Tools that improve tools

The story of human progress can be expressed as the story of improving tools. Tools magnify our capabilities. The more powerful our tools become, the greater is our ability to reshape our environment – and ourselves.

At the dawn of humanity, our tools were rudimentary. As millennia passed, our tools gradually became more refined, as humanity gained greater prowess in manipulating stones, twine, wood, feathers, fur, bones, leather, and more. These tools helped, not only in hunting, fishing, and farming – and not only in the creation and maintenance of clothing and shelter – but in the production of yet more tools. Better tools made it possible, given time and ingenuity, to create even better tools.

In this way, as the stone age gave way to the bronze age and then to the iron age, basic tools helped to improve the process of mining and smelting new metals, which could in turn be incorporated in the next generation of tools.

The positive feedback cycle of tools creating better tools gathered pace with the industrial revolution, as steam engines amplified and complemented human muscle power. Within a couple of centuries, additional impetus was available from electric motors, factory assembly lines, and computer-based manufacturing. Rudimentary computers played key roles in the design and assembly of next-generation computers. Rudimentary software tools played key roles in the design and assembly of next-generation software tools. And the cycles continued.

In parallel, chemists gradually grew more capable of causing compounds to react, and of synthesising new chemicals. Each new chemical could become part, not just of a new item of clothing, shelter, and so on, but of yet another reaction pathway. New chemicals led to the production of yet more new chemicals.

These positive feedback cycles resulted, not only in tools with greater strength, but in tools with greater precision. Aided first by magnifying glasses, and then by wave after wave of improved microscopes and other imaging appliances, humanity understood the composition of matter on smaller and smaller scales. What’s more, by controlling the environment in ever more ingenious ways, humanity also gained the power to alter matter on smaller and smaller scales – causing molecules to combine together in ways that were not previously possible.

Some thinkers used to suppose that there was a sharp dividing line between the processes of living organisms (organic chemistry) and those of lifeless materials, such as metals and rocks (inorganic chemistry). This “vitalist” dogma was overturned in 1828 when German chemist Friedrich Wöhler demonstrated the creation of the biological compound urea from the inorganic material ammonium cyanate. Further developments led to the biochemical innovations covered in the previous chapter, such as the Haber-Bosch process that revolutionised how crops are fertilised: synthetic fertiliser could replace the fertilisers that had come from biological sources (animal and bird manure).

This chapter concerns the overturning of another dogma – the dogma that atomically precise manufacturing can only take place in biological contexts. Working inside living cells, ribosomes can assemble lengthy chains of amino acids into proteins. The vision of nanotechnology is that nanoscale devices, designed by human ingenuity, can build lots of other products with similar atomic precision. These products can include ultra-efficient solar energy arrays, materials that combine ultra-resilience with extraordinary strength, fabrics that never need to be cleaned, and swarms of nanobots that can roam in the bloodstream to identify and eliminate cancer cells.

Waves and transitions

Powerful technological progress generally needs to pass through many waves. Each wave involves its own ‘S’ curve of performance improvement: an initial slow period can tip over into a faster period, before slowing down again. Overall progress depends, not only on harvesting the potential of individual waves, but on managing the disruptive transitions between successive waves.
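
The ‘S’ curve described here is well captured by the standard logistic function. The following sketch is purely illustrative, with made-up midpoint and steepness values, and simply shows how yearly gains start small, peak mid-way through a wave, and then tail off:

```python
import math

def s_curve(year, midpoint=2030, steepness=0.4, ceiling=100.0):
    """Logistic 'S' curve: performance rises slowly, then quickly, then plateaus.
    The parameter values here are purely illustrative."""
    return ceiling / (1 + math.exp(-steepness * (year - midpoint)))

previous = None
for year in range(2020, 2041, 2):
    level = s_curve(year)
    gain = "" if previous is None else f"  (+{level - previous:.1f} since last point)"
    print(f"{year}: {level:5.1f}{gain}")
    previous = level
```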

For a number of reasons, these disruptive transitions are often delayed, or even flunked altogether. This can happen because of lack of financial investment, an over-focus on short-term monetary returns, philosophical opposition that insists “that’s not how we do things around here”, concerns over risks of unintended consequences of a new technological platform, and social or political opposition from vested interests who perceive themselves as doing well from the status quo. All these factors have constricted progress towards the full potential of nanotechnology. It is now time to bring these matters to wider public attention, so that progress can pick up pace.

One thing the public needs to understand better is the set of positive steps that have already taken place – the various components that are being put into place as a prelude to full nanotechnology.

Fabricating integrated circuits

The most dramatic progress has been in the nanoscale assembly, not of dynamic structures, but of static arrays of transistors, formed into ever more powerful integrated circuits. The products in this case involve the processing and manipulation, not of macromolecules, but of electrical or magnetic signals. The products are memory chips, sensors, actuators, and processing units, whose improvement has tended to follow roughly exponential curves since the very first silicon integrated circuit in 1959.

Numerous improvements in the underlying architecture of these circuits have seen the characteristic dimension of individual transistor elements plummet from 10,000 nm in 1971, through 1,500 nm in 1985, 250 nm in 1998, and 32 nm in 2012, down to 10 nm in 2018. As a result, the cost to store information, or transmit it, or compute with it, has shrunk and shrunk. Information, which used to be very costly in many cases, is increasingly available for free.
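
Taking the first and last figures above at face value, a few lines of arithmetic show the kind of exponential trend they imply. This is a rough estimate only, since process-node names are only loosely tied to physical dimensions:

```python
import math

# Characteristic transistor dimensions quoted above
start_year, start_nm = 1971, 10_000
end_year, end_nm = 2018, 10

years = end_year - start_year                 # 47 years
shrink_factor = start_nm / end_nm             # a 1000x linear shrink
halvings = math.log2(shrink_factor)           # roughly 10 halvings of feature size
print(f"Feature size halved roughly every {years / halvings:.1f} years")

# Transistor *density* scales with the inverse square of feature size
print(f"Density grew by a factor of about {shrink_factor ** 2:,.0f}")
```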

A full set of 32 volumes of the Encyclopaedia Britannica, comprising 32,640 pages, used to sell for $1,400. Nowadays, a much larger set of knowledge is available, for no access charge whatsoever (beyond the network connection charge), via the online Wikipedia repository. And whereas it could take years for new information to make its way into a subsequent edition of Britannica – to update an article when someone died, or a record was broken, etc – new information often appears in a Wikipedia article within minutes of the event happening.

Similarly, modern smartphones can access up-to-date maps of vast networks of roads, traffic conditions, and public transportation schedules, and can calculate (again, free of charge) the optimal route to travel from A to B. The same smartphones allow consumers to listen on demand to huge libraries of music, from every artist under the sun, for a modest monthly fee.

This growing tendency towards information being free sets the template for what can be expected from the full realisation of nanotechnology. We can look forward to more and more material goods being essentially free of charge.

3D and 4D printing

For another important set of steps towards full nanotechnology, consider progress with 3D printing, also known as “additive manufacturing”.

3D printing is the programmable building up of three-dimensional objects, layer by layer. Different 3D printing techniques operate with metals, liquids, powders, and biological materials. Rather than needing physical adjustment for each new design it processes, a 3D printer is reconfigured by software. This makes it possible for a variety of experiments to be carried out more quickly, and for designs to be copied, edited, and shared more easily. As a result, innovation speeds up, and alternative design ideas can be refined and adopted sooner.

3D printing has already been involved in the fabrication of shoes, clothing, dental implants, bone implants, electronic circuitry, microbatteries, food, medicines, components of motor cars, components used in building houses, and replacements for malfunctioning biological organs. 3D printing has also been used in the creation of weapons, and in the construction of additional 3D printers.

Some of the objects produced by 3D printing are designed to subsequently change in shape or form, dependent on features of the environment, such as temperature, pressure, and humidity, as well as electric or magnetic fields. Since these objects transform over time, in preprogrammed ways, this variant of 3D printing has been called “4D printing” (time being the fourth dimension), and the resulting objects are said to be “programmable matter”.

There are many different types of 3D printing technologies, with names such as Continuous Liquid Interface Production, Digital Light Processing, Direct Metal Laser Sintering, Electron Beam Melting, Fused Deposition Modeling, Metal Binder Jetting, Metal Powder Bed Fusion, Sand Binder Jetting, and Stereolithography. Most of these processes operate with resolutions far larger than the nanoscale. However, a technique known as Two-Photon Lithography can achieve a resolution of 65 nm.

Two-Photon Lithography involves laser pulses with duration measured in femtoseconds (there are one million femtoseconds in a nanosecond). During a femtosecond, a photon of light travels just 300 nanometres. Combinations of femtosecond laser pulses enable near-atomic-scale precision. This technique has been used in the creation of so-called nanosculptures. Photographs taken with an electron microscope show these sculptures as markedly smaller than the width of a human hair or the eye of a needle. Individual parts of a sculpture, such as the finger of a model’s hand, are around 1,000 nm wide. These parts have been built up by the laser pulses working at the nanoscale.
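
The figure of 300 nanometres per femtosecond follows directly from the speed of light, as this one-line check confirms:

```python
speed_of_light_m_per_s = 2.998e8      # metres per second
femtosecond_s = 1e-15                 # one femtosecond, in seconds
distance_nm = speed_of_light_m_per_s * femtosecond_s * 1e9  # metres -> nanometres
print(f"Light travels about {distance_nm:.0f} nm in one femtosecond")  # ~300 nm
```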

Whilst still some way removed from the concept of nanofactories, the various techniques of 3D and 4D printing point the way forwards to what can be achieved in the future.

New materials

Science can enable the manufacturing of new materials with properties that exceed those of the materials that are found naturally. This includes “nanomaterials”, where at least one dimension of the material is 100 nm or less.

A striking example of a nanomaterial is graphene, which is a single plane of carbon atoms in a hexagonal pattern. Andre Geim and Konstantin Novoselov, two Russian-born physicists working at the University of Manchester in England, won the 2010 Nobel Prize in physics for their “groundbreaking experiments” with graphene. It was one of the most prominent Nobel Prizes yet awarded for research at the nanoscale.

Graphene is just 0.3 nm thick. Despite this extreme thinness, graphene is believed to be the strongest material known so far, being some 300 times stronger than steel. A special combination of two layers of graphene, known as diamene, is tougher than diamond, and can even stop a bullet. In view of several other remarkable properties, graphene has potential applications such as shatterproof smartphone screens, lower-cost solar energy cells, faster-recharging lithium-ion batteries, ultracapacitors that could replace batteries altogether in some circumstances, improved water desalination filters, and the storage of hydrogen for vehicles powered by hydrogen fuel.

Nanomaterials can be used, not only as thin films or surfaces, but also as wires (“nanowires”) and tubes (“nanotubes”), in addition to other structures. A nanomaterial surface can be provided as a protective layer around fabrics, metals, wood, glass, and so on – in which case it is known as a “nanoshell”.

Whereas nanoshells have one nanoscale dimension, and nanowires have two nanoscale dimensions, an object with three nanoscale dimensions is often simply called a nanoparticle. This term is applied when the object in question has been designed or selected to carry out a specific task, by virtue of its shape and structure. Nanoparticles of titanium dioxide are used in self-cleaning windows, and in sunscreens (where their transparency makes them cosmetically appealing). Silver nanoparticles are used in anti-microbial coatings, as are particles of zinc oxide. More complicated nanoparticles can play vital roles in medicine.

Research into novel nanomaterials remains at a relatively early phase. The present time can be compared to the decades just after synthetic plastics were first invented. Since these pioneering discoveries, plastics of numerous sorts have been put to all kinds of unexpected uses – with both good and bad effects. It’s likely to be the same with nanomaterials. Adopting ideas from the world of 3D printing may turn out to be particularly fruitful. Greater use of artificial intelligence in the design of nanomaterials is likely to accelerate innovation further – as will breakthroughs in quantum computing.

Quantum computing

Of all the advances in technology at the nanoscale, those with the most dramatic consequences may be in quantum computing.

Classical computers rely on there being a clear distinction at all times between a ‘1’ and a ‘0’. Whether a low-level piece of data is recorded as a DC voltage, a miniature electric current, or the orientation of a microscopic magnet, there is no “in between” state. Whenever a piece of data changes from one value to another, it does so in distinct steps. That’s the meaning of “digital” in the phrase “digital technology”. Out of vast numbers of individual ‘0’s and ‘1’s, magnificent data tapestries can arise – images, sounds, sculptures, videos, and more. To the human observer, these tapestries can be full of nuance and subtlety – rich in curve and contour. However, these tapestries are constructed from elementary binary digits, that is, from “bits”.

In contrast, quantum computers involve qubits, also known as “quantum bits”, which defy simple description. Colloquially, it is often said that qubits can take any value in between 0 and 1. A more accurate description is that a qubit is a combination of ‘0’ and ‘1’, written as α|0⟩ + β|1⟩, where the Greek letters α and β are so-called complex numbers that are measured, not on a single one-dimensional real number line, but on a two-dimensional complex plane. The squared magnitudes of α and β add up to one, and give the probabilities of observing a ‘0’ or a ‘1’ when the qubit is measured.
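
As a concrete illustration, a qubit can be modelled on a classical computer as a pair of complex amplitudes whose squared magnitudes give the measurement probabilities. This is a toy sketch using NumPy, not how real quantum hardware is programmed, and the particular values of α and β are arbitrary:

```python
import numpy as np

# A qubit alpha|0> + beta|1>, with |alpha|^2 + |beta|^2 = 1
alpha = 0.6 + 0.0j
beta = 0.0 + 0.8j             # complex amplitudes live on the complex plane
qubit = np.array([alpha, beta])

probabilities = np.abs(qubit) ** 2
print(probabilities)                       # [0.36, 0.64]: chances of measuring 0 or 1
print(np.round(probabilities.sum(), 6))    # 1.0: the amplitudes are normalised
```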

The operations that classical computers carry out on classical bits of information – operations such as addition, negation, logical ‘and’, and logical ‘or’ – have their equivalents in the operations (known as quantum gates) that quantum computers carry out on qubits. Qubits have the additional property that two or more of them can be “entangled”. What makes all this important is that quantum computers can perform some kinds of calculations much more efficiently than classical computers.
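
Continuing the same toy state-vector picture, quantum gates are just matrices acting on these amplitude vectors, and a two-qubit gate such as CNOT can entangle a pair of qubits:

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)                      # the state |0>
X = np.array([[0, 1], [1, 0]], dtype=complex)                # quantum NOT gate
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard: creates superpositions
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)               # flips qubit 2 if qubit 1 is |1>

print(X @ zero)                          # NOT turns |0> into |1>

pair = np.kron(H @ zero, zero)           # qubit 1 in superposition, qubit 2 still |0>
bell = CNOT @ pair                       # (|00> + |11>) / sqrt(2): an entangled pair
print(np.round(np.abs(bell) ** 2, 2))    # [0.5, 0, 0, 0.5]: outcomes 00 and 11 only
```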

An example where a quantum computer can out-perform any classical computer is the “prime factorisation” problem of finding two prime numbers which multiply together to give a specified result. Thus, given 56153, the problem is to find two prime numbers p and q whose product equals 56153. (The answer is 233 and 241.) Modern security systems depend upon this problem requiring huge amounts of time and computing power to solve for large products (far exceeding the example of 56153), since the prime numbers involved are the keys to decrypting confidential information. Today’s quantum computers, which have fewer than 100 qubits each, cannot yet crack this particular problem for prime numbers of the size used in existing public key cryptography. But as quantum computers grow in scale, their calculating prowess is expected to increase much faster than is possible for classical computers. It seems, therefore, to be only a matter of time before huge amounts of information currently believed to be secret will become transparent to anyone with access to a suitable quantum computer.
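
The 56153 example can be verified with a few lines of classical trial division. The point made in the text is that this brute-force approach becomes hopeless once the prime factors are hundreds of digits long:

```python
def factorise(n):
    """Find the smallest prime factor of n by trial division, then its cofactor."""
    p = 2
    while p * p <= n:
        if n % p == 0:
            return p, n // p
        p += 1
    return n, 1  # n itself is prime

print(factorise(56153))   # (233, 241)
```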

The search for the two prime factors is one instance of a broader class of problems in which a vast set of candidate answers needs to be searched. For the hardest variant, known as “unstructured search”, there are no hints as to whereabouts in that set to start looking, nor any feedback from near misses. Using a method known as Grover’s Algorithm, quantum computers can outperform classical computers in such situations: by repeatedly amplifying the amplitude assigned to the correct answer, a quantum computer can locate one marked item among N candidates in roughly the square root of N steps, whereas a classical computer must check on the order of N candidates.
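
For readers who want to see the quadratic speedup in action, here is a minimal state-vector simulation of Grover’s Algorithm, written in plain NumPy and purely illustrative (the choice of 10 qubits and the position of the marked item are arbitrary). It finds one marked item among 1,024 candidates after about 25 rounds of amplitude amplification, whereas a classical search would need around 512 checks on average:

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Toy simulation of Grover's Algorithm on a classical computer."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))           # uniform superposition over all N items
    rounds = int(round(np.pi / 4 * np.sqrt(N)))  # ~(pi/4) * sqrt(N) iterations suffice
    for _ in range(rounds):
        state[marked] *= -1                      # oracle: flip the sign of the marked item
        state = 2 * state.mean() - state         # diffusion: reflect amplitudes about their mean
    return int(np.argmax(state ** 2)), rounds

found, rounds = grover_search(10, marked=700)
print(found, rounds)   # 700 25: the marked item, found after ~sqrt(1024) rounds
```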

Quantum computers will also enable an acceleration in the identification of new chemicals with desired properties. These properties can be explored in advance using “quantum simulation” processes, without needing to individually synthesise and manipulate each of these chemicals. Once suitable chemicals have been identified via this simulation process, they can be utilised in new drugs, new foods, and new materials.

There’s evidently a positive feedback loop between quantum computing and the other sectors of nanotechnology. Better nanotechnology will provide better ways of constructing quantum computers. In turn, better quantum computers will assist with the identification and configuration of new nanoparticles and other nanomaterials. An improvement curve that is initially slow and disappointing could morph relatively suddenly into a much faster phase of mutual benefit.

More investment is being applied to the field of quantum computing now than ever before. This reflects the perception that the date of “quantum supremacy” is fast approaching: the point when a quantum computer will demonstrably perform a calculation that lies beyond the practical reach of even the largest classical supercomputer.

Something that increases the likelihood of such a breakthrough is the fact that multiple different kinds of quantum computer are being explored at various research labs. Just as it was unclear in the late 1950s whether germanium or silicon would prove to be the most useful semiconductor, it is still unclear which of many competing quantum computing designs will prove to be the most useful. Within just a few short years, matters are likely to become more definite, with the quantum computing equivalent of “Silicon Valley” – wherever that will be – acting as the hub for rapid new innovation in multiple industries.

Web software pioneer Marc Andreessen is renowned for his perceptive saying that “software is eating the world” – meaning that, in every industry, lack of up-to-date knowledge in leading software techniques is a recipe for competitive failure. Before long, the new saying may be that “quantum software is eating the world”.

Nanomedicine

Medicine is a field that can benefit greatly from nanoscale engineering.

Consider the set of viruses which, in previous eras, caused numerous dreadful diseases. With careful alterations, these viruses can now be used to attack, not the body as a whole, but cancerous growths. For example, it is possible to alter the polio virus, removing a genetic sequence so that the virus can no longer reproduce in normal cells. However, that virus still reproduces inside cancerous growths, where it weakens the cancer to the point that the patient’s immune system is spurred to recognise the danger and to finish the task of defeating the tumour. This technique has achieved remarkable results in early trials for the treatment of glioblastoma, a particularly nasty form of brain cancer. It’s a striking vindication of nanoscale engineering.

A different use of specially re-engineered DNA is as a packaging vehicle for a medical payload, such as a drug that needs to be delivered to a specific target location. In this case, the technique has been called “DNA origami”, since the DNA sequence is chosen so that the molecule folds (upon heating and cooling) in a particular way. To be clear, in this technique, the chosen string of DNA serves a purpose only through its three-dimensional structure – not via any biological interactions (such as creating proteins). The eventual biological interaction is due to the payload which has been carried along inside the DNA packaging.

This example is part of a growing move towards more extensive use of “nanomachines” and “nanosurgery” in medicine. This includes rotor mechanisms and hinged molecular manipulators, both formed from interlocking components of DNA. Some nanomachines can be steered via the application of small rotating magnetic fields. Others will use the same techniques that biological systems already deploy when immune cells hunt down their targets.

A clear sign of progress with nanomachines was the award of the Nobel Prize for Chemistry in 2016. This prize was jointly received by Fraser Stoddart from Scotland, Bernard Feringa from the Netherlands, and Jean-Pierre Sauvage from France, in recognition of their pioneering work in this field – such as finding ways to convert chemical energy into purposeful mechanical motion.

As the Nobel committee remarked, nanomachines in 2016 are at a roughly similar stage to electric motors of the 1830s: the basic principles of the manufacture and operation of these machines are just becoming clear. The scientists of the 1830s who demonstrated a variety of spinning cranks and wheels, powered by electricity, could hardly have foreseen the subsequent wide incorporation of improved motors in consumer goods such as food processors, air conditioning fans, and washing machines. Likewise, as nanomachines gain more utility, they can be expected to revolutionise manufacturing, healthcare, and the treatment of waste.

Six answers to scarcity

As the earth’s population grows, how will more and more people be able to access items that contain materials that are in short supply, such as rare earth elements?

Transhumanists can point to six answers to this question.

First, where there is a genuine scarcity, items should be shared, rather than restricted to just a few owners. In this way, transhumanists support the growth of the circular economy, and the associated changes in mindset.

Second, improvements in recycling processes – including use of nanotechnology – will be able to extract rare materials from older products, enabling higher amounts of re-use in newer products.

Third, alternative designs can be devised – often taking advantage of insights from artificial intelligence – that allow readily available materials to be used in place of rarer ones. In many cases, innovative new nanomaterials might serve as better alternatives to the components presently used.

Fourth, as a consequence of better design and better manufacturing, material goods will become more robust, with self-cleaning and self-healing properties. This will extend their lifetimes, and reduce the need for rapid turnover of new products.

Fifth, the asteroid belt, mainly lying between the orbits of Mars and Jupiter, is thought to hold huge quantities of all sorts of elements. It will require a major project to mine these asteroids and transfer minerals back to the earth. However, by taking advantage of abundant solar energy, and both spacecraft and mining equipment operated via automation, the project could make good economic sense.

Sixth, the relative importance of material goods will in any case decline, as people come to spend greater amounts of their time in inner, virtual worlds.

It remains to be seen which of these six answers will turn out to be most important in practice. What is clear is that there are many options to be explored.

Risks posed by nanotechnology

The comparison between nanotechnology and plastics may raise alarm bells. The extent of the damage caused to natural ecosystems by plastic waste has only recently become apparent. Might widespread use of nanomaterials cause similar adverse effects?

The problem may be compounded by the tiny size of nanoparticles. Consider the damage that has been caused by asbestos particles, whose dimensions are much larger than those of nanoparticles. It is conceivable that nanoparticles could penetrate deep into biological organisms and then provoke harmful reactions.

The problem may in principle be further compounded if nanomaterials contain their own rudimentary intelligence, in which case they are sometimes known as nanobots. Nanobots may be designed with one purpose in mind, but could inadvertently behave quite differently in a changed environment.

One more comparison to consider is with the potential adverse effects of GMOs, as discussed in the previous chapter. GMOs are the outcome of a sort of nanotechnology, namely genetic engineering. Just as GMOs need careful testing (ahead of release) and monitoring (after release), the same applies to the other products of nanotechnology. In both cases, moreover, reviewers should be raising and evaluating potential failure modes ahead of time.

Some of the risk scenarios appear to be far-fetched. This includes the notorious “grey goo” scenario, of all-consuming swarms of nanomaterials, as featured in the novel “Prey” by Michael Crichton. In this scenario, self-replicating robots transform the entire planetary biomass into raw material for more copies of themselves. However, the mere fact that a scenario appears far-fetched is insufficient reason to stop thinking about it. A scenario which is infeasible in one format could become more credible if various changes are made to it. Disaster scenarios tend to involve a combination of several trends, and more than one unanticipated outcome.

Accordingly, transhumanists emphasise the need for proactive consideration of potential downsides of innovative technology. Nanotechnology fits this same pattern. Regular reviews should take place, with the highest priority.

The evaluation of downsides needs to happen alongside evaluation of potential upsides. At present, there seems to be no special reason to call for any slowdown in developing nanotechnologies. On the contrary, there is reason to call for an acceleration. In particular, it is time to campaign for far-sighted financial support for the more radical of the possibilities of nanotechnology, including the creation of programmable nanofactories.

Beyond the profit motive

Industrial companies – whether small startups or giant corporations – can be either a friend or a foe of humanity’s quest to attain the full potential of nanotechnology.

These companies frequently employ the engineers who are seeking to convert the possibilities of science into real-world products. But like the companies involved in biochemical innovation mentioned in the previous chapter, they are guided by questions of short-term revenue as well as by longer-term research. They are subject to pressures which can lead them to overstate the capabilities of their current products, and to downplay the risks from those products. As a result, they can divert critical funding away from products with more profound longer-term potential, into incremental developments of lesser real significance.

Just because a group of engineers, scientists, and entrepreneurs uses the word “nanotechnology” to describe its activities, it does not mean that the group is making positive contributions to the eventual creation of programmable nanofactories. Far from it.

Transhumanists need to be clear in affirming the full positive potential of science. Encouragement needs to be offered, not only for the important incremental developments that are underway, but also for work to enable the disruptive leaps to future generations of technological possibility. From the perspective of short-term financial profits, such work may appear misguided. But in order that humanity can advance towards sustainable superabundance, such work needs wide attention and support.

In short, it is time to speak up for nanotechnology – to clarify the extraordinary potential that twenty-first century science has placed within our grasp, and to map out routes forward that take full account of both the risks and the opportunities ahead.

Developed wisely, nanotechnology will have far-reaching consequences in many areas of human life, including (as mentioned above) in the area of human healthcare. The next chapter takes a broader look at both the opportunities and issues for radically healthier humans.


