Ten essential observations

The Singularity Principles arise from a number of general observations. These observations make it essential that we improve our collective abilities to anticipate and manage disruptive changes, before an acceleration and intensification of such changes undermine the conditions for human flourishing.

Tech breakthroughs are unpredictable (both timing and impact)

History teaches us that breakthroughs in technological capability may be dramatic and unexpected. That’s both in terms of timing (such as an explosive breakthrough following a long period of disappointingly slow progress) and in terms of impact (with some breakthroughs being more widely applicable than was previously anticipated).

Some critics assert that, on the contrary, we can be sure that technological capability is reaching a plateau. They claim that all the low-hanging fruit have been picked.

In response to this objection, observe that:

  • There is no fixed scientific barrier that would prevent further improvements in, for example, nanotech, biotech, infotech, or cognotech. As the Nobel physics laureate Richard Feynman famously put it, “There’s plenty of room at the bottom”
  • More engineers and entrepreneurs than ever before have been trained in a rich variety of methods to develop new technologies, including methods both for incremental improvement and for the creation of disruptive new platforms
  • The tools and resources available to help develop new technology have unprecedented capability.

Potential complex interactions make prediction even harder

Surprise interactions between multiple developments in different fields can make outcomes even harder to predict.

These parallel changes include apparently unrelated technological transitions, disruptive breakthroughs as well as incremental progress, and changes in tooling, in prevailing open standards, and in the quality of training data.

These overlapping changes also include updates in legal systems, popular culture, and general philosophical zeitgeist. That takes us to the next observation.

Changes in human attributes complicate tech changes

In anticipating future scenarios, it’s a mistake to treat human institutions, human attitudes, and human intentions as fixed and unchangeable. Changes in all these aspects of the human outlook can be part of a complex network of responses to technological risks and opportunities.

As the renowned management consultant Peter Drucker observed, “the major questions regarding technology are not technical but human questions”.

For example, education and training can smooth the path of swift transition and safe adoption. Carefully targeted government subsidies can play a role too. The storylines in popular Netflix dramas can trigger positive changes in attitude in the general public toward products featuring helpful new capabilities. And so on.

On the other hand, changes in human institutions, attitudes, and intentions can also make matters worse. Unhelpful new trade barriers can hinder the adoption of safer technologies. Clumsy changes in regulations can lead to the faster spread of dangerous technologies. Public sentiment can be transformed by fast-spreading misinformation. And there can be an escalating copycat response to publicity stunts from media stars that irresponsibly endorse – or oppose – specific new products.

Accordingly, forecasts of the impacts of technologies – whether beneficial or destructive – will likely be misleading unless they consider the two-way interactions between human attributes and technological change.

Greater tech power enables more devastating results

The more powerful technology becomes, the more devastating are the results it can produce – including devastatingly good results and devastatingly bad results.

Some examples:

  • Small fireworks with errant trajectories can, in some cases, ignite a conflagration that causes widespread damage. But larger explosives, such as nuclear bombs, can destroy an entire city in the blink of an eye.
  • Documents individually hand-copied by scribes spread ideas gradually for centuries, but Gutenberg’s printing presses placed books and pamphlets into many more hands, accelerating the Renaissance and the Reformation, and triggering decades of turmoil all over Europe.
  • The flow of disinformation has caused problems throughout human history, but with modern online social networks spanning billions of users, disinformation nowadays travels at the speed of light, and has helped incite near-genocidal violence.

Different perspectives assess “good” vs. “bad” differently

Results that are evaluated as “good” from one perspective, such as an increase in profits or in market share, or breaking records for speed or performance, can also be evaluated as “bad” from other perspectives – for example when externalities are included in calculations, or when a broader view of human flourishing is considered.

Competition can be hazardous as well as beneficial

Although a competitive marketplace can often accelerate positive progress, with companies racing to discover and apply useful new innovations, such competition can also result in dangerous corner-cutting or other reckless risk-taking. Hostile arms races are particularly hazardous.

Indeed, a strong competitive environment makes forecasting the future all the more difficult:

  • If two or more competitors perceive that a decisive gain will be attained by the first group to develop some new technology, they will work harder to win that race
  • If two or more competitors perceive that other groups are striving hard to gain a key advantage, they will be inclined to redouble their own efforts
  • In a fiercely competitive environment, groups will be inclined to keep some of their interim progress a secret, and also to spread FUD (fear, uncertainty, and doubt) to distract observers from a clear understanding of what is actually happening.

Some tech failures would be too drastic to allow recovery

Although there are many technological failures from which it’s possible to recover, with people (hopefully) growing wiser as a result, some technological failures may have such a vast scale that subsequent recovery would be extremely hard or even impossible. In these cases, there’s no hope for “failing forward” or “failing smart”. The only option in such cases is to avoid failures in the first place.

The more powerful the underlying technology, the more attention needs to be paid to such possibilities.

A history of good results is no guarantee of future success

The mere fact that a piece of technology has delivered a string of good results in the past does not guarantee, by itself, that the technology in question will deliver good results in altered circumstances in the future. Previous methods may fail, for unexpected reasons, when parts of the overall system are different from in the past.

The management of technological change therefore needs to rely on more than what philosophers call “induction”, that is, the assumption (implicit or explicit) that the future will continue to resemble the past.

It’s insufficient to rely on good intentions

Nor is it sufficient to rely on the perceived good intentions of individuals or companies – that is, on their intention to avoid any very bad outcomes. Alas, when good intentions are coupled with a mistaken or incomplete understanding of an issue, they can result in the very sort of bad outcomes they were meant to avoid. Moreover, good intentions can sometimes lose their strength, becoming submerged under other, more powerful forces.

Wishful thinking breeds blindness to problems

Due to wishful thinking, providers of potential new technological solutions are often inclined to turn a blind eye to problematic features that may arise. If they hear reports of adverse side-effects or possible unintended consequences, they are motivated to disbelieve these reports, or to distort them or throw doubt on them.

They won’t just seek to deceive the general public. They’ll even seek to deceive themselves, in order to appear all the more convincing when they issue their denials.

American writer Upton Sinclair said it well in 1935: “It is difficult to get a man to understand something, when his salary depends upon his not understanding it!”

That’s another reason why greater vigilance is needed, and why openness and transparency should be rewarded.

Taken together, the above ten observations underscore the need to go beyond mere wishful thinking.

Next, let’s explore more concretely the set of risks and benefits that may arise from fast-changing technologies.


Note: A video from the Vital Syllabus provides a visual illustration of these ten essential observations. (The description of these observations has evolved since the video was originally recorded.)
