The Singularity is Nonsense


It's come to my attention that some futurists are still referring to future rapid developments in technology as a "singularity" or a "technological singularity".

History

Use of the term "the singularity" in this context appears to have originated with the science fiction writer Vernor Vinge in 1993.

See his 1993 essay, The Coming Technological Singularity, for details.

Vinge describes the singularity as follows:

It is a point where our old models must be discarded and a new reality rules.

- http://www.ugcs.caltech.edu/~phoenix/vinge/vinge-sing.html

Problems

The problem with using the term "the singularity" is that the phenomenon in question doesn't look very singular. We are certainly in the middle of a period of exponential growth - and the rate of progress shows little sign of ceasing to increase.

However, exponential curves simply don't have "singular" points on them. Look at an exponential curve at any point, and it appears self-similar - no part of it looks much different from any other.
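To see why, note that shifting an exponential in time merely rescales it. A minimal sketch of the argument, in LaTeX notation (the symbols f, k and c are introduced here purely for illustration):

  f(t) = e^{kt}, \qquad f(t + c) = e^{kc} \, f(t), \qquad f'(t) = k \, f(t)

A time-shift by c multiplies the whole curve by the constant e^{kc}, and the slope at every point is proportional to the value there - so the curve is finite and smooth everywhere, with no distinguished "singular" point.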

The idea of a singularity seems to suggest a sort of super-exponential growth.

This diagram - from Ray Kurzweil - illustrates the idea:


http://www.kurzweilai.net/articles/images/chart01.jpg

...or at least it would - if it were not purporting to measure the "mass use of inventions" in millimeters.

The suggestion seems to be that growth will get faster and faster - asymptotically approaching infinity at some particular future point in time.

If that were ever to happen, the term "singularity" would certainly be quite appropriate.
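For contrast, finite-time blow-up is a genuine mathematical possibility - but it requires hyperbolic, not exponential, growth. A standard textbook example (my own illustration, not something the futurists in question offer):

  \frac{dx}{dt} = x^2 \quad \Rightarrow \quad x(t) = \frac{1}{t_s - t}

Here x(t) really does diverge to infinity as t approaches the finite time t_s. An exponential never does this - which is exactly why attaching the word "singularity" to exponential growth is a misnomer.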

However, the idea is a ridiculous one. On closer examination, practically no futurists actually support it.

Instead they typically claim that the term "singularity" has different inspirations:

In futures studies, a technological singularity represents a hypothetical "event horizon" in the predictability of human technological development. Past this event horizon, following the creation of strong artificial intelligence or the amplification of human intelligence, existing models of the future cease to give reliable or accurate answers.
[...]
Vinge's singularity is commonly misunderstood to mean technological progress will rise to infinity, as happens in a mathematical singularity. Actually, the term was chosen as a metaphor from physics rather than mathematics: as one approaches the Singularity, models of the future become less reliable, just as conventional models of physics break down as one approaches a gravitational singularity.

- http://en.wikipedia.org/wiki/Technological_singularity

[...] just as our model of physics breaks down when it tries to model the singularity at the center of a black hole, our model of the world breaks down when it tries to model a future that contains entities smarter than human.

- http://www.singinst.org/overview/whatisthesingularity

I note that these descriptions can't seem to make up their minds about whether they are talking about an event horizon - which cannot be seen beyond - or a singularity, where things break down.

Unfortunately, Vinge never defined his terminology - resulting in multiple interpretations.

Nick Bostrom has noted:

"The singularity" has been taken to mean different things by different authors, and sometimes by the same author on different occasions. There are at least three clearly distinct theoretical entities that might be refered to by this term:
  • A point in time at which the speed of technological development becomes extremely great. (Verticality)
  • The creation of superhuman artificial intelligence. (Superintelligence)
  • A point in time beyond which we can predict nothing, except maybe what we can deduce directly from physics. (Unpredictability, aka "prediction horizon")

- Nick Bostrom

Using the term "singularity" looks to me like an appalling mistake, however you look at it. The connotations - of either something becoming infinite, or of something happening only once - are far too strong. The term immediately conjures up an inaccurate and misleading impression.

As for the claims that the ability to predict the future is limited - that limitation is caused by a well-known phenomenon: chaos. Small uncertainties in initial conditions are magnified, as time passes, into large uncertainties in the outcome.

The phenomenon applies on a large range of timescales: some things are unpredictable over a few seconds - others are highly predictable over billions of years.
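The conventional way to quantify this - using symbols of my own choosing, not taken from the sources above - is the Lyapunov exponent \lambda: an initial measurement error \delta_0 grows roughly exponentially, so the prediction horizon improves only logarithmically with measurement precision:

  \delta(t) \approx \delta_0 \, e^{\lambda t}, \qquad t_{\mathrm{horizon}} \approx \frac{1}{\lambda} \ln \frac{\Delta}{\delta_0}

where \Delta is the largest error one is prepared to tolerate. Systems with a large \lambda (turbulent weather) become unpredictable within days; systems with a tiny \lambda (planetary orbits) remain predictable over enormous spans of time. The horizon is a property of each system - not a single date in the future.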

The breakdown of prediction does not happen at a particular point in the future - rather different phenomena are predictable on different timescales.

Our ability to predict the future of human evolution in much detail may well be limited - as predicted events become increasingly uncertain the further into the future projections are made.

However, that has always been the case - and no doubt it will always be the case. There will always be difficulties in looking very far into the future - since some elements of what will happen are contingent on chance events.

The shape of things to come

Exponential growth looks set to continue for some time to come, maybe for a very long time - in which case the future is likely to contain fantastic marvels - just as many futurists claim.

There may well be noteworthy points along the path. The origin of life was noteworthy. The invention of photosynthesis was a similar landmark. There will come a point when the dominant lifeform won't be able to interbreed with 21st century human beings - or when future organisms will appear incomprehensible to those alive today.

However, nobody is ever going to look back and say that the technological singularity occurred on such-and-such a date. The idea is a pretty ridiculous one.

It's very likely in the future that - as Kurzweil says - "the pace of technological change will be so rapid [...] that human life will be irreversibly transformed". However, that's not much of a prediction: it's already happened. Most likely, it's going to happen again, and is probably going to keep happening for quite a while. That's not a "singularity" - that's growth - and I don't see anything very "singular" about it.

Takeoff

What about the idea that a point will be reached when machines exceed human intelligence that causes technological development to suddenly "take off" and accelerate rapidly - as the smart machines design ever smarter machines, in an iterative cycle?

There is no such thing as a single "human intelligence": humans exhibit a wide range of scores on intelligence tests - so machines will consequently overtake humans gradually.

Furthermore, machine intelligence is not qualitatively the same as human intelligence. Machines are good at a different set of things from humans. They already greatly exceed human capabilities in some areas - while greatly lagging behind in others. This effect further blurs the point where machine intelligence surpasses that of humans.

Lastly, machines are already heavily involved in the design of other machines.

This point does not seem to be widely understood - so there's a whole separate essay about it, entitled: The Intelligence Explosion Is Happening Now.

The serious doubt over the hypothesis that machine intelligence will suddenly "take off" at some point in the future suggests it may be better to find a less controversial name for the coming events surrounding the emergence of superintelligence.

Popularity

In Radical Evolution, Joel Garreau writes:

Today, all serious discussions regarding the social impact of the coming decades of the Curve start with Vinge's notion of the singularity.

This is not true - but it gives some indication of the extent to which lay people have been brainwashed by use of the "singularity" terminology. It is certainly true that many people use this terminology unthinkingly, because, well, that's what it's called, isn't it?

Replacement

Since this essay argues that the "singularity" terminology is an embarrassment, what should replace it?

I. J. Good's original terminology is far superior.

In 1965, Good argued that machines surpassing human intellect should be capable of recursively augmenting their own mental abilities until they vastly exceeded those of their creators. He referred to the phenomenon as an "intelligence explosion":

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind [...]. Thus the first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control.

- I. J. Good, 1965.

However, as I argue in my essay on the intelligence explosion - this explosion is really just a small part of the existing ongoing technology explosion.

Explosions start suddenly and diminish over time. Doesn't the current growth in intelligence accelerate gradually?

Actually, explosions start gradually with an exponential growth process, and only after an extended period do they gradually peter out. The metaphor of an explosion looks pretty appropriate to me.
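One simple way to picture this - offered purely as an illustration, with r, K and x_0 being symbols of my own - is the logistic curve, which grows exponentially while its "fuel" is plentiful and peters out as the fuel runs low:

  \frac{dx}{dt} = r x \left(1 - \frac{x}{K}\right), \qquad x(t) = \frac{K}{1 + \frac{K - x_0}{x_0} \, e^{-r t}}

While x is small compared with the capacity K, growth is approximately exponential (dx/dt \approx r x); as x approaches K, growth gradually dies away. Chemical explosions follow much the same pattern: an autocatalytic chain reaction accelerates until the reactants are exhausted.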

Dawkins also uses the metaphor of an explosion for the phenomenon - in the chapter of River Out of Eden entitled "The Replication Bomb" - saying that, while some stars may "go supernova", stars harbouring living systems might instead "go information".

We humans are an extremely important manifestation of the replication bomb, because it is through us - through our brains, our symbolic culture and our technology - that the explosion may proceed to the next stage and reverberate through deep space.

- River Out Of Eden. Chapter 5 - The Replication Bomb (Dawkins - 1995)

Conclusion

The "singularity" terminology is ambiguous. The impressions it conveys most strongly are highly inaccurate. The terminology makes the futurists that use it look incompetent. Its widespread use gives futurism a bad name.

Saying that the technological singularity will occur in 20xx - just because some unique technological events will happen then - is about as silly as saying that the sociological singularity occurred in the 1960s.

There's no such thing as "the technological singularity" - the whole idea is ridiculous.

It would be best for everyone involved to ditch this ill-conceived terminology as soon as possible.

Postscript

Some of the views in this essay have subsequently been echoed by Kevin Kelly - in his own essay:

[The Singularity Is Always Near]

The point about the inherently non-singular nature of exponential functions has been made previously here:

[The Singularity by Lyle Burkhead - in the section "Exponential functions don't have singularities!"]

References

  • The Intelligence Explosion Is Happening Now - Tim Tyler

