The Singularity and the "Prevail Scenario"

by phil on Friday Jun 17, 2005 1:40 AM

UPDATE: I've cross-posted this where a larger discussion is taking place. Please post your comments there. Also, I found a deeper manifesto by Lanier wherein he argues against "cybernetic totalism."

I went to a talk by Joel Garreau, who has just published the book Radical Evolution. The book's subtitle is "The Promise and Peril of Enhancing Our Minds, Our Bodies, and What It Means to Be Human." Both the talk and the book are about the radical changes to come in a world of limitless technology.

I normally avoid these talks because, so I thought, I had already internalized the interesting perspectives on where the Singularity will take us. It turns out I had absorbed only two extremist views. There's Ray Kurzweil, who, in The Age of Spiritual Machines, describes a "Heaven" scenario for mankind, wherein we upload our minds to machines and simulate a paradise of infinite beauty. Then there's Bill Joy, who asks: in a world where a million people can make an atom bomb, how do we stop ourselves from self-annihilation? (cf. "Why the Future Doesn't Need Us"). We can call his the "Hell" scenario.

Garreau introduces an alternative view titled the "Prevail Scenario," which he ascribes to Jaron Lanier.

The rest of this post is about the Prevail Scenario, pulling quotes from Chapter 6 of Garreau's book.

In both the Heaven and Hell Scenarios, the embedded assumption is that human destiny can be projected reliably if you apply enough logic, rationality and empiricism to the project.

This refers to Moore's Law and its extrapolations, which see chip speed and technological progress as following a smooth, exponential curve. It is practically an article of faith among technologists that the computing power of the brain will fit on a chip the size of a penny within a few decades. According to Lanier, however, Kurzweil and Joy are obsessed with this one piece of data.
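The extrapolation behind such claims is simple compounding. As a rough sketch (the 18-month doubling period is the commonly cited Moore's Law figure, not a number from Garreau's book):

```python
# Sketch of the Moore's Law-style extrapolation behind "smooth curve"
# arguments. Assumes computing capacity doubles roughly every 18 months,
# the commonly cited figure (an assumption for illustration only).

DOUBLING_PERIOD_YEARS = 1.5

def growth_factor(years: float) -> float:
    """How much capacity multiplies over the given span of years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

# Over three decades the curve predicts about a million-fold increase --
# the kind of projection behind "a brain on a penny-sized chip."
print(f"10 years: {growth_factor(10):,.0f}x")
print(f"30 years: {growth_factor(30):,.0f}x")
```

Note that the Prevail Scenario's objection is not to this arithmetic but to the assumption that human fortune tracks the same curve.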

In The Prevail Scenario, by contrast, the embedded assumption is that even if a smooth curve does describe the future of technology, it is not likely to describe the real world of human fortune. The analogy is to the utter failure of the straight-line projections of Malthusians, who believed industrial development would lead to starvation, when in fact the problem turned out to be obesity.

Another Singularity-like exponential curve looks fishy on even a modest glance at history. One could say there has been an exponential curve in warfare technology, from the invention of the phalanx by the ancient Greeks to the guns of the Napoleonic Wars. After World War I, it seemed that warfare would come close to annihilating the world. A couple of decades later, once the atom bomb had been dropped, fatalists thought it was only a matter of years before nuclear winter destroyed humanity. Sixty years later, we have prevailed. So while there has been exponential development in warfare, the Singularity of human annihilation that such a curve would have predicted hasn't happened.

The Prevail Scenario is essentially driven by a faith in human cussedness. It is based on a hunch that you can count on humans to throw The Curve a curve.

The Prevail Scenario is actually not a single scenario, but a plurality of scenarios that see technology's impact on humanity not as an exponential curve that leads to a vertical line of progress, but rather as a spaghetti of outcomes that is as rich and unpredictable as human history has been.

Lanier espouses a particular instance of The Prevail Scenario which focuses on human connectedness. In this perspective, technology's best contribution is in bringing humans closer together. To him, it is "the quantity, quality, variety and complexity of ways in which humans can connect to each other" that constitute the relevant Curve.

Garreau also provides a list of "warning signs" why the Heaven and Hell Scenarios seem unlikely:

- Resistance to The Curves of change is actually having an effect worldwide.

- Certain technologies that affect human development and enhancement are globally seen as worth slowing down or stopping, in the way that the use of nuclear weapons was effectively prevented for the second half of the 20th century.

- Technologies that were seen as inevitable turn out to take much longer to develop than anticipated. Predictions common in the early 21st century begin to sound as silly as those from the middle of the 20th century, such as the paperless office, hotels on Mars, and self-cleaning houses.

- Researchers voluntarily stop working on topics they view as too dangerous.

- Researchers decline funding for certain topics that they view as too fraught with human peril, putting their ethics ahead of their promotions, tenure, graduate students and intellectual curiosity.

- Researchers decline funding from organizations they view as too laden with problems, such as corporations and the military.

- Computational power is no longer seen as growing exponentially, because software has proven unable to keep up with the pace of hardware innovation.

- There is little correlation between any exponential change in technology and the development of human society.

I'll close with a nice refutation of a nanotech "Hell Scenario":

He [Lanier] completely believes that the moment nanobots are poised to eat humanity, for example, they will be felled by a Windows crash. "I'm serious about that--no joke," he says.

A few notes about the talk itself:

The talk was held at the SAP forum in Palo Alto and was put together by the Bay Area Future Salon. The audience comprised about fifty people, most over the age of thirty. The crowd was well versed in futurism topics, such as Kurzweil's Law of Accelerating Returns. My guess is that Garreau took the time to speak here because this small group contains lightning rods for his kind of message. Garreau's book came out last month, so perhaps this is also part of a book tour. While the talk was simple, it had cogent details and an engaging narrative.
