Discussing the singularity
February 13, 2006 11:33 PM

I'm reading Ray Kurzweil's "The Singularity is Near" and would like to know more about the discussion around this book and his ideas. What do people who work with these things – biotech or computer research – think of him? Is Kurzweil stealing the limelight from other, perhaps lesser known, but more interesting/influential people? Are there any critics I should be aware of (I've found Theodore Modis)?

Disclaimer: I ask because I'm interested in these things (earlier question here), but also because I'm currently writing an article about Kurzweil. If this is an improper use of AskMe, please delete my question.
posted by Termite to Technology (19 answers total) 4 users marked this as a favorite

Have you looked at Vernor Vinge? He has written some essays on the idea (it's the first place I ran across it), explaining how it constrains what he considers to be plausible SF.

Vinge also has a stronger definition of singularity than many other people (including, AFAICT, Kurzweil, although I've basically ignored his book). Continued exponential growth is not sufficient. Many people use "singularity" to refer to a time in which (post-)human life is incomprehensible to us today. Many people use it in an even weaker sense, as shorthand for some specific technologies (mind uploading, eg). The stronger, and to my mind more interesting and less rigorously defensible, definition is the one Vinge uses:

Consider your life today, and think back to when it would have been (nearly?) incomprehensible to a person of a previous time. Let's say that's N years ago. Now imagine the life of someone 100, or 1000, years ago, and imagine them making the same estimation, and coming up with their own values of N. So for any given time in history you can produce a value for N. Now, if you think that N is decreasing over time, and if you think that it will eventually reach zero (instead of finding a horizontal asymptote), then you're looking at a Vingean singularity: the rate of change of human experience becomes infinite, not just extremely large, and any activity at or after that time is utterly incomprehensible not just to us (that's pretty much a foregone assumption) but also to the people caught up in it.

Compared to this, the "singularities" talked about by most other writers I've encountered are pale reflections, minor steps on a much stranger road. But, like I said, more plausible: almost every sentence in my previous paragraph has some seriously unproven assumptions in it.
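To make the distinction concrete, here's a toy numeric version of the two trajectories (my own sketch; the functions and parameters are illustrative assumptions, not anything Vinge specifies): one N that reaches zero at a finite date, and one that only finds an asymptote.

```python
import math

# Toy model: N(t) is the "comprehension horizon" in years -- how far back
# you'd have to go before your life becomes incomprehensible to a person
# of that time. Two ways N can shrink:

def n_linear(t, n0=100.0, rate=1.0):
    """N shrinks by a fixed number of years per year: it hits zero at
    t = n0 / rate, giving a Vingean singularity at a finite date."""
    return max(0.0, n0 - rate * t)

def n_exponential(t, n0=100.0, k=0.02):
    """N shrinks by a fixed fraction per year: always decreasing, but it
    never reaches zero -- change gets very fast, never infinite."""
    return n0 * math.exp(-k * t)

for t in (0, 50, 100, 150):
    print(f"t={t:3d}  linear N={n_linear(t):6.1f}  exponential N={n_exponential(t):6.1f}")
```

Only the first trajectory is a singularity in Vinge's strong sense; the second is "merely" ever-accelerating change.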
posted by hattifattener at 11:59 PM on February 13, 2006

My chief complaint about Kurzweil is that his basic argument boils down to: "Wouldn't it be cool if...". He BEGINS with the idea that exponential growth of complexity is going to continue until superintelligent nanobots rule the universe. This stuff makes great science fiction, but he doesn't lay down a lot of facts to back it up. There are limits to growth. There are problems with strong AI that doubling the computing power does nothing to alleviate.

Exponential growth is a fascinating and extremely relevant fact of life, and as I see it there are a number of possibilities. One is that we hit a brick wall, run out of resources, and society collapses upon itself, after which we achieve stability and rebound, maintain equilibrium, or die out altogether. The other possibility is that we achieve the singularity and go on to continue exponential growth. Kurzweil gives me no real reason to believe his version over any of the others. Remember, the dinosaurs had an exponential growth curve before they went extinct too.
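The "brick wall" point can be made with a standard textbook contrast (my own sketch, not an argument from the thread; parameters are arbitrary): exponential and logistic growth are nearly indistinguishable early on, which is exactly why extrapolating the early part of a curve is unreliable.

```python
import math

# Exponential growth vs. logistic growth with carrying capacity K.
# Early on the two curves agree; they diverge only once K starts to bind,
# so the early data alone can't tell you which world you're in.

def exponential(t, x0=1.0, r=0.1):
    return x0 * math.exp(r * t)

def logistic(t, x0=1.0, r=0.1, K=1000.0):
    return K / (1.0 + (K / x0 - 1.0) * math.exp(-r * t))

for t in (0, 20, 60, 100):
    print(f"t={t:3d}  exponential={exponential(t):10.1f}  logistic={logistic(t):8.1f}")
```

At t=20 both curves read about 7; by t=100 the exponential has passed 22,000 while the logistic has leveled off below its cap of 1000.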

Criticism/Discussion:

Kurzweil actually maintains a formal point/counterpoint on his website.

Another good place to start on criticism is Amazon's reviews, ordered from lowest to highest.

A review/criticism article from TCS magazine.

Other books in the same vein are Stephen Wolfram's "A New Kind of Science", George Gilder's "Telecosm", or Kevin Kelly's "Out of Control".
posted by sophist at 12:33 AM on February 14, 2006 [1 favorite]

My biggest problem with Kurzweil (and the Vinge idea above pushes it even further) is the fetishism of exponential growth. The reality is that almost nothing will grow at an exponential rate forever -- we live in a universe of scarcity, and such a universe does not favor unlimited growth.

But I find it hard to systematically critique Kurzweil, because all of his ideas are just so damn speculative. At the end of the day, it's not a lot more than "I think Kurzweil is full of it."

I heard all of the buzz around Kurzweil, read him expecting a lot, and at the end of the day decided that his reputation owes to one thing: his predictions line up with the fantasies of sci-fi geeks (and such types); as such, it is very powerful mythology. I'm not sure it's much more than that. I was very disappointed.
posted by teece at 12:43 AM on February 14, 2006

Alternatives to Kurzweil's singularity:
> Fundies beat the scientists and stop it all
> Wholesale madness caused by too-rapid change causes us to abandon technology (fundies?)
> Resources fail before we find a work-around.
> Jesus returns and renders the progress moot.

Other than that, I don't see any reason to think this thing won't happen. I worry about who will control the technology, and what their motives may be. I don't want to see a world run by the heirs of Mao nor of Bushco.
posted by Goofyy at 2:32 AM on February 14, 2006

The Scottish SF writer Ken MacLeod calls the Singularity the "Rapture for Nerds". His critique seems a little flip until you realise that he is attacking the problem from a position of committed anarcho-syndicalism and a realisation of the limits to growth.

Many of the fanciful curves in books like these are based on an accelerating and extraordinary exploitation of easy fossilised carbon sources in our immediate environment. This access to cheap energy has fueled a productivity boom over the last few generations that is spectacular, but unsustainable. It has enabled developed cultures to decentralise their urban centres, to move away from a reliance on animal and human labour, and to transport commodities and foodstuffs many thousands of km.

It won't last. History is full of examples of cultures that expanded rapidly for a century or two and, during that time, doubtless many pundits predicted that within a few years "everyone" would be similarly uplifted. These times never last. Either resources run out, political will fails, or cultures erupt into or through transformative social upheavals that bring chaos and retrenchment.

Similar massive bursts of growth were experienced in the past by emerging cultures that happened upon or conquered new territories. The rapid expansion of Caucasians in the Americas during the 17th, 18th, and early 19th centuries comes to mind. There is no parallel in human history for such a rapid expansion of a single ethnic group at the expense of others. That expansion was curtailed by outbreaks of crowd diseases and pandemics once cities had grown large enough.
posted by meehawl at 5:31 AM on February 14, 2006

(I'd love to hear what cstross has to say about this.)
posted by ereshkigal45 at 6:29 AM on February 14, 2006

There certainly are some strong critiques from a Diffusion of Innovations perspective. Singularity theories seem to obsess over the idea that technology changes culture, and completely ignore the evidence that culture has a strong influence on which technologies actually get adopted in day to day life. The basic lesson is that "nifty" is a poor predictor of mass market adoption, and changes in culture are bound to human time frames.

Vinge especially strikes me as problematic on this in that he proposes multiple scenarios for a singularity with plenty of citations, but does not reference any of the copious literature dealing with the actual life of technology in society. It is one thing to claim that we are experiencing dramatic increases in technological innovation; it is another thing to claim Moore's law as the basis for a cultural revolution unparalleled since the Neolithic.
posted by KirkJobSluder at 6:46 AM on February 14, 2006

You might also try Ellen Ullman. From an interview with her:

Ellen Ullman: The reason Ray Kurzweil gets away with it is because he does this funny thing where he defines the human as a computer and then says it's easy to program. He mistakes the metaphors that we use for the computer for the thing itself. For instance, he calls the brain a "file" which could be downloaded. He says the self can be "ported" to another substrate, that is, silicon, and he defines thinking as "information sampling". If you define a human being as a computer, it then seems very easy to program a human being.

Ian Walker: Is Ray Kurzweil's problem that he's over-extended the metaphor, or he's actually lost in the idea, lost in his own sort of hubris I guess, or in the science?

Ellen Ullman: Computer scientists, all of us, have a way of getting crazy. You take an idea and you travel so deep into that idea, it seems to make sense, it seems to be an explanation for everything.

I agree with what the others have said above, too.
posted by vacapinta at 7:53 AM on February 14, 2006

A real problem with singularity theories is that they generally take for granted that humanity is evolving to a social state wherein religious zeal and national and cultural exceptionalism are gone, or at least completely marginalized.

Given the rather stark correlation between low birth rates and low degrees of religious/cultural/nationalistic zeal, one suspects that this is more wish fulfillment than any sort of sound projection.

Interestingly, this same dubious presumption also informs some of the strongest criticisms of singularity futurisms, namely that they won't be realized due to looming resource constraints. Religious, nationalist, and cultural exceptionalism provide the spine for the wars and other assertive acts which concentrate scarce resources sufficiently for the victors to continue to advance science and technology even in the face of severe global shortages. In other words, as long as the U.S. is willing to do what it takes to keep open the doors of its research universities, the price to drive a mile or buy a loaf of bread somewhere else in the world won't slow down innovation.
posted by MattD at 8:21 AM on February 14, 2006

Kurzweil and Terry Grossman are also building a lucrative vitamin and mineral supplement and "longevity consulting" business. Relentless boosterism is good for sales. After all, if Singularity is "just" 40 years away, wouldn't you spend just that little bit more on some coloured pills and "treatments" that will help you get there in a well-preserved condition, ready for your upload?
posted by meehawl at 8:48 AM on February 14, 2006

I haven't read the entire book yet. However, I can't help but think that some of you are mistaking the theory of the singularity for someone's ideas (maybe Kurzweil's) of what it will look like. These are separate.

Any of us can speculate on what things will be like before/at/after the singularity. Why not? Sounds like the basis of some good fiction, to me.

Someone above talked about depletion of fossil fuels, as if the future depended entirely on that. That's ridiculous. Of course there is a solution to that issue. Relatively trivial, I suspect.
posted by Goofyy at 11:30 AM on February 14, 2006

There is a solution to that issue. Relatively trivial, I suspect

Oh do tell.
posted by meehawl at 11:41 AM on February 14, 2006

People you need to read more of (than Kurzweil, who's a Johnny-come-lately popularizer): Hans Moravec (particularly his early book "Mind Children", if you can find a copy), Vernor Vinge (obviously -- took the term into common currency in the SF field), Eric Drexler ("Unbounding the Future" if you want the popsci version, "Nanosystems" if you want the PhD thesis), Wil McCarthy ("Hacking Matter" -- 1001 uses for a quantum dot), and any papers you can find by Nick Bostrom, Seth Lloyd, or Robin Hanson.

Hmm. Now, on the contrarian side of the balance, let's add Frank Tipler (before he went barking round the bend in "The Physics of Immortality"), Roger Penrose (and don't come back until you can come up with at least two refutations of his core thesis in "The Emperor's New Mind"), and, oh, google on "Extropian". Hmm. Ed Regis' "Great Mambo Chicken and the Posthuman Condition" was probably the first takedown of this meme cluster to make it into print, back before Kurzweil even latched onto the bandwagon.

(And you can read my novel for free if you want, although it would make me (and my editors) happy if you'd buy the dead tree edition.)
posted by cstross at 2:54 PM on February 14, 2006 [5 favorites]

Edge has a manifesto by Kurzweil and plenty of rebuttals.
posted by fuzz at 3:37 PM on February 14, 2006

cstross: Well yes, I've read McCarthy, Vinge and Drexler.

I've also read Rogers, Brown and Duguid, and Vygotsky, as well as a fair bit of basic discussions of biochemistry. So when these people make claims about technology driving social systems, they sound a bit naive to me.
posted by KirkJobSluder at 5:34 PM on February 14, 2006

Technology driving social systems? What a far-fetched concept...except, um, this thing called 'internet' certainly has changed the social landscape. Barring assy government creating barriers, this can only increase.

I would say that air travel has also changed the social environment. Combined with internet, the globe has gotten considerably smaller. When we get good speech recognition combined with instant translation, that will increase, too.

Energy issues:
I said I suspect that to be 'trivial'. That means I don't think it requires any major scientific breakthroughs. Mostly just application of existing technology, and the will (and ability!) to make the investment.

We waste energy everywhere. Some of that is bound to be salvageable. Consider everything in your home that plugs into the electric grid. Most of that stuff begins with an inefficient voltage reduction.
posted by Goofyy at 8:24 PM on February 14, 2006

on the contrarian side of the balance

Charlie, you forgot Roszak's The Cult of Information.

Mostly just application of existing technology, and the will (and ability!) to make the investment.

Net primary productivity. NPP. Look it up. When you get down to basics, we are heterotrophs, dependent on green plants to fix solar energy using simple chemistry. We already consume several hundred times the annual NPP every year by burning through the stored chemical energy of past eras. Hydrogen is a sink, not a source.
posted by meehawl at 4:54 AM on February 15, 2006 [1 favorite]

Goofyy: Technology driving social systems? What a far-fetched concept...except, um, this thing called 'internet' certainly has changed the social landscape. Barring assy government creating barriers, this can only increase.

You know, for 20 years people have been saying that, while other people who actually look at what happens on the internet find that it just serves as a new site for the same old power dynamics and human interactions. Certainly the internet has changed the social landscape, but the social landscape has changed the internet as well. Witness, for example, the gradual decline of anarchic Usenet and mailing lists in favor of message boards supported by a 200-year-old revenue model. Most of what happens on the internet is analogous to older media systems.
posted by KirkJobSluder at 9:50 AM on February 15, 2006 [1 favorite]

This thread is closed to new comments.