During some bored surfing I found a link to the webpage of the Singularity Action Group. They strike me as slightly unhinged, and I have severe doubts the Singularity is coming anytime soon… just as I doubt the end of the world is nigh. Still, they’re interesting enough. I think they underestimate both the risks and the benefits of a Singularity.
What the heck is a Singularity? Uh… well… there are a lot of people who can explain it better than I can… even the previous link does a better job… But anyway, while versions of the concept were floated as far back as Alan Turing in the 1940s, it was later popularized, I believe, by this computer scientist/SF-writer guy named Vernor Vinge, in a paper titled The Coming Technological Singularity.
Now, the notion of the Singularity plays on one of our very basic human intuitions. We tend to assume humans are different from animals in some way that is just naturally there; it’s a hardwired intuition about the world for human beings. And when we look for actual differences, one of the main ones is the prolific and continuous use of language. Another is the prolific and continuous use of manmade tools (as opposed to simple tools like twigs or rocks, which assorted other primates also sometimes use). Both of these are important, although I think it’s clear language came first.
The acquisition of language, with all of the biological changes in humans it required and brought about, must have taken ages and ages on the human scale, yet seems, as far as I know, to have happened in the blink of an eye on the geological scale. Proto-humans before language simply could not understand humans armed with language, any more than Neanderthals could understand, say, humans communicating with computers implanted in their heads. Now, imagine a goldfish trying to understand humans. Not only is understanding beyond the goldfish; even trying is beyond the poor thing. (This example is one of Vinge’s own, which he mentions often, as in this interview…)
The Singularity would be something like that: a transformation of the world so complete that “people” living after it would simply be incomprehensible to people who lived before it. In Vinge’s fictional work about the Singularity (A Fire Upon the Deep and A Deepness in the Sky, the former of which awaits reading on my bookshelf, the latter of which awaits in a box in Canada), I think there’s some kind of super-brilliant artificial intelligence at play. But that wouldn’t necessarily need to be what it is. In Bruce Sterling’s moving and beautiful novel Holy Fire, it’s just the development of enough tech that people start getting anti-aging treatment and transforming themselves… and anticipating the time of endless lifespan and endless self-modification. There are all kinds of ideas about what a Singularity might be. (Here are some SF writers talking about it; and here at the hippied-out Transhumanism page you can find many links to pages about different conceptions of the Singularity.)
Anyway, given the tentative nature of something like the occurrence of a Singularity, I find it interesting that there’s actually a Singularity Action Group out there. Not to criticize, but I wonder how these people feel about the already-pressing problems humanity faces… for example, one article I read recently claimed the solution to Global Warming might be the elimination of poverty, which breeds much dirtier use of fossil fuels. It seems to me there are more pressing issues, and if we don’t get to work sorting those out, we’re going to be in even worse trouble should a Singularity hit.