There was some discussion of the Technological Singularity via Instapundit the other day. This is the extrapolation that the rate of technological progress accelerates without bound until it becomes effectively infinite.
I personally don’t believe in the Singularity. I think the rate of increase will taper off, basically because the more we know, the more there is to know. The Singularity requires that our rate of learning grow faster than the set of things to know, which is hardly a given. The expanding amount of knowledge required for further advancement, and the growing store of existing knowledge that must first be mastered, will act against an ever-increasing rate of advancement. I think human and computational limitations will prevent true runaway growth.
In fact, one might argue that the biggest impediment to the Singularity is life extension technology. It’s commonly said that real scientific advancement requires the older generation of scientists to die off — what happens to the rate of progress when that takes centuries instead of decades? One might well wonder whether this is another Fermi Trap: life extension stifling the technological advancement and personal drive necessary for colonizing the galaxy.
What I found most bizarre in the discussions was the worry about a post-human AI “taking over the world,” as if it would form a secret cabal to achieve world domination. I find that very idea ludicrous. Why would a post-human AI even want to waste its time interacting with humans, or being stuck on a single planet? Such a construct would effectively be a super-advanced alien, and as I’ve noted before, the most likely form of interaction is indifference. I can’t imagine a post-human AI conquering the human world, although I can imagine it rearranging the surface features for its own convenience, which would be much worse from our point of view. I suspect, though, that Earth wouldn’t be a very useful or interesting place for such a being.
Newton’s Wake takes a view close to mine. In it, post-human AIs start competing for world domination, but rapidly get bored with it and abandon the effort for far more profitable (if incomprehensible to us) ventures. One of the more interesting aspects here, compared to similar books, is that humans advance by looting prototypes left behind by the post-humans. Normally such stories have the post-humans simply deduce, instantaneously, the ultimate secrets of reality. I find the idea of post-humans leaving behind scrap heaps of experiments far more plausible. On the whole, though, I expect the pace of change to be much slower than the Hard Rapture enthusiasts postulate.