Things that aren't like us really aren't like us
Posted by aog, Monday, 13 June 2005 at 14:48

There was some discussion of the Technology Singularity via Instapundit the other day. This is an extrapolation in which the rate of technological progress accelerates without bound until it becomes effectively infinite.

I personally don’t believe in the Singularity. I think that the acceleration in the rate of progress will taper off, basically because the more we know, the more we must know to advance further. The Singularity requires that our rate of learning grow faster than the set of things to know, which is hardly a given. The expanding amount of knowledge required for further advancement, and the growing store of existing knowledge that newcomers must absorb, will act against an ever-increasing rate of advancement. I think human and computational limitations will prevent true runaway growth.
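To make that intuition concrete with a toy model (my own back-of-the-envelope framing, nothing rigorous): write accumulated knowledge as $K(t)$ and suppose progress feeds on itself as a power law,

$$\dot{K} = a\,K^{p}, \qquad K(t) = \left[K_0^{\,1-p} - (p-1)\,a\,t\right]^{-1/(p-1)} \quad (p > 1),$$

which blows up in finite time at $t^{*} = 1/\!\left((p-1)\,a\,K_0^{\,p-1}\right)$, a genuine singularity, while any $p \le 1$ gives growth that keeps going but never diverges. The Singularity crowd is effectively betting that $p > 1$; my claim is that the rising cost of each further advance drives the effective exponent down to $p \le 1$.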

In fact, one might argue that the biggest impediment to the Singularity is life extension technology. It’s commonly said that real scientific advancement requires the older generation of scientists to die off — what happens to the rate of progress when that takes centuries instead of decades? One might well wonder if that’s another Fermi Trap, that life extension turns into a trap that stifles the technological advancement and personal interest necessary for colonizing the galaxy.

What I found most bizarre in the discussions was the worry about a post-human AI “taking over the world”, as if it would form a secret cabal to achieve world domination. I find that very idea ludicrous. Why would a post-human AI even want to waste its time interacting with humans, or being stuck on a single planet? Such a construct would effectively be a super-advanced alien, and as I noted before, the most likely form of interaction is indifference. I can’t imagine a post-human AI conquering the human world, although I can imagine it rearranging the surface features for its own convenience, which would be much worse from our point of view. I suspect, though, that Earth wouldn’t be a very useful or interesting place for such a being.

Newton’s Wake takes a view close to mine. In it, post-human AIs start out competing for world domination, but rapidly get bored and abandon the effort for far more profitable (if incomprehensible to us) ventures. One of the more interesting aspects, compared to similar books, is that humans advance by looting prototypes left behind by the post-humans. Normally such stories have the post-humans simply deduce, instantaneously, the ultimate secrets of reality. I find the idea of post-humans leaving behind scrap heaps of experiments far more plausible. On the whole, though, I expect the pace of change to be much slower than the Hard Rapture enthusiasts postulate.

Comments
Ken Tuesday, 14 June 2005 at 13:52

“It’s commonly said that real scientific advancement requires the older generation of scientists to die off — what happens to the rate of progress when that takes centuries instead of decades?”

People have enough sense not to take the word of centuries-old scientists as gospel?

Seriously, scientists who are wrong don’t have to die; they just have to be proven wrong, and then the wrong notions won’t be taken seriously anymore, particularly by younger scientists or new scientists (not always the same thing in a world with life-extension).

And a respect for some older ideas isn’t necessarily a bad thing. If life extension technology had been available in 1789, and the Founders could address the Progressivist and Socialist nonsense personally, perhaps we’d already have most of our population off-planet.

“One might well wonder if that’s another Fermi Trap, that life extension turns into a trap that stifles the technological advancement and personal interest necessary for colonizing the galaxy.”

What really stifles the personal interest necessary for colonizing the galaxy is knowing that you won’t live long enough to get there.

Annoying Old Guy Tuesday, 14 June 2005 at 14:20

Scientists are people too and get very emotionally invested in particular theories. It is those same older scientists who control funding, peer review, tenure, research grants, etc., which can go a long way toward stifling contrary results. This will become increasingly significant as new results tend to require greater computational and/or experimental resources to verify. The question is how fast deeply held beliefs can change, especially for those whose status depends on them.

I also don’t agree that the lack of personal benefit from interstellar colonization is enough to prevent it. After all, colonists will be those who are already aiming for immortality via reproduction.

AbbaGav Wednesday, 15 June 2005 at 09:37

One view of the singularity is that its approach will be hastened as artificial intelligences (or supplemented natural ones) enter the intellectual fray. At that point, the dependency on human life spans will be reduced — AI-in-a-dish evolution cycles can simulate the progress that would otherwise require generations of humans learning and growing from their predecessors, while throwing away the chaff.

But there are a lot of obstacles before we reach that point, for sure: energy dependency, and the effort to maintain public order (no wars, major riots, infrastructure disruptions, etc.). I wonder if Vegas carries a line on the Singularity.

Annoying Old Guy Wednesday, 15 June 2005 at 09:56

One of the interesting concepts in Newton’s Wake was that the Singularity occurred for the AIs but not for the humans. If we depend purely on AI for advancement, it’s unclear whether humans would end up participating, and we’d be left with the scenario I outlined above: the creation of an effectively alien race with powers beyond our comprehension. Whether that would do us much good is far from certain.

P.S. I fixed your comment while I was at it.
