Question
Why does Venkatesh Rao think the world is going to end?
Answer
I wonder what's up with this spate of questions directed personally at me. I suspect a complicated troll. Oh well.
I no longer believe the world will end. At least not in a short enough time frame for the story to be interesting. Instead, I've decided to carve out and occupy a new position in between the Singularity and Collapsonomics.
I haven't named it yet (but I have dibs; don't steal this naming opportunity from me). The basic idea is that the upward pressure of hacks keeping the system running will exactly counteract the downward pull of collapse via entropy-gravity. The curve will look like a plateauing S, with a noisy plateau phase. It won't look like an exponential or a peak-and-collapse. Today, we have a world where every bug squashed introduces three new ones. I believe we'll get to a stable, slow-decline ratio of 10:11: our hacking will get better, and we'll introduce only 11 new bugs for every 10 we squash.
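The arithmetic behind the two ratios can be sketched in a few lines. This is purely an illustration of the claim, not anything from the original; the function name, parameters, and starting numbers are all invented for the example. A 3:1 spawn ratio makes the open-bug count explode, while an 11:10 ratio produces only a slow creep, the "noisy plateau":

```python
def simulate_bugs(bugs, ratio, squash_per_step, steps):
    """Track open bugs when each squashed bug spawns `ratio` new ones.

    Each step we squash up to `squash_per_step` bugs; every squash
    introduces `ratio` fresh bugs, so the net change per step is
    squashed * (ratio - 1).
    """
    history = [bugs]
    for _ in range(steps):
        squashed = min(squash_per_step, bugs)
        bugs = bugs - squashed + ratio * squashed
        history.append(bugs)
    return history

# Today: every squash spawns ~3 new bugs -> runaway growth.
today = simulate_bugs(bugs=100, ratio=3.0, squash_per_step=10, steps=50)

# Hypothesized future: 11 new bugs per 10 squashed -> slow creep.
future = simulate_bugs(bugs=100, ratio=1.1, squash_per_step=10, steps=50)
```

With these made-up numbers, fifty steps at 3:1 takes 100 open bugs past 1,000, while the 11:10 regime adds only about one bug per step, which is the difference between collapse and an indefinitely limping plateau.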
We'll keep the whole shabby mess limping along indefinitely, but a sudden shock could still send us to the Idiocracy future.
The upward push towards a Singularity seems increasingly unlikely to me. I keep looking and finding nothing more than the simplistic extrapolation of an exponential curve. I don't think "self-improving AI program" is a meaningful construct.