I think all of these are covered in my post - would you agree?
Climate change and such could kill us all, which would definitely put the kibosh on the whole thing. If it doesn't kill us all, it could delay it a great deal; it's not impossible that we could survive but never get back to making progress again, but I don't think that's likely.
"Meaningfully" cleverer than us isn't a strict requirement - if it is "merely" a million times faster than us it can compress the time from Socrates to today into less than a day, and if it's "merely" the equivalent of a billion human-level intelligences each working at this 1E6 accelerated rate, it will have a significant intellectual advantage.
Any given AGI might successfully be stopped by violence; this broadly comes under the heading of humanity deciding not to build one. However, we may eventually - wisely or otherwise, deliberately or otherwise - allow one to run for long enough that it makes a big difference to our future.
Date: 2012-02-21 05:19 pm (UTC)