Artificial Intelligence and the year 2047
Dec. 10th, 2007 11:25 am
Clarification: By "smart" I mean general smarts: the sort of smarts that allow you to do things like pass a Turing test or solve open problems in nanotechnology. Obviously computers are ahead of humans in narrow domains like playing chess.
NB: your guess as to what will happen should also be one of your guesses about what might happen - thanks! This applies to wriggler, ablueskyboy, thekumquat, redcountess, thehalibutkid, henry_the_cow and cillygirl. If you tick only one option (which is not the last) in the first poll, it means you think it's the only possible outcome.
[Poll #1103617]
And of course, I'm fascinated to know why you make those guesses. In particular - I'm surprised how many people think it's likely that machines as smart as humans might emerge while nothing smarter comes of it, and I'd love to hear more about that position.
no subject
Date: 2007-12-10 12:31 pm (UTC)
I don't think that's very likely but it's just about conceivable. I can imagine an argument along the lines that you can't build something that's smarter than you yourself are (for some definitions of smart). It'd be a bit of a woolly argument to my mind, but it might have some force. I also think it's possible (but not at all likely) that there turns out to be some fundamental limit to how general-purpose-smart things can be made, and humans are already at it.
I suspect, though, that most people holding that position a) don't hold it very explicitly or consideredly (if that's a real adverb), and b) are somewhat influenced by fear of the consequences if the process doesn't stop at the point they claim it will.
Not me! I, for one, welcome our new machine overlords.
no subject
Date: 2007-12-10 01:05 pm (UTC)
See also 'control systems engineering: how to make a system unstable by making it too sensitive to small changes, and too fast to respond'. ;-)
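(Editor's aside: the instability that control-systems quip points at is easy to see in a toy example. Here is a minimal sketch of my own, not from the thread, of a discrete proportional feedback loop in Python: with a modest gain the state settles at the setpoint, while an over-aggressive gain makes the same loop overshoot, oscillate and diverge.)

# Illustrative sketch only: proportional feedback x[t+1] = x[t] + gain * (setpoint - x[t]).
# A gentle gain shrinks the error each step; an overly aggressive gain makes the
# correction overshoot, so the error grows in magnitude and the loop goes unstable.

def simulate(gain, steps=12, setpoint=1.0, x=0.0):
    """Return the trajectory of the state x under proportional feedback."""
    trajectory = []
    for _ in range(steps):
        x = x + gain * (setpoint - x)   # correct by a fraction of the current error
        trajectory.append(round(x, 3))
    return trajectory

print("gain 0.5 :", simulate(0.5))   # converges smoothly towards 1.0
print("gain 2.5 :", simulate(2.5))   # error is multiplied by |1 - gain| = 1.5 each step, so it diverges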
no subject
Date: 2007-12-10 01:27 pm (UTC)
(moved to correct comment, sorry!)
no subject
Date: 2007-12-10 03:12 pm (UTC)
Oh, absolutely - but I'd argue that for pretty much any sensible definition of "smart" the thinking speed is a key factor, so what you're talking about there is unequivocally machines being smarter than us. Not for nothing are "quick" and "slow" used to connote levels of intelligence.