ciphergoth: (Default)
[personal profile] ciphergoth
Clarification: By "smart" I mean general smarts: the sort of smarts that allow you to do things like pass a Turing test or solve open problems in nanotechnology. Obviously computers are ahead of humans in narrow domains like playing chess.

NB: your guess as to what will happen should also be one of your guesses about what might happen - thanks! This applies to [livejournal.com profile] wriggler, [livejournal.com profile] ablueskyboy, [livejournal.com profile] thekumquat, [livejournal.com profile] redcountess, [livejournal.com profile] thehalibutkid, [livejournal.com profile] henry_the_cow and [livejournal.com profile] cillygirl. If you tick only one option (which is not the last) in the first poll, it means you think it's the only possible outcome.

[Poll #1103617]

And of course, I'm fascinated to know why you make those guesses. In particular - I'm surprised how many people think it's likely that machines as smart as humans might emerge while nothing smarter comes of it, and I'd love to hear more about that position.
Page 1 of 3

Date: 2007-12-10 11:30 am (UTC)
From: [identity profile] seph-hazard.livejournal.com
I'm not sure how to answer this and need to think about it more [grin] I suspect it won't be within the next forty years - it'll be further in the future than that. A hundred, maybe, but this won't happen in forty.

Date: 2007-12-10 11:38 am (UTC)
djm4: (Default)
From: [personal profile] djm4
I tend to feel that the existence of Google means that this has already happened. Google (and Altavista, Lycos and Yahoo before it) is a 'vastly smarter' information processor and indexer than any human, and has transformed our ability to find information beyond all recognition in the past 12 or so years.

Even if you don't feel Google's smarter than a human yet (just faster), I suspect in the next ten or so years it will become so.

Date: 2007-12-10 11:39 am (UTC)
From: [identity profile] despina.livejournal.com
It depends what you mean by 'smarter'! AI stuff is coming along in leaps and bounds but there are some things I don't think machines will ever be able to do as 'smartly' as humans can.

Also depends on the humans, though, I suppose.

Date: 2007-12-10 11:48 am (UTC)
From: [identity profile] battlekitty.livejournal.com
Mu: Problem is that the economic and social situation will be such that in 40 years there won't exist the ability to produce science of that calibre.

Cynical much? me?? :)

Date: 2007-12-10 11:53 am (UTC)
From: [identity profile] thekumquat.livejournal.com
As no-one else has mentioned it yet, it depends what you mean by 'smart'. Computers already have better memory and analytical abilities than humans, but I figure what is needed to be 'smart' is the initiative to decide what to research/remember/analyse. I don't see computers being able to synthesise solutions independently of human programmers.

On the other hand, that could just be my failure of imagination, given that our brains had to evolve from pattern recognition and response to current creative synthetic abilities somehow and I guess there's no reason why silicon 'neurons' couldn't do the same, but I doubt it in the next 40 years.

Date: 2007-12-10 11:54 am (UTC)
From: [identity profile] purplerabbits.livejournal.com
I suspect that in the process of developing machines that are as smart as humans in some ways but not in others, we will discover new and interesting things about what intelligence is and why AI has been such an intransigent problem - things I can't even imagine. And then civilisation will collapse.

Date: 2007-12-10 11:57 am (UTC)
zotz: (Default)
From: [personal profile] zotz
It does depend on what you mean, but mainly I suspect that this is going to be very like fusion power, which I suspect is why you picked 40 years as a term.

Date: 2007-12-10 12:00 pm (UTC)
From: [identity profile] ciphergoth.livejournal.com
Actually I picked it because I think most of us expect to live at least another forty years.

Date: 2007-12-10 12:02 pm (UTC)
zotz: (Default)
From: [personal profile] zotz
I wouldn't count on it, personally.

Date: 2007-12-10 12:03 pm (UTC)
aegidian: (cogs)
From: [personal profile] aegidian
Mu - incomplete definition of 'smarter'.

It's a given that some machines are already vastly better at performing some tasks previously performed by human 'computers', and this has transformed the human experience of much of the world.

If by "machines vastly smarter than humans" you mean machines capable of exhibiting characteristics indistinguishable from that of a human intellect and yet surpassing any "smart" human intellect, then I doubt it will happen without several considerable shifts in what is currently considered AI. These shifts may occur, but I suspect they may be as elusive as sustainable fusion power.

Date: 2007-12-10 12:19 pm (UTC)
From: [identity profile] ciphergoth.livejournal.com
Have updated post to indicate the kind of "smarter" I mean.

Date: 2007-12-10 12:24 pm (UTC)
From: [identity profile] drdoug.livejournal.com
it depends what you mean by 'smart'

Yes! Absolutely my position. And we could argue about what it means for a human or a machine to be smart, but one thing I do feel confident in predicting is that people in 40 years' time will come to a very different set of answers. 40 years ago takes us back to perhaps the heyday of AI, and our ideas now about artificial and human intelligence are quite different.

My main problem with the options: I'd say that machines are already smarter than people, and this has already transformed everything in the world.

Leaving aside the abstract stuff, none of the physical developments that enable our current world to happen - agriculture, production, travel, communications - are possible at anything like the current scale without smart machines to do the required thinking for us. The C19th clerical revolution enabled the industrial revolution; the C20th IT revolution has enabled an even more profound transformation of society.

The rate of change is increasing. (I'd guess it's still positive down to third differentials.) I'm not a Singularity person, but any trend that can't continue won't. That can't. At least, I can't think it can - but maybe we can invent machines that will enable us to do that thinking.

I didn't predict how profoundly the world has changed in the last twenty years. I wasn't a lot better at predicting what was going to happen over the course of the last year. (I wasn't quite so wrong, but only because things can't change so substantially over just a single year.) I don't think I'll do any better over a longer timescale.

Except ... ask me again in 39 years, and machines will have got smarter so that I can give a much better answer!

Date: 2007-12-10 12:27 pm (UTC)
djm4: (Default)
From: [personal profile] djm4
I think my answer's still the same, although I expect that the most profound changes won't come from machines being 'our' sort of smarter, even though I believe they will be.

Date: 2007-12-10 12:31 pm (UTC)
From: [identity profile] drdoug.livejournal.com
I'm surprised how many people think it's likely that machines as smart as humans might emerge while nothing smarter comes of it

I don't think that's very likely but it's just about conceivable. I can imagine an argument along the lines that you can't build something that's smarter than you yourself are (for some definitions of smart). It'd be a bit of a woolly argument to my mind, but it might have some force. I also think it's possible (but not at all likely) that there turns out to be some fundamental limit to how general-purpose-smart things can be made, and humans are already at it.

I suspect, though, that most people holding that position a) don't hold it very explicitly or consideredly (if that's a real adverb), and b) are somewhat influenced by fear of the consequences if the process doesn't stop at the point they claim it will.

Not me! I, for one, welcome our new machine overlords.

Date: 2007-12-10 12:59 pm (UTC)
From: [identity profile] ciphergoth.livejournal.com
Even if it were only as smart as us in some deep sense, it might still think a million times faster than we do, and thus give any question a lifetime of thought in half an hour, and that would be a pretty substantial change.
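The arithmetic behind that claim checks out: assuming a lifetime of thought is on the order of sixty years (a value I'm picking for illustration), a million-fold speedup compresses it into roughly half an hour.

```python
# Sanity check: a ~60-year lifetime of thought, sped up a
# million-fold, fits in about half an hour of wall-clock time.
lifetime_hours = 60 * 365 * 24   # ~525,600 hours in 60 years
speedup = 1_000_000
compressed = lifetime_hours / speedup
print(compressed)                # ~0.53 hours, i.e. about 32 minutes
```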

Date: 2007-12-10 01:01 pm (UTC)
From: [identity profile] xquiq.livejournal.com
I don't think it's likely that science will focus on creating a machine with the sort of general-purpose smarts that would place it, say, above the average human on an IQ bell curve, even with machine learning.

I'm not saying that it's not possible, but rather that developments will tend towards a particular purpose and thus we may have machines that are vastly superior in some areas, but which will have massive gaps in others. Thus, I'm not convinced that a general comparison between machines & humans will truly be meaningful within the specified time period.

Date: 2007-12-10 01:01 pm (UTC)
From: [identity profile] itsjustaname.livejournal.com
Are you assuming a static level of human intelligence?

For that matter, how do you fully separate the two? If Google is smarter than me, but I'm smart enough to use Google, does that make me smarter than before?

Date: 2007-12-10 01:05 pm (UTC)
djm4: (Default)
From: [personal profile] djm4
It might. But intelligent thought might well require interaction with the real world, and it may not be possible to speed up how quickly that happens in any useful way.

See also 'control systems engineering: how to make a system unstable by making it too sensitive to small changes, and too fast to respond'. ;-)
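That control-systems point can be seen in a toy simulation (all values here are illustrative, not from any real controller): a proportional controller acting on a one-step-stale measurement converges with a modest gain, but a gain that reacts too hard and too fast to the delayed reading makes the system diverge.

```python
# Minimal sketch: proportional control toward zero, but the
# controller only sees a measurement that is one step old.
def simulate(gain, steps=30, delay=1):
    """Return |x| after `steps` updates of x -= gain * delayed x."""
    history = [1.0] * (delay + 1)      # x starts at 1.0
    for _ in range(steps):
        x = history[-1]                # current state
        measured = history[0]          # stale measurement
        history = history[1:] + [x - gain * measured]
    return abs(history[-1])

print(simulate(0.3))   # small gain: settles toward 0
print(simulate(1.5))   # aggressive gain: overshoots ever harder
```

With the delay, the update obeys x_n+1 = x_n - gain * x_n-1, which is stable only for small gains; cranking the gain up past that threshold is exactly the "too sensitive, too fast" failure mode.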

IMHO

Date: 2007-12-10 01:07 pm (UTC)
From: [identity profile] conflux.livejournal.com
This all comes down to the tricky question of what intelligence actually is, and how you can compare different forms of intelligence. Machines will continue to be developed that can do more and more that appears intelligent. This will have a big impact on the world. These machines will not be as good at doing the things that human intelligence does well though, even in 40 years' time.

Separation anxiety.

Date: 2007-12-10 01:10 pm (UTC)
aegidian: (cogs)
From: [personal profile] aegidian
Blow Google, we're smart enough to create pocket calculators, orreries, and sharpened stone tools. And yes, since the tools assist us in performing the same tasks more efficiently, they do make us smarter.

We have machine-aided smarts, which brings an interesting problem in demarcation - the tool is not smart, we have the smarts. How complex does a tool have to be for it to be considered separately smart from its user(s)?

Date: 2007-12-10 01:15 pm (UTC)
From: [identity profile] ergotia.livejournal.com
I guess my simple and perhaps simplistic view is that machines as smart as humans are only gonna change the world as much as humans! As for smarter, I really can't see that we know enough about what "smart" actually means yet to be able to theorise about the rest.

Date: 2007-12-10 01:27 pm (UTC)
From: [identity profile] ciphergoth.livejournal.com
Well, all right, but it might still have a useful thing or two to say once it's read every scientific paper and every book ever published.

(moved to correct comment, sorry!)

Date: 2007-12-10 01:31 pm (UTC)
From: [identity profile] mistdog.livejournal.com
I don't think smart computers are going to change everything in the world in 40 years, because many things in the world are not so easily changeable. It takes a lot more to change some things than just understanding problems (and even generating solutions).

I'm thinking of, for example, the WHO's global polio eradication programme which has been running for about 20 years at this point and still isn't there. And that's a problem where the solution is very clear.

Date: 2007-12-10 01:31 pm (UTC)
From: [identity profile] jhg.livejournal.com
Can't really answer that - all the studying I've done leads me to conclude that if any 'AI' is developed, it will be so vastly different to human intelligence as to be unsuitable for that kind of comparison.

I think it likely that independent, self-supporting robots of some sort will be developed; I strongly doubt that they'll be capable of passing a Turing Test, i.e. of holding a 'normal' conversation with you.

Date: 2007-12-10 01:33 pm (UTC)
From: [identity profile] martling.livejournal.com
I'm going to skip the semantic debate by interpreting "as smart as humans" to mean "able to do everything a human mind can do, at least equally as well".

I don't think that's going to happen in 40 years, just based on looking at where AI has got in the last 50. Raw computing power has not for the most part been the problem, and developments in everything other than raw power are usually slow.

However, whilst not all of them, I fully expect to see far more of the many things a human mind can do performed equally well by machines within this time. And that may ultimately be enough in itself to bring on what you're alluding to - in particular because such a huge and well-funded sector of research is already focused on machines to help us design the hardware and software of other machines.
