ciphergoth: (Default)
[personal profile] ciphergoth
The question I asked was

"The Church-Turing thesis states that any machine we can really imagine building, certainly any machine that can be built using the physical law that we know, can be simulated on a computer. That includes the human brain, which we agree is a machine. So do you agree with Penrose that there's physical law we don't know that will extend the powers of the brain beyond those of a Turing machine?"

The question I should have asked was

"When you represent what you call Strong AI as being based on the belief that the brain is like a digital computer, that's a deliberate misrepresentation designed to make it seem less plausible. Strong AI is, as you know perfectly well, based on the belief (which you share) that the brain is some sort of machine, and as such is amenable to simulation on a computer. You fuck."

As for the way he misrepresents Dennett... well, anyway, I'm kicking myself because I'll never get the chance again...

Dreamt about a Goth weekend being run by this year's BiCon committee, in a town a bit like Whitby but different; people were bemoaning the absence of the Elsinore. Instead of sleeping in beds, we slept in mattress-shaped tanks of water; they were quite comfy once the heat of your body warmed the water. I went to what I thought was a plenary, but it turned out to be a crisis meeting of the committee; [livejournal.com profile] adjectivemarcus said I should stay because of being involved in last year. They were trying to call emergency services because of some sort of drug-related medical emergency, but their mobiles weren't getting reception and the landline was tied up because ([livejournal.com profile] babysimon explained to me) some interfering busybody had insisted that the best way to get them would be to dial out and raise them online...

Date: 2002-12-11 04:45 am (UTC)
babysimon: (Default)
From: [personal profile] babysimon
Ah. In that case, I don't think that was the question you should have asked either :)

Maybe you should have asked him why he isn't a proponent of strong AI given that (you say) he believes all the right stuff...

Date: 2002-12-11 07:56 am (UTC)
From: [identity profile] ciphergoth.livejournal.com
He did answer that question, and his answer is weird.

He says "a simulation of digestion isn't real digestion, and a simulation of thought isn't real thought. A machine may be able to provide a perfect simulacrum of thinking in every respect, but it isn't really thinking; it's just simulated thinking."

It makes me think of the idea that some subset of humanity don't have souls...

Date: 2002-12-11 08:23 am (UTC)
babysimon: (Default)
From: [personal profile] babysimon
> a simulation of digestion isn't real digestion, and a simulation of thought isn't real thought

...and of course, digestion and thought are similar enough concepts that the analogy works...

Date: 2002-12-11 04:05 pm (UTC)
From: [identity profile] ex-meta.livejournal.com
A simulation of adding numbers doesn't actually add numbers, naturally. Only I can do that with my holy brain.

Fear my supercomputer!

Date: 2002-12-12 06:38 am (UTC)
From: [identity profile] pavlos.livejournal.com
A simulation of a nuclear strike is not a nuclear strike. A simulation of a nuclear test is a nuclear test. There's no philosophical paradox here: the only difference is that the real nuclear strike, as with real digestion, affects specific atoms that you have a vested interest in.

Really, I do find this Searle fellow rather dumb. Or rather, I think he's asking an important question such as "At what point does a complex entity acquire perception/feelings/self-perception/whatever?" and he's asking it in an extraordinarily dumb way.

Pavlos

Profile

ciphergoth: (Default)
Paul Crowley
