One person answered this question the first time around by referencing the Turing Test and Searle's "Chinese Room" thought experiment. Here's the full story if you aren't familiar:
http://en.wikipedia.org/wiki/Chinese_Room
Reading the remainder of the article, you'll find that many philosophers and scientists have replied to Searle's argument. Most of the replies are based on the fact that Searle equivocates between himself sitting inside the box and the "system" [Searle + box + lookup tables, etc.]. The "system" DOES understand Chinese, despite the fact that Searle himself does not.
I should clarify that I'm familiar with this debate, and was hoping that the question would fuel personal responses, not references to others' works. The question, rephrased:
In what sense is "consciousness" MORE than a "system" ("you") that "believes" itself to have something you refer to as "consciousness" ("Knowing what it is like to be you")?
...and if a computer can too, then is it conscious?
2006-08-09 21:31:23 · 4 answers · asked by Jon in Arts & Humanities ➔ Philosophy