Wednesday, September 29, 2010

Thoughts on the Chinese Room

John Searle's Chinese Room Thought Experiment can be summarized as follows:
  1. Assume that an algorithm exists which perfectly imitates a native Chinese speaker.
  2. In an isolated room, Searle himself manually executes the algorithm, conducting a seemingly intelligent conversation with the outside world.
  3. Since neither Searle, the room, nor the algorithm understands Chinese, we conclude that no matter how closely a symbol manipulator imitates human intelligence, it cannot have a mind.
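To make the setup concrete, here is a deliberately trivial sketch of the kind of purely syntactic symbol manipulator the argument targets. The rule table and the function name are my own made-up placeholders standing in for the perfect algorithm assumed in step 1; Searle, of course, specifies no such program.

```python
# Toy illustration only: a "room" that maps incoming symbol strings to
# outgoing ones by table lookup, with no meaning attached to the symbols.
# The rule table is a hypothetical stand-in for the assumed perfect algorithm.
RULES = {
    "你好吗？": "我很好，谢谢。",      # "How are you?" -> "I'm fine, thanks."
    "你会说中文吗？": "当然会。",      # "Can you speak Chinese?" -> "Of course."
}

def room_reply(symbols: str) -> str:
    """Return an output chosen purely by matching the input symbols.

    Nothing here attaches meaning to the characters; the mapping could just
    as well be over arbitrary tokens, which is exactly Searle's point.
    """
    return RULES.get(symbols, "请再说一遍。")  # default: "Please say that again."

if __name__ == "__main__":
    print(room_reply("你好吗？"))
```

However perfect the real (hypothetical) rule book is, the person executing it is doing nothing more than this kind of lookup and rewriting.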
Many replies to this thought experiment have been proposed, but none is entirely satisfactory. Indeed, Searle has responded to most of them and argued that his conclusion still holds.

It just occurred to me: could the problem lie in the word "understand"? How does one learn Chinese? Could the very process of executing the algorithm be, for Searle, a process of learning Chinese?

Hmm...
