[Wittrs] Re: Ontologically Basic Ambiguity: Mode of Existence

  • From: "SWM" <SWMirsky@xxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Mon, 22 Mar 2010 21:36:21 -0000

--- In Wittrs@xxxxxxxxxxxxxxx, Gordon Swobe <wittrsamr@...> wrote:
<snip>


> The CRA illustrates that EVEN IF the CPU or the system as a whole had
> consciousness and intentionality and all the good things that you and I
> have, including billions of powerful interconnected brain cells, an
> English education, and a genuine desire to understand Chinese, it STILL
> would not understand Chinese from manipulating symbols according to
> syntactical rules.
>
> Our man has everything we could possibly mean by "strong AI", but still he
> doesn't get it.
>
> -gts

That's because in the Chinese Room our man is ONLY a CPU, whatever else he may 
be outside the room. But if you think Searle intended the man's understanding 
to play a role in the room beyond enabling him to function AS a CPU, it should 
be easy enough to demonstrate. Can you cite something from Searle himself to 
this effect so we can take a look at it? Thanks.  -- SWM

=========================================
Need Something? Check here: http://ludwig.squarespace.com/wittrslinks/
