[Wittrs] Re: Searle's CRA and its Implications

  • From: "SWM" <SWMirsky@xxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Tue, 16 Mar 2010 20:46:48 -0000

--- In Wittrs@xxxxxxxxxxxxxxx, Gordon Swobe <wittrsamr@...> wrote:

> --- On Tue, 3/16/10, SWM <wittrsamr@...> wrote:
> > [Recall my point that this is about how consciousness can
> > be conceived, how we can imagine it! Note that I have been
> > stressing the point that the inability to imagine it in the
> > way Dennett proposes, or the unwillingness to do so, hangs
> > on an implicit presumption that consciousness, or, in this
> > case, understanding, cannot be reduced to more basic
> > constituents that are not themselves instances
> > of understanding. I have stressed that Searle's argument
> > hinges on precisely this insistence, that because there is
> > no understanding to be found in the Chinese Room, no
> > understanding is possible.
> Understanding does exist in the CR. Have you read the target article?

There are MANY articles and many books from Searle on this subject, a subject 
that he built his reputation on. And I have read many of them, though I can't 
swear I've read every one you have in mind. What is your "target article" and 
what do you think it is saying that I am missing? That the man understands? 
Well of course he understands his instructions and how to follow them but he 
doesn't understand Chinese which is the issue here, i.e., is understanding 
anything more than getting the answers right in some rote kind of way?

I would agree that it is, but I would also say (and have said) that the lack of 
such understanding in the CR as Searle has specced it says nothing about what 
any other R (a different configuration of the same constituent activities) could 
understand, or the level of understanding it could reach.

> The man implements both an English and a Chinese version of the program. He 
> manages of course to understand the English using exactly the same tools he 
> had with the Chinese. How do you explain that discrepancy?

What discrepancy? I am not suggesting he understands both Chinese and English! 
Indeed, I am not suggesting the CR represents a system that understands 
Chinese. I AM saying that the CR is inadequately specced to understand Chinese 
(or English, for that matter).

> It seems the man has full competency to understand Chinese if in fact syntax 
> gives semantics. But alas syntax does not give semantics.

Note that I do not assert that the CR or the man in the CR understands Chinese. 
I agree (and have always agreed) that they don't. The issue is whether the CR 
could, and I argue it could if it were more adequately specced. My point is that 
the understanding in question is a system feature, not a process feature, and 
that Searle's argument confuses the two. That the constituent processes in an R 
like the CR do not have understanding in themselves doesn't mean that a more 
complex system consisting solely of such constituents would not have it.

> I went back and forth with Budd on this point on another list; he finally 
> convinced me that the English semantics in the room plays a necessary role in 
> the experiment for precisely the reason above. It's a control, so to speak.
> -gts

But that has no implication for the CPU. The point of the CR is that it simulates 
a central processing unit. If it didn't, if it were really about a mind within 
a mind (Searle in the room), it wouldn't even appear to demonstrate what Searle 
claims for it. It wouldn't even fool those going with the standard intuition 
that minds aren't physical or part of the physical world.
