[Wittrs] Re: . . . Classifying Searle

  • From: "SWM" <SWMirsky@xxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Mon, 15 Mar 2010 17:21:03 -0000

--- In Wittrs@xxxxxxxxxxxxxxx, Gordon Swobe <wittrsamr@...> wrote:
<snip>
>
> I consider Searle's philosophy of mind a flavor of non-reductive physicalism, 
> clearly distinct from any form of dualism. He distinguishes himself from 
> dualists by denying the existence of any non-physical substances or 
> properties.
>

Joe notes that being a non-reductive physicalist is distinct from being a 
Cartesian dualist, thereby undermining my premise that 'to think that 
consciousness cannot be broken down to constituents that are not, themselves, 
conscious is to be a Cartesian dualist'.

But if a "non-reductive physicalist" is just defined as whatever Searle is, then 
Joe's point is circular, because his denial of the truth of that premise hinges 
on the stipulation that Searle's position isn't that, when the point of my 
argument, which includes that premise, is to demonstrate that Searle's position 
IS that!


> > opinions differ as to whether he explains this phenomenological dualism
> > by postulating a property dualism or a dualism of ontological basicness
>
> Neither, I would say.
>
> As Searle puts it, we lose the concept of consciousness when we deny its 
> first-person ontology. But this is not to say we could not do the reduction 
> if we so desired -- only that it would make no sense to do so.
>

One can recognize the utility of speaking about experiences and experiencing in 
first-person terms without denying the utility of speaking about these things 
in terms of third-person descriptions that have to do with causal relations 
between experiences and brains. It's an unnecessary confusion to suppose that 
the potency of first-person talk about experience makes any difference to the 
value of third-person descriptions of causal factors.


> Somewhere in his writings he cites mud as an example: when we reduce mud to 
> water and soil, we lose the concept of muddiness, and muddiness is what 
> interests us when we use the word mud. In the case of mud, then, it makes no 
> sense to do a reduction. Likewise with consciousness. We can reduce it to 
> third-person physics (as in identity theory), but in reducing it we lose the 
> concept that interests us.
>
> -gts
>

Of course there are levels of observation and analysis. That is inherent in the 
scientific game.

And of course "mud" refers to certain features we experience in our everyday 
lives while "a mixture of water and soil" may have other applications (though, 
of course, we ALSO experience the outcome of mixing water and soil as well as 
the value of knowing what such a mixture yields).

We could, of course, also refer to the more complex molecular behavior of the 
mixture we call "mud" just as we could refer to the behavior of H2O molecules 
under certain ambient conditions as being the referent of (what we call) 
"water" or "wetness". But the referents of "H2O molecules", etc., are 
theoretical constructs, ideas at some remove from our actual experience of the 
phenomena in question. So they will have value in certain contexts but a value 
that diminishes, sometimes to the point of non-existence, in other contexts.

How useful is it likely to be, after all, to substitute theoretical descriptions 
of molecular behavior for ordinary talk that hinges on referencing certain 
features in the world that we experience as direct perceptions?

That we may use different language games to refer to different levels of 
operation in some cases doesn't preclude referring to different levels in the 
same games, too. Even Searle agrees that brains "cause" consciousness. He just 
lapses into confusion when he wants to talk about what consciousness is, as 
though the fact that it is experienced as experience (in a way exclusive to 
each subject) means that one cannot describe it in terms of constituents that 
aren't, themselves, conscious -- even if one means to talk about causes. But 
then he stumbles when he doesn't notice that if brains really do cause 
consciousness in a perfectly physical way, there is no reason, in principle, 
that computational processes running on computers cannot do so as well (even if 
there may be some empirical reason they cannot).

Searle makes much of the claim that consciousness must depend on the biological 
uniqueness of brains in that article you cited, Gordon, but he provides no 
reason to assume that there is something about brains that computers lack, 
except for his argument that computational processes running on computers 
("programs") just can't cause consciousness, though THAT assertion finally 
hinges on nothing more than the claim that we know they can't because, looking 
inside the Chinese Room (the CR), there is no consciousness evident!

But THAT only counts as a reason to believe they can't if we really need to 
think consciousness must be evident in the constituents of the CR for them to 
have a causal relation to consciousness (in the way brains or their processes 
are said by Searle to cause consciousness). And we have no such need, UNLESS WE 
ALREADY THINK THAT CONSCIOUSNESS CANNOT BE CONSTITUTED BY WHAT ISN'T CONSCIOUS. 
But then the same problem computers have applies to brains!

But even Searle agrees that brains do it! He's just stuck here in a 
contradiction that is very deep-seated in his argument.

SWM
