[Wittrs] Amplifying - Re: How Does Causal Reduction Entail Ontological Reduction?

  • From: "SWM" <SWMirsky@xxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Thu, 01 Apr 2010 03:36:08 -0000

In responding to Joe in one of these threads, I gave short shrift to one of his 
points which, on reconsideration, I think warranted a good deal heavier shrift:

> --- In Wittrs@xxxxxxxxxxxxxxx, Joseph Polanik <jPolanik@> wrote:

> >I said some causally reductive explanations ARE also ontological
> >descriptions and that Searle simply misses that when he makes the
> >distinction between causal explanations and ontological descriptions.
> Searle says that consciousness is causally reducible to the brain. what,
> is that an ontological description of?
> Joe

My Reply:

It is if the point is to track back the phenomenon to what it consists of, in 

I should have offered something more, along these lines:

One of Searle's problems is that he isn't very clear about what he is saying 
consciousness causally reduces to in brains: brains themselves, or brain events 
and, if the latter, what those events consist of. Needless to say, a genuinely 
causal reduction to brains will reduce what we mean by "consciousness" to 
something along the lines of brain processes. But Searle's failure to give an 
account of what he thinks these processes consist of, beyond being a function 
of biological activity in brain cells, etc., leaves the question of the 
reduction open.

In fact, Searle seems to rely on this vagueness to avoid coming to grips with 
his assertion about consciousness that, whatever it is computers do, it can 
never conceivably be sufficient. He does try to draw a distinction between the 
abstractness of computation and the concreteness of brains but, of course, 
computers are just as concrete in their implementation of programs (just as 
brains are in their implementation of DNA "programs").

The problem with Searle's account is that it fails to be specific enough to get 
at the possibility that what he means by "intentionality" or "understanding", 
or any of the other features we commonly associate with consciousness, is a 
special class of "properties" that attach to certain physical events and not 
others (as Walter put it on the other list). The question then is whether these 
"properties" are like what we mean by "system properties" (a function of a 
complex interplay of multiple processes in an overarching orchestration of 
events), since it is perfectly conceivable that every process is, itself, a 
system, too. Or does he have in mind that these properties are simply 
irreducible to anything more basic than themselves, in some mysterian-like way? 
(Walter at one point stated that that was his position.)

If one holds the latter view of these so-called "process properties", then one 
is back to a dualist scenario, since we are saying there are two classes of 
physical phenomena: physical properties and these other things, which we may or 
may not want to call "mental properties" (but which, it would be fair to say, 
is what they would have to be).

My view is that Searle never comes to grips with the implications of his overly 
vague account of consciousness being causally reduced to brains. If he did, he 
would, I think, have been forced to see the contradiction inherent in an 
account of consciousness as being a system property in brains (about which he 
is vague, perhaps intentionally so) but not in computers (to which, though 
failing to articulate it precisely, he seems firmly committed as shown in the 


Need Something? Check here: http://ludwig.squarespace.com/wittrslinks/
