[Wittrs] Re: The System Level Issue

  • From: "gabuddabout" <gabuddabout@xxxxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Thu, 27 May 2010 21:42:59 -0000

Stuart, er, Sid Caesar, writes:

"Thus I was explaining that consciousness is not being
equated with any given computational process but with an appropriate amalgam of 
them."


If you're going to be that simple, why don't you recognize that this way of 
putting it amounts to the same dilemma:

Searle's biological naturalism is about appropriate physical processes.

Your (and Dennett's) position seems to be about appropriate complexes of 
computation.

Are you conflating physics with computation?  That is still Searle's position, 
save the bs about computation naming a natural kind.

Are you speaking of computation in the form of the 'S' in the S/H system?  Then 
it is too abstract.  So you're not?  Then...

Are you suggesting that the complex computation comes in the form of non-S/H 
systems described in brute causal terms?  Then this is not inconsistent with 
Searle's position either.

Is information processing just an "as if" way of speaking about what brains are 
doing, such that recursive decomposition exposes on/off switches at the "dumbest 
level"?  Well, if there is no real information processing going on at this 
bottom-line dumbest level, then this too is consistent with Searle's position 
that the brain is not biologically doing any information processing.  We do 
information processing given consciousness, which enhances our behavioral 
repertoire at a system level higher than the bottom-up causation that accounts 
for the causes of consciousness in the first place (eventually, pace skeptics 
like, say, the eliminativists!).  If one wants to say that at the bottom level 
there are on/off switches which amount to information processing, then one may 
just as well say that one is interpreting the brain as a digital computer.

Are those on/off switches to be interpreted in a non-S/H way, like neurons being 
turned on and off?  If so, this is consistent with Searle's position and not 
_necessarily_ a case of interpreting the brain as if it were a digital computer.

What is learned from weak AI is how to create sophisticated programs.

Conflating the sophisticated programs with sophisticated physical processes 
amounts to understanding neither the programs nor the underlying biology of the 
brain.

I imagine I have been a little simple too.  Thanks to Stuart for making it 
simple.


Cheers,
Budd







