[Wittrs] Re: Consciousness in Three Lists

  • From: "gabuddabout" <gabuddabout@xxxxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Thu, 13 May 2010 22:14:19 -0000

I'm sorry.  I couldn't resist.  Let me know if I'm muddled or whether you
enjoy not distinguishing functional properties from 1st-order properties:

--- In WittrsAMR@xxxxxxxxxxxxxxx, "SWM" <wittrsamr@...> wrote:
>
> In response to Joe, here is one version of the processes/functions I suggest 
> would need to be in place for a more robust CR to actually achieve 
> understanding (as in the kind of understanding we recognize in ourselves).


I claim that it is a red herring to say the CR is underspecced.  This claim
misses the whole point of the CR.  The CR is already as robust as a universal
Turing machine--you can in principle add any amount of syntactic complexity to
it without necessarily generating semantics, because 2nd-order properties are
supervenient on 1st-order properties.  So you will have to add the right sorts
of 1st-order properties, not 2nd-order properties.  And it is no argument
against Searle that we don't know how brains do it.  It is simply incoherent to
suggest that 2nd-order properties add causal weight to nonS/H or S/H systems.
That last point is the upshot, without a three-step proof, btw.  But it only
works if you understand it.  It sounds to me as if it might be a good tactic to
pretend not to understand the third premise and go back to focusing on issues
of a priorism, conceptual truths, and "truths" by definition.
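
To make that syntax point concrete with a toy sketch of my own (nothing in
Searle or in SWM's lists; the rulebook entries are invented for illustration),
here is the whole trick of a rule-follower: strings of marks go in, rules map
them to strings of marks that come out, and piling on more rules of the same
kind only adds more syntax:

    # Toy "Chinese Room" rule-follower: pure symbol manipulation, no semantics.
    # The rulebook below is invented purely for illustration.
    RULEBOOK = {
        "你好吗?": "我很好，谢谢。",
        "今天天气怎么样?": "天气很好。",
    }

    def room_reply(squiggles: str) -> str:
        # Match the incoming marks against the rulebook and hand back the
        # marks the rules dictate.  Nothing here "means" anything to the
        # program; adding a million more rules adds more syntax, not semantics.
        return RULEBOOK.get(squiggles, "请再说一遍。")

    print(room_reply("你好吗?"))  # a well-formed reply produced by rote lookup

However many such rules you stack up, all you have added is more of the same
2nd-order, syntactic stuff.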

Just to be clear, though, it is no comeback to say that some programs need more 
sophisticated hardware in order to run.  Once one has the distinction between 
software and hardware, one has a system spelled out in such a way that it does 
what it does only given the 2nd-order nature of the functional properties in 
question.  We got good at making these gadgets for a reason.  Yeah, yeah!

You can't add 1st-order properties to 2nd-order properties as if you thereby
get more 1st-order properties.  And add as many 2nd-order functional properties
as you like to any nonS/H system and you will not increase the brute causality
of the system.

So it is unclear whether you really think that you're selling strong AI or 
something that in principle Searle isn't arguing against.


Cheers,
Budd

