[Wittrs] Re: Dennett's paradigm shiftiness--Reply to Stuart

  • From: "SWM" <SWMirsky@xxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Sun, 28 Feb 2010 01:46:01 -0000

--- In Wittrs@xxxxxxxxxxxxxxx, "gabuddabout" <wittrsamr@...> wrote:
> > <snip>
> >
> > > All parallel processing can be implemented on a serial computer.  There 
> > > simply is nothing more by way of computation that can be done in parallel 
> > > that can't be done serially.
> > >
> >
> >
> > This misses the point again.

>
> No it doesn't.

Yes it does.


> It targets your point that parallel processing offers us something more 
> COMPUTATIONALLY than serial computing.


That depends on what is meant by "computationally" and "more". It seems to me you are 
confusing the question of the quality of the process (its nature), i.e., that 
it is an algorithm, with the quality of the system (consisting of many 
processes doing many different things). A massively parallel computational 
platform is still just a computer. But it has capacities that even a pretty 
darned fast serial machine doesn't.
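The textbook equivalence both sides keep invoking can be sketched in a few lines of Python (a toy illustration of my own, not anyone's formalism): a single serial thread can simulate several "parallel" processes by interleaving their steps one at a time, which is why the real dispute is about real-time capacity, not computability.

```python
from collections import deque

def make_counter(name, limit):
    """A tiny stand-in 'process': does one unit of work per step, yielding each time."""
    total = 0
    for i in range(limit):
        total += i
        yield (name, total)

def serial_scheduler(processes):
    """Round-robin: one serial thread steps each 'parallel' process in turn."""
    queue = deque(processes)
    trace = []
    while queue:
        proc = queue.popleft()
        try:
            trace.append(next(proc))
            queue.append(proc)   # not finished: re-queue it for its next step
        except StopIteration:
            pass                 # finished: drop it from the queue
    return trace

trace = serial_scheduler([make_counter("A", 3), make_counter("B", 3)])
print(trace)  # steps of A and B interleaved on a single serial machine
```

Both "processes" complete correctly despite never running at the same instant; nothing *computational* was lost, only simultaneity.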

The point is what consciousness is, i.e., is it a feature (or features) of a 
certain kind of system or is it something that cannot be reduced to that?

If it can be reduced to that in brains then at least in principle it can be in 
computers, too (even if there are empirical reasons for why it might not 
actually work -- a totally different issue than Searle's logical claim).

The thesis that a computer would have to have the same capacity(ies) as a brain 
to replicate consciousness is not a claim that brains have something that 
computers lack but that brains operate like computers in certain relevant ways 
and that a computer that can be brought to that level of operation, no matter 
how many added processors were required, would be as conscious as a brain.

But, of course, if your position is that computational processes per se are not 
capable of producing the features we call consciousness because, well, they 
aren't conscious ("nothing in the Chinese Room understands Chinese -- John 
Searle), then, of course, you will deny the possibility, as Searle does. But 
then that is dualistic, admitted or no.


> That is decidedly false and Searle's CR is equivalent to a UTM and ALL 
> possible parallel processing DEFINED IN COMPUTATIONAL TERMS is also 
> equivalent to what can be done serially with a UTM.
>

Searle's CR is specced as a rote responding device of such remarkable facility 
that it will always seem to be understanding in its replies. Aside from 
Dennett's point that this isn't even conceivable, we can still grant it for 
argument's sake. But in doing so we are left with a machine that is matching 
symbols to symbols with no understanding, by definition. But if to 
understand is to have one of the features of consciousness, then the question is 
what that feature consists of. When looked at closely, it is clear it isn't
just rote symbol matching according to mechanical rules. What's required is the 
capacity to relate one thing to many things in an ongoing cascade of 
connections. And that is excluded from Searle's CR. It's a bicycle that we're 
told can fly. But it doesn't matter what we're told. A bicycle lacks all the 
accouterments, the instrumentalities, needed for flying and so, no matter how 
many times we're told it can fly, it still can't.


> Also, I am not missing your point when you make a different one.  You 
> assimilate parallel processing in your lexicon to physical
> processes.


All computers are physical platforms and thus operate physically, i.e., have 
physical processes going on. Serial machines have it just as much as parallel 
processing machines.


> The claim is vacuous actually but you don't know it.


You just don't get it.



> To the extent it is about computation it is vacuous.  To the extent it is 
> about physical processes, Searle doesn't disagree with your change of topic.
>


So are you saying Searle agrees with Dennett's thesis that consciousness can be 
replicated on massively parallel processing computers then? Is THAT your 
position?


> No one is ever going to find that some process or other is intrinsically 
> computational.
>


We're talking, though, about computational processes, not what is "intrinsically 
computational". Everything is what it is and that is usually lots of different 
things. That there are computational processes going on in computers is 
indisputable. The issue is whether they are sufficiently like what is happening 
in brains to accomplish the same thing.


> And on the other hand, everything under the sun can be given a computational 
> description.
>
> So one can say that the stomach does information processing.
>

Yes, but so what? We're talking about brains and computers.


> The upshot of so saying is that it makes it difficult to distinguish the 
> truly mental from nonmental.
>

?


> And this is the upshot of the systems reply as a reply to Searle's CRA.
>

Dennett's response in that text I offered from Consciousness Explained, while 
longwinded, is effective.


> If, on another hand (I'm going to be Jack Handy today!), one wanted to say of 
> A (AS IN a TYPE OF) systems reply that it amounted to merely the claim that 
> nonconscious physical processes cause consciousness, then one wouldn't be 
> saying anything contradicting Searle's biological naturalism.
>

The question is what consciousness is, i.e., what is it that we are saying is caused? 
If it's just certain features produced by certain kinds of process-based 
systems running in a certain way, then there are no grounds for eliminating 
computers as a possibility for doing this in principle. But, of course, that's 
what Searle is trying to do.


> Immediately below one can see what a mess Stuart creates by dipping back and 
> forth between comments on computation and comments on physical processes 
> simpliciter. Just look:
>
> Stuart writes:
>
> "The issue is that, if consciousness is a certain kind of process-based 
> system, then you need to have all the parts in place, even if they all 
> consist of different computational processes doing different things and it 
> takes a parallel platform to do this."
>
>
> Notice the "even if" above.  As already explained, everything can be given a 
> computational description.  Therefore, while the "even if" looks like it's 
> doing some work above, we know independently that it is an idle phrase given 
> the vacuity of claiming that some physical process is intrinsically 
> computational.
>


This is an example of nonsense (alluding to a nearby discussion). No one is 
making any claims about anything "intrinsically computational". That's your 
fantasy. This is about one thing: can consciousness be produced synthetically 
on a computational platform or can it not?

While the question itself is empirical in nature, Searle attempts to give a 
logical answer in the negative. But that logical answer fails because, if 
consciousness is explainable as a process-based system operating in a certain 
way with certain capacities, then there is no reason, in principle, that a 
computationally based system of this type can't do it. On the other hand, if a 
computer can't do it because its processes aren't, themselves, conscious (as 
Searle tells us) then the same denial would apply to brains unless you hold a 
dualist position (thinking that consciousness is an ontological basic).


> The same waffling happens immediately after the above quote:
>
> Stuart writes:
>
> "That one can do each of the processes in a serial way, too, isn't the issue 
> because one can't do it all in the way that's required, i.e., by running a 
> sufficiently complex system with lots of things interacting simultaneously, 
> in parallel, using a serial platform. (PJ has argued that a really, really, 
> really, really, etc., fast system could do what a parallel system could do 
> even if we have no such system or the possibility of building one and I am 
> agnostic on that."
>
>
>
> Stuart is agnostic because he really doesn't understand the following:
>
> 1. The CR is a UTM.
>

Show how this is relevant.


> 2. All parallel processing can be done on a UTM.
>

So what? The issue is the process-based system, not particular processes in the 
system.


> 3. All parallel processing is different from serial processing only in name 
> (computationally) or not.  If not, the only extra here is in noncomputational 
> terms.
>

Actually, on the Analytic list we read an interesting paper (I've lost the link 
to it, unfortunately) which pointed out that parallel processing introduces more 
than just greater speed. By the trick of simultaneity and interactivity, it 
introduces unpredictability, which means that a sufficiently complex system may 
have the ability to program itself within certain limits in the same way brains 
do.
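For what it's worth, that point about interactivity can be illustrated with a toy example of my own (an assumed illustration, not taken from the paper mentioned above): when processes share state, the outcome depends on how their steps happen to interleave in time, so the same steps can yield different results.

```python
def run(schedule):
    """Execute the read/write steps of two incrementing processes in the given order."""
    shared = 0   # shared state both processes update
    local = {}   # each process's private copy of what it read
    for proc, op in schedule:
        if op == "read":
            local[proc] = shared
        else:  # "write": store (what I read) + 1 back to shared state
            shared = local[proc] + 1
    return shared

# Each process does: read shared, then write shared + 1.
# Run A fully, then B (a serial schedule): both increments land.
serial = [("A", "read"), ("A", "write"), ("B", "read"), ("B", "write")]
# Interleave so both read before either writes: one update is lost.
interleaved = [("A", "read"), ("B", "read"), ("A", "write"), ("B", "write")]

print(run(serial))       # 2
print(run(interleaved))  # 1
```

Identical steps, different timing, different outcomes; a fixed serial schedule never exhibits this timing-dependence.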


> So Stuart really has no beef with Searle's biological naturalism because in 
> Stuart's lexicon, parallel processing simply means the same thing as what the 
> brain does if the brain is described as a parallel processor.
>


If the brain operates as a parallel processor then Dennett is correct in which 
case a computational consciousness would be achievable. So why do you think 
Searle denies that possibility?


> Searle's point is that this claim is vacuous.


You mean if someone can build a conscious machine on Dennett's model (or even 
try and fail to do it) that is "vacuous"? How so?


>  Anything under the sun can be given a computational description, and it 
> doesn't matter (COMPUTATIONALLY SPEAKING) that one thing is a serial 
> processor and another a parallel processor.
>

You are mixing the issues up again.


> Stuart is right, however, to see that parallel processing sounds like a 
> better candidate for mirroring what the brain does because such processing 
> ____sounds____ a bit more realistic as a physical system than serial 
> processing happening with software running on hardware, what I abbreviate by 
> S/H (SH for short).
>

>
> The same confusion simply happens over and over with Stuart,


One of us is clearly confused, Budd.


> as one can see with what he immediately says after the above quote:
>

> Stuart writes:
>
> "It may, indeed, be possible to achieve synthetic consciousness on a serial 
> processor running at super-duper speed. But so what? The issue is what does 
> it take to do it in the real world and, for that, parallel processors are a 
> way more realistic option.)"
>

> Now, I'm coming to believe that Stuart simply is playing a game where he 
> doesn't care that he speaks as sloppily as he does.


You're getting insulting again. I guess you can't help it but I won't keep 
playing with you if it gets much worse.


> The options are that he really is sloppy or that he doesn't care because the 
> point is to see who can correct him best.  Notice that the way he phrases 
> things, he allows for it to be possible to create consciousness via serial 
> computation.  He distinguishes the type of consciousness by saying "synthetic 
> consciousness" by which he means AI, which, don't ya know, Searle allows is a 
> possibility without thinking that SH is a coherent candidate.
>

Searle allows "weak AI" of course, but claims that it doesn't produce real 
consciousness, only a simulation ('You can simulate a hurricane on a computer,' 
he says, 'but it won't make you wet.')

Searle also allows that it may be possible to build a machine that replicates 
what brains do but, he insists, this will not be done using computation such as 
is run on computers because of what his CR demonstrates. But, of course, all it 
demonstrates is that you can't build a bicycle and expect it to fly.


> Often, Stuart goes straight from Searle's denial that SH is a coherent 
> candidate for causing semantics or consciousness to the claim that Searle 
> must have some nonprocess-based conception of consciousness.
>


See my past arguments about the only way the CRA conclusion can be drawn.


> Now, notice that Stuart thinks parallel processing more realistic as a way of 
> thinking how the brain works.
>
> That can only be because he likens parallel processing more to physics 
> compared to serial processing.
>


Huh?


> He shifts back and forth from serial processing to parallel processing 
> because he already knows that both are equal in computational terms while 
> only one looks more like what the brain may be doing.
>

I don't shift at all. I am talking about a process-based system that requires a 
parallel processing platform, at least in real time (not superduper time). 
Nowhere have I ever suggested that a serial processor performing one process 
after another sequentially could be expected to replicate what brains can do in 
real time.


> But Searle's point is that one doesn't discover information processing in the 
> physics.
>

And Searle's point is irrelevant to the issue of what one can do with computers 
and computational processes running on them.


> Similarly, Dennett uses the intentional stance to describe levels of 
> intentionality below the level at which we have it until the bottom level is 
> all about physical processes without intentionality at all.
>
> He calls it recursive decomposition.
>
> Searle's naturalism is simply more brutal.


It's more confused on this issue of ontological basicness. That is, it's 
dualistic while imagining it isn't.


>  It is a homunculus fallacy to suppose levels of intentionality other than 
> the conscious level--unless one is so eliminative that they aren't even going 
> to try for a theory of semantics, effectively denying the second premise 
> that minds have semantic contents.  Let Dennett deny this and get away with 
> it.  It is still bad philosophy no matter how inspired by Wittgenstein.
>

Dennett doesn't deny semantics. That is simply absurd and another example of 
your missing the point. Moreover you don't understand the homunculus issue, 
based on what you have written above. It is about little men inside of little 
men inside of little men, etc., etc. Dennett's proposal, on the other hand, is 
that what we call "consciousness" occurs on a continuum and that there are 
lower levels of it which become increasingly more complex and more recognizable 
as what we call "consciousness" the higher up in the hierarchy of living 
organisms we go.


> Brains cause consciousness by way of physical processes which are not 
> computational processes.  Let Hacker call this proposition nonsense.  Who 
> cares what he thinks?
>

Now there's a great argument. Well who cares what Searle thinks? Who cares what 
you think? Don't you think you can do better than that in referencing Hacker?


> Notice how Stuart will continually invoke the notion of computation as a 
> causal notion when describing Dennett's position--as if Searle's position 
> isn't about nonconscious processes brutally causing consciousness.


Searle's position is self-contradictory.


> So Stuart is constantly trying to see Searle's position as inconsistent with 
> a "process-based" view of consciousness and he does this simply by conflating 
> computational processes with physical ones.


If the issue is about processes and what they can do, then it is no argument 
against computational consciousness to say computational processes can't 
succeed at this because they aren't conscious (don't understand Chinese). Well 
of course they aren't. That's just the point!


>  So, if one denies the first, one eo ipso denies the second.  This doesn't 
> follow.  Then Stuart invokes our ignorance.  I'll cut to the chase:
>


I invoke your ignorance? How so? Are you saying I'm not clearly stating my 
position? I'm trying to pull the wool over your eyes? I'm claiming that our 
ignorance of how consciousness really works supports my thesis? WHAT ARE YOU 
TRYING TO SAY?


> The options are two (main) species of functionalism:
>
> 1. Functionalism with an eliminative thrust (i.e., eliminative materialism) 
> is espoused by Dennett/Kim wherein we "dissolve" a la Wittgenstein/Hacker the 
> question of how the brain causes semantics/consciousness (this is a denial of 
> the second premise wherein it is stated that minds have semantic 
> contents--but many aren't that quick to notice).


You are badly mistaken. No one is denying semantic contents. The issue is what 
does this content consist of?



>  And Dennett will waffle at will,


Everyone waffles but you and Searle, eh? Poor Searle, to be stuck with such a 
defense of his position on this list.


> sometimes trying to say true things.  The above is a form of conceptual 
> dualism whereby one is an eliminativist because one finds that the 
> alternative is a nonphysical theory of mind, even though they espouse the 
> doctrine only if they do in fact have semantics and 
> really mean it.


This is barely coherent. What is "conceptual dualism" and how does it fit in 
with other notions of dualism? And what is it about Dennett's claims that 
qualifies it to be called "conceptual dualism"?



> Jaegwon Kim points out that one needs some form of eliminativism if we are 
> not to have causal overdetermination infecting our theory.  Eliminativism is 
> bewitched by conceptual dualism to the point where it seems impossible to ask 
> how the brain causes first person subjectivity.
>

What is "conceptual dualism", etc., etc.?


> 2. Functionalism with an epiphenomenal thrust is espoused by Chalmers, who 
> claims that there really are minds but they have no causal properties.  This 
> is in keeping with Kim's conceptual dualism even though he may not share 
> Chalmers' epiphenomenalism.  We have our minds in the real world, but they 
> are nonphysical and do no work for Chalmers, including helping him write a 
> book on consciousness.  All theologians are thus served notice that they have 
> been on vacation and are entirely screwed up if they hadn't noticed the 
> heaven that they're already in (looooong story).
>


What has THIS to do with the dispute over Searle's CRA?


> Searle merely claims that both are misguided.
>

But what if his CRA is misguided? Who cares what he "claims"? Make the case for 
the claim or not.


> Stuart continually argues that Searle's critique amounts to a form of dualism 
> while I claim that both species of functionalism noted above are mired by 
> conceptual dualism to begin with.
>

What is "conceptual dualism", etc., etc.?


>
> Immediately after the above quote, Stuart writes:
>
> "> If the issue were that consciousness cannot be sufficiently accounted for 
> by describing syntactical processes at work, then introducing complexity of 
> this type wouldn't matter, of course. But as Dennett shows, we can account 
> for the features of mind by this kind of complexity, at least in a 
> descriptive way (if one is prepared to give up a preconceived notion of 
> ontological basicness re: consciousness)."
>
>
> So the kind of complexity is computational.  Searle just says that the thesis 
> is vacuous.


That's not an argument and, as I pointed out above, what about it is "vacuous"?


>  And to the extent it is not vacuous but is about physics doing the grunt 
> work, it is in keeping with Searle.


Then how is it that Searle thinks he is denying Dennett and Dennett thinks he 
is denying Searle? Have you some revelation that they are really on the same 
side and, since you seem to be claiming you do, how come Searle hasn't seen it?



> But Stuart wants to paint Searle a different color.  That is because he 
> doesn't care that he is wrong to do so or doesn't understand exactly what 
> Searle's beef is.  And he can't have it both ways.
>

What if Stuart isn't wrong to do so though?


>
>
> Stuart writes:
>
> "Whether Dennett's model is adequate for accomplishing the synthesis of a 
> conscious entity in the real world remains an empirical question."
>
>
> Excuse me while I primally scream.  Okay.  Much better!  The empirical 
> question is how brains do it.  Computationalism is vacuous as such.


So you assert but I've already shown above that it is "vacuous" of you to argue 
by simply asserting that something someone else says is "vacuous". Show what 
you mean, don't just assert.


>  It is not vacuous when one insists that by "computational complexity" they 
> mean physical complexity.


We don't. We mean system complexity. ALL COMPUTATIONAL PROCESSES REQUIRE A 
PHYSICAL PLATFORM. You cannot do anything in the world unless some physical 
activity is happening. You cannot even think about doing anything if your brain 
isn't working properly and so doing what it has to do to produce that thought. 
So physicality is a given, unless and until you start imagining mind as an 
ontological basic.



>  Physical complexity, of one form or another, is the right picture for both 
> Searle and Dennett.


Indeed. But Searle is wrong in his CRA.

(This is way too long, Budd, so I'll skip your Hacker stuff from here on unless 
it is relevant to the CRA dispute.)

<snip>

> Stuart continues:
>
> "But the point is that there is nothing in principle preventing it, as long 
> as we can fully describe consciousness this way."
>
>
> There is nothing in principle which prevents fully describing anything, 
> including ghosts.


You misunderstand what I am saying with "fully describe". The issue is to 
describe something that leaves nothing out. All manner of describing is always 
possible but there is no way to "fully describe" ghosts in this sense unless 
you are delimiting the description in some way (e.g., what author X says of 
ghosts). On the other hand, if we can provide a computational type description 
of all the features we associate with what we mean by "consciousness" then that 
description is theoretically viable whether true or not.


> Some descriptions simply will invoke physical processes, including those 
> which cause consciousness.  It's not that Searle is denying an > empirical 
> possibility.


He most certainly is. See the CRA. Its conclusion is that computational 
consciousness isn't possible.


>  It is a truism that some physical processes cause consciousness, Hacker 
> decidedly notwithstanding along with those who use Wittgenstein as armor for 
> being infected by the nonsense of science.  It is simply vacuous to describe 
> the physical processes as intrinsically computational processes.
>

No one is saying computational processes are "intrinsically computational" 
whatever THAT means. The issue is what computational processes, which 
manifestly exist, can do.


> The upshot is that what you mean by computational complexity is simply 
> physical complexity.


No, I mean system complexity.


> And what Searle means by computation is covered by both serial and parallel 
> processing.
>

Then why does he think he is denying what Dennett asserts as his thesis for how 
consciousness happens? Poor Searle, he doesn't even know he is in agreement 
with Dennett on your view!


> If you insist on conflating computation with physics, then you can join Eray 
> in critiquing a philosopher he can only misinterpret.
>

Not worthy of a reply!


> I'll comment some more below.
>
>

More of the same repetitive stuff I suppose?


>
> Stuart writes:
>
>
> "So everything hinges on whether Dennett's account of consciousness as a 
> certain agglomeration of features is credible.
>
>
> Marsha, Marsha, Marsha!  (I mean, Dennett, Dennett, ah, Parrot)  The way you 
> spell things, there's no diff. between Dennett and Searle.  The way you spell 
> things, there is.


Have you informed Searle yet?


>  The reason for your contradiction is your conflation of physics and 
> computation on the one hand,

All computational processes (algorithms running on computers) are physical. 
Pretending otherwise is what's strange.


> along with your insistence that, physically speaking, parallel processing 
> seems more realistic as a theory of how the brain causes consciousness than 
> serial processing.  Computationally speaking,
> anything that can be computed in parallel can be computed in serial.


More repetition. See my replies above.

> Earlier you said that this misses the point. So the alternative is for you to 
> think that there is a physical difference between the two.  Well, nothing is 
> intrinsically a computation and that is why it doesn't matter for you also to 
> think that serial processing may also be viable.
>


No one is talking about being "intrinsically a computation". What IS your 
problem?


> You're just a mess.

You are annoying me again, Budd.

>  And maybe on purpose.  I would like to think you know better.  But assume 
> you are really all wet in your understanding of just what functionalism may 
> be and just what Searle's real drive is about, it's no big deal because you 
> are just a centimeter away from saying "Oh, I've been posilutely goofy when 
> it came to understanding Searle."
>

> Stuart writes:
>
> > To dispute Dennett you have to say his account doesn't fully describe all 
> > the features that must be present. Searle attempts this with his CRA but 
> > his attempt hinges on a conception of consciousness which requires it be 
> > irreducible (i.e., already assumes Dennett's model is mistaken at the 
> > outset) -- and yet even Searle doesn't stand by this with regard to brains, 
> > thereby putting him in self-contradiction.
>
>
> I must assume you're all wet then,


I see you still don't understand. All right, I didn't really think you would as 
you haven't over the past five lists. Why expect miracles now?


> but just by that centimeter remember.  Notice that the CRA derives from the 
> CR which derives from the target article.  In the target article in BBS he is 
> showing that a serial computer (or any UTM which can serially compute 
> anything computable in parallel!!!!) will give false positives and thus a 
> computational theory of mind can't give necessary and sufficient conditions.
>


His argument, the CRA, is not about whether the Turing Test is a reliable test 
of consciousness but about whether computers can be deemed to be conscious even 
if they pass it.


> Well, that's a long way from being in contradiction with a physicalist 
> thesis!  But Stuart may just be pretending to be all wet.  Or he's a 
> centimeter away from learning something.  Again, it's no biggie.
>
> Cheers,
> Budd
>
> Ps.  I snipped but will reply that our discussion six years ago was not at 
> the Wisdom forum.  It was at philosophy_and_science_of_language.
>
>

Oh, right. I met the founder of that list on the Wisdom Forum. There have been 
so many and most really pretty poor I'm afraid. Anyway, it would be better in 
future, Budd, if you will not repeat so much. These posts can become long 
without endlessly making the same points over and over again. Oh and try to 
keep your little insults to a minimum as I am quickly remembering why I gave up 
even wanting to respond to you.

SWM

=========================================
Need Something? Check here: http://ludwig.squarespace.com/wittrslinks/
