[Wittrs] Is Homeostasis the Answer? (Re: Variations in the Idea of Consciousness)

  • From: "SWM" <SWMirsky@xxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Thu, 11 Feb 2010 01:36:29 -0000

--- In Wittrs@xxxxxxxxxxxxxxx, "iro3isdx" <xznwrjnk-evca@...> wrote:
<snip>

>
> We disagreed on what we mean by "mechanism", so presumably we  disagree
> on what we mean by "machine."  It's not that "something  more" is
> happening.  Rather, it is that something different is  happening.
>

Perhaps. My reference in this case was to a device constructed by beings like 
us. While I agree with Searle that the brain can be called a machine and I do 
think that "mechanism" is not limited in its reference to manmade mechanical 
devices, these considerations aren't relevant to the particular use which, I 
believe, you were responding to here. Thus, I can't imagine that we are 
actually in disagreement about THAT use (that a "machine" is a manmade 
mechanical device designed and built to perform certain operations).


> I see a mechanism as a system that follows rules by application  of
> brute causal force.  If I throw a brick into the mechanism,  either the
> brick will be smashed to small parts, or the mechanism  will jam up and
> fail.
>
> If I throw a brick into a stream, the stream keeps flowing.  It  just
> goes around the brick.  It adapts to changing circumstances  in a "go
> with the flow" kind of way.  I see that kind of adaptive  behavior as
> very different from mechanistic behavior.
>

On the use of "mechanism" I have invoked, the movement of the stream involves 
some mechanism, too, in this case the way(s) in which the molecular 
constituents of the stream operate at a deeper level. But one could also speak 
of a stream's mechanism in more macro terms, e.g., by referring to its 
behavioral tendencies.

At the deepest level there is no reason to think there aren't physical 
operations (which may not even look like the operations we can observe at our 
level) that underlie and determine the behavior of things like streams and 
cells (which you bring in below).

But this seems to reveal a fundamental difference in our viewpoints: I am 
willing to consider the mechanistic underpinnings of these observed 
entities/phenomena, while I sense that you will always want to come back to a 
picture that presumes such a mechanistic description to be irrelevant.

I would say you are really importing the view we have of the phenomena in 
question on the macro level to a more micro level (the realm of atomic parts 
and behaviors). That, I would suggest, is a mistake because it presumes 
irreducibility in order to argue against the possibility of a mechanistic 
reduction. It looks to me to be an argument by stipulation, by fiat.


> A computer is a sophisticated complex mechanism that can give a
> mechanistic solution to very complex tasks.  A biological organism  is a
> complex adaptive system that can do complex tasks adaptively  and
> without being based on causal rule following.
>

As Edelman points out (though he argues that brains have way more complexity 
than computers can ever have -- a spurious argument for the case he aims to 
make, in my view), at a deep level, brains are driven by the genomic blueprints 
found in our DNA (a complex coding that he believes is far more complex at its 
core than the binary coding of computational technology). If brains are like 
that (and there is plenty of reason in biology to think they are), then the 
fact that they are, as you put it, "adaptive" can be explained as a function of 
their complexity rather than as a basic function that differs qualitatively 
from the causally driven behaviors of traditional computers. And this, in the 
end, IS the point of Dennett's argument. As you may recall from that section of 
Consciousness Explained that I transcribed onto this list in response to a 
challenge from Joe, Dennett writes "complexity matters." This is fundamental to 
his thesis and the way he explains how what you are calling "adaptiveness" 
arises.


> Sure, one can have a computer system that is pseudo-adaptive.  It is
> programmed to do many complex tasks and appears to be adaptive for  the
> circumstances for which it is programmed.  But a true adaptive  system
> is not following any strict causal rule system, and can more  readily
> adapt to a wide variety of circumstances for which there  was no
> preprogramming.
>

Given the tremendous complexity of the physical world and of our physical 
selves (including our brains), what reason have we to think that at a deeper 
level we aren't causally determined, too? By "causally determined" I don't 
mean to suggest a condition in which what we mean by "freedom" in our ordinary 
usages is precluded. Indeed, sufficient complexity would likely make us free 
enough, to all intents and purposes: even if outcomes are determined by 
physical forces, they would be beyond prediction, since we could not know in 
advance how things will turn out even if physical factors (in the broad sense 
of "physical") causally underlie the appearances.

I think that by focusing on "adaptiveness" (the mechanism you want to implicate 
in how homeostasis produces consciousness, then?), you are projecting 
behavioral characteristics from the observational level of ordinary human 
operations (the macro level) onto the micro level of the atomic and sub-atomic 
world. Perhaps, at that micro level, the usual rules we see at our level do 
break down (as quantum theory proposes). Still, it would be what happens at 
that level, however we want to characterize it, that would drive and determine 
what happens at our observational/operational level. Supposing that the 
"adaptive" behaviors we see in cells or waterways are the same sort of thing 
("adaptive") at the micro level strikes me as the mistake here, even though it 
is possible that something like "quantum mechanics" is not "mechanics" in the 
usual sense of the latter term. Still, it would be a kind of "mechanics", and 
discovering the dynamics of its operation would be important to understanding 
how behaviors are manifested by phenomena at increasingly "higher" levels.


> The logic chip is, in some sense, an elemental mechanical system.
> Similarly, the homeostatic process is an elemental adaptive system.
>

If the homeostatic system's adaptive behavior is a function of the operating 
mechanics of its constituents, which is hardly an unreasonable supposition 
given what we know of chemistry and physics, then there is no reason to presume 
that "adaptiveness" is a stand-alone or otherwise basic competitor of "caused 
behaviors". Indeed, it could be as easily described, maybe more reasonably 
described, as an outcome of particular causal interactions.
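To make that point concrete (this is my own toy illustration, not anything 
Neil proposed): even a trivially simple homeostat, driven by one fixed causal 
rule, will "adapt" to disturbances it was never specifically programmed for.

```python
# A minimal sketch (my illustration): a homeostat whose "adaptive" behavior
# is nothing but a fixed causal correction rule applied over and over.

def homeostat(value, setpoint, gain, disturbances):
    """Repeatedly nudge `value` toward `setpoint` by a fixed causal rule
    while external disturbances push it around. Returns the trajectory."""
    history = []
    for d in disturbances:
        value += d                          # environment perturbs the system
        value += gain * (setpoint - value)  # fixed corrective rule, no "choice"
        history.append(value)
    return history

# The system copes with arbitrary perturbations, yet every step is caused.
trace = homeostat(value=20.0, setpoint=37.0, gain=0.5,
                  disturbances=[5, -3, 10, -8, 0])
```

The "adaptiveness" here is just the causal rule seen from the outside, which 
is all my argument above requires.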


>
> > Thus far it seems to me that those who, like Searle, insist on the
> > first person picture over everything else simply have no real answer
> > and make no attempt at a real answer.
>
> I agree that Searle has no real answer, and no great interest in
> finding one.  He thinks that's the job of scientists.  But it seems  to
> me that Dennett has no real answer either, though he thinks that  he has
> one.
>

Yes, here we differ though I want to stress that I am not claiming that Dennett 
is arguing for the truth of his thesis. It seems to me that all he is doing is 
saying THIS is a better way to account for the features we associate with 
consciousness and, if so, it remains to be refined, implemented and tested in 
an appropriate scientific regimen (such as the right AI project), subject to 
feedback from such an inquiry and any needed adjustment.

Yes, Dennett thinks his answer is right but then who, having developed an 
answer in which he or she has confidence, doesn't think he or she is right? It 
would be odd not to! Dennett, of course, does see his role as complementary to 
that of the scientists and, indeed, as spilling into the scientific domain on a 
theoretical level. Searle is more clearly the philosopher in these matters, 
i.e., he stands apart and invokes argument and logic to demonstrate the truth 
of his positions.


>
> > Hawkins view is that each neuron performs a fairly simple, repetitive
> > algorithm but that when organized together in complex arrays, they
> > work in unison to produce the complex pictures of the world that
> > we actually get from the inputs we receive all the time.
>
> I would suggest that a neuron is adaptive, rather than algorithmic.
> That is, it is not actually following any rules.
>

In keeping with what I've already said, it seems to me that the distinction you 
are making is wrongheaded. Whatever is adaptive is so because of its underlying 
mechanisms which are describable as algorithms (sets of procedural steps). Not 
all algorithms are matters of formal calculation (though it may well be the 
case that all are reducible in some fashion to such a way of expressing them). 
Thus, for a neuron to be "adaptive" is for it to have a certain set of 
mechanisms that are expressed in what we would call adaptive behaviors.
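A crude sketch of what I mean (my own illustration, not Hawkins' or Edelman's 
model, and the threshold and learning rule are invented for the example): a 
"neuron" that strengthens its connections when it fires is perfectly 
"adaptive", and yet it is nothing over and above a short procedural rule.

```python
# A minimal, made-up neuron: fire when weighted input crosses a threshold,
# then strengthen the weights of the active inputs (a Hebbian-style rule).

def step(weights, inputs, rate=0.1, threshold=1.0):
    """One 'adaptive' update. Returns the new weights and whether it fired."""
    activation = sum(w * x for w, x in zip(weights, inputs))
    fired = activation > threshold
    if fired:
        # the only "adaptation": a fixed procedural adjustment
        weights = [w + rate * x for w, x in zip(weights, inputs)]
    return weights, fired

new_weights, fired = step([0.6, 0.6], [1, 1])  # fires, weights strengthen
```

Whether we call this "following a rule" or "adapting" is a matter of the level 
of description, which is just the point at issue.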

>
> > I would ask the same question: what are boundaries?
>
> To a first approximation, a sharp transition in the signal received
> when scanning crosses that boundary.
>
> Regards,
> Neil
>
> =========================================

Isn't sharpness relative, along with most everything else? Where we draw the 
line may depend on many factors, including sensitivities, contexts, etc.
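To illustrate the relativity I have in mind (a toy example of my own, not 
Neil's proposal; the signal and thresholds are invented): a "sharp transition" 
detector finds different boundaries depending entirely on the threshold we 
happen to choose.

```python
# A minimal boundary detector: mark positions where the jump between
# adjacent samples exceeds a chosen threshold.

def boundaries(signal, threshold):
    """Indices where |signal[i] - signal[i-1]| exceeds the threshold."""
    return [i for i in range(1, len(signal))
            if abs(signal[i] - signal[i - 1]) > threshold]

scan = [0, 0, 1, 1, 1, 9, 9, 10, 10]
coarse = boundaries(scan, threshold=5)    # only the largest jump counts
fine = boundaries(scan, threshold=0.5)    # smaller transitions count too
```

Same signal, different "boundaries" - which is why I say sharpness is relative 
to our sensitivities and contexts.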

Anyway, and in keeping with my question, is this the breakdown of the 
underlying relations, relative to how we get consciousness, that you want to 
give:

Homeostasis produces Pragmatic Selection produces Perception produces 
Adaptiveness produces Consciousness?

SWM
