[Wittrs] Is Homeostasis the Answer? (Re: Variations in the Idea of Consciousness)

  • From: "iro3isdx" <xznwrjnk-evca@xxxxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Thu, 04 Feb 2010 03:28:37 -0000

--- In Wittrs@xxxxxxxxxxxxxxx, "SWM" <SWMirsky@...> wrote:


> But the issue that I am addressing and have always been addressing,
> even when asking you for an explication of your reason for thinking
> AI is on the wrong track, is not what cognitive agents do but how
> they come to be in a world chock full of apparently inanimate things.


But you only see that as a puzzle because of how you are looking at it.
Even the most primitive biological organism has more intentionality than
a computer will ever have.  Sure, an amoeba or a plant is not conscious.
But it is still very different from computers.


You are trying to look at everything as mechanism.  And in a way, the
greater puzzle is that you (and others) would do that.  As best I can
tell, the only actual mechanisms that exist are human artifacts. So this
whole idea of trying to reduce everything to mechanism seems foolish.



> Yes, but the issue at hand is how do we get these kinds of sentient
> agents, that is entities with a subjective point of view, entities
> that experience. It's not what THEY do but how they come to be that
> AI, and cognitive science generally, addresses.


Yet it seems to me that AI and most of cognitive science make little
effort to address that question.  What I more commonly see is people
trying to explain away that question, and to convince themselves that it
is all mechanism.  But, if you look around the world, the only
mechanisms are our own created artifacts, so it seems foolish to try to
explain everything as mechanism.



> That is, I asked you to explain what you once told me on another
> list, that the key to understanding how minds come to be (not
> came to be as in evolutionary history!) is in understanding
> the homeostasis of living systems (which, presumably, computers
> don't have).


It seems that I cannot explain it.  You do not recognize the existence
of the kind of problem that homeostasis can solve, and I have been
singularly unsuccessful in my attempts to introduce you to those
problems.
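
To at least gesture at what I mean: homeostasis, in the barest sense, is
a negative-feedback loop in which a system acts against disturbances so
as to hold some internal variable near a set point.  The little Python
sketch below is only a generic illustration of that loop (the variable
names and numbers are made up for the example); it is not an account of
how minds come to be.

# A minimal sketch of homeostasis as negative feedback (illustrative only):
# the system keeps an internal variable near a set point by correcting
# for whatever disturbances the environment throws at it.

import random

def regulate(state, set_point, gain, disturbance):
    """One regulation step: push the state back toward the set point."""
    error = set_point - state          # how far the system has drifted
    correction = gain * error          # act against the drift
    return state + correction + disturbance

state = 37.0                           # e.g. an internal "temperature"
set_point = 37.0
for step in range(10):
    disturbance = random.uniform(-1.0, 1.0)   # environmental perturbation
    state = regulate(state, set_point, gain=0.5, disturbance=disturbance)
    print(f"step {step}: state = {state:.2f}")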



> At least Dennett has an account, whether one chooses to say it can't
> work because it is premised on the abstraction of computational
> programming or not.


Whether or not it is based on particular abstractions is not what
matters.  The learning method proposed for AI is wholly inadequate to
account for human learning.


Regards,
Neil


=========================================
Need Something? Check here: http://ludwig.squarespace.com/wittrslinks/
