[Wittrs] Is Homeostasis the Answer? (Re: Variations in the Idea of Consciousness)

  • From: "iro3isdx" <xznwrjnk-evca@xxxxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Sun, 14 Feb 2010 03:47:40 -0000

--- In Wittrs@xxxxxxxxxxxxxxx, "SWM" <SWMirsky@...> wrote:


> Anyway, you initially said that the missing piece, the reason AI
> can't conceivably succeed in producing conscious intelligence,
> was that it lacked homeostasis.

I don't think I actually said that. What I did say was that I ran
into stumbling blocks when investigating how AI could solve the
problems, and homeostasis turned out to provide a way past those
stumbling blocks. While I am skeptical that AI (as computationalism)
can succeed, I don't have any proof that it cannot.


> You've suggested a number of things in the course of our exchanges,
> the most recent being adaptiveness or adaptation, but you've recently
> said that none of the suggested intermediate steps (pragmatics,
> perception, adaptation) form a direct 'line' from homeostasis
> to consciousness.

I am looking at the problems that a cognitive agent must solve in a
very different way from that assumed by most AI people. It has been
hard to explain the differences and the reasons for them, because we
start talking past one another at that point.


> For the record, and just to reiterate, what I mean by "consciousness"
> is that array of features we discover in our own subjective
> experience (our mental life) that we associate with being conscious,
> having a mind. Included among these are:


> awareness
> understanding
> remembering
> thinking
> feeling
> perceiving
> intentionality (aboutness)
> intentionality (having purposes)

My approach seems to cover those. I am writing up something at the
moment, and I'll email you about it when I have filled in enough of
the details.

Regards,
Neil
