[Wittrs] Is Homeostasis the Answer? (Re: Variations in the Idea of Consciousness)

  • From: "jrstern" <jrstern@xxxxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Thu, 04 Feb 2010 03:46:08 -0000

--- In Wittrs@xxxxxxxxxxxxxxx, "iro3isdx" <xznwrjnk-evca@...> wrote:
>
> > But the issue that I am addressing and have always been
> > addressing, even when asking you for explication of your
> > reason for thinking AI is on the wrong track, is not what
> > cognitive agents do but how they come to be in a world chock
> > full of apparently inanimate things.

Is this a special case?

Do you worry about how cats come to be in a world chock full of non-cats?
Hot things in a world chock full of cold things?


> But you only see that as a puzzle because of how you are looking
> at it. Even the most primitive biological organism has more
> intentionality than a computer will ever have.  Sure, an amoeba
> or a plant is not conscious. But it is still very different from
> computers.

Different, or more?

Even Searle says computers have intentionality -
derived intentionality.

So does Fodor, btw, not derived but vanilla, but only by
invoking his dual-aspect business.


> You are trying to look at everything as mechanism.

Perhaps mechanism itself is more than previously assumed.

One has to watch all the Wittgensteinian issues here, about
reifying every word.


> Whether or not it is based on particular abstractions is not what
> matters.  The learning method proposed for AI is wholly inadequate
> to account for human learning.

Fodor, not to mention Chomsky, assumes innate factors.

Myself, I haven't worried much about learning; I'm still trying
to figure out how something, once learned or innate,
could possibly work at all.

Josh
