[Wittrs] Is Homeostasis the Answer? (Re: Variations in the Idea of Consciousness)

  • From: "iro3isdx" <xznwrjnk-evca@xxxxxxxxx>
  • To: wittrsamr@xxxxxxxxxxxxx
  • Date: Wed, 03 Feb 2010 20:09:36 -0000

--- In Wittrs@xxxxxxxxxxxxxxx, "SWM" <SWMirsky@...> wrote:


> Is this picture really all that different from Dennett's proposal
> that brains run processes in the way computers run algorithms?

Here's a quick comparison.  I'll use "A:" to prefix the AI/Dennett
view, and "N:" to prefix my view.

Information

A: Information is a naturally occurring part of the world, and is picked
up by sensory cells.

N: Information is inherently abstract, so it does not exist apart from its
construction and use by humans (or other cognitive agents). We interact
with the world in order to construct information, and we use sensory
cells in that interaction.

Core functionality

A: Computation/logic, applied to the information picked up by
sensory cells.

N: Information gathering, which I shall loosely refer to as
"measurement".

Starting point

A: Most AI people assume large amounts of innate knowledge or
structure, perhaps in the form of a program and a data base (often
called a "knowledge base").

N: Self-measurement of internal states. The system can be said to
have, as innate purposes, the maintaining of internal states within
innately prescribed limits. Among those innate purposes is a drive to
explore ways of interacting with the world, including ways of forming
information about the world.
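
A toy sketch may make that homeostatic starting point concrete. The
particular state variables and limits here are arbitrary illustrations,
not a claim about which states matter; only the structure is the point.

    # Toy homeostat: innate purposes as prescribed ranges on internal states.
    # The state names and limits are arbitrary illustrations.

    INNATE_LIMITS = {
        "energy": (0.3, 0.9),       # innately prescribed acceptable range
        "temperature": (0.4, 0.6),
    }

    def deviations(state):
        """Self-measurement: how far each internal state is outside its limits."""
        out = {}
        for name, (low, high) in INNATE_LIMITS.items():
            value = state[name]
            if value < low:
                out[name] = low - value
            elif value > high:
                out[name] = value - high
            else:
                out[name] = 0.0
        return out

    def needs_action(state):
        """The system's only starting point is this checking of itself."""
        return any(d > 0 for d in deviations(state).values())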

Learning

A: The usual AI view of learning is one of discovering patterns within
the input that is picked up. There is also some consideration of
reinforcement learning.

N: Learning is acquiring behaviors which tend to promote the ability of
the system to meet its purposes. With each new behavior, there is an
accompanying new measurement system for self-measuring performance in
carrying out that behavior. Of particular importance are behaviors
that provide ways of forming information about the external world - we
can refer to that as discovery/invention of new ways of measuring.
Note that this could be described as perceptual learning.

N: With each new way of measuring, there is an associated new concept
(that which is measured). With each new self-measurement associated
with a newly acquired behavior, there is a new purpose of carrying out
that new behavior appropriately.
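
Again a toy sketch, purely illustrative: the behavior names and scoring
rules are arbitrary, and all that matters is the pairing of each
acquired behavior with its own self-measurement.

    # Toy learner: each acquired behavior comes bundled with its own
    # self-measurement of how well that behavior is being carried out.

    class AcquiredBehavior:
        def __init__(self, name, act, measure):
            self.name = name
            self.act = act          # how to carry out the behavior
            self.measure = measure  # the accompanying self-measurement

    class Learner:
        def __init__(self):
            self.repertoire = []

        def acquire(self, name, act, measure):
            """Learning = adding a behavior together with its measurement system."""
            self.repertoire.append(AcquiredBehavior(name, act, measure))

        def self_assess(self, world):
            """Each behavior is judged only by its own paired measurement."""
            return {b.name: b.measure(world) for b in self.repertoire}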

Intentionality

A: The usual AI view is that there is nothing more to intentionality
than attribution.  That is, there is only derived intentionality.
Dennett argues for that in his "The Intentional Stance."

N: The initial self-measurement of internal states, and the consequent
initial purposes, are perhaps best considered to be examples only of
derived intentionality. However, the new measuring systems created by
the system itself are best considered to be examples of original
intentionality. In particular, information about the world that is
formed on the basis of these acquired measuring systems should be
considered intentional information.

Free will

A: The behavior of the system is determined by the input and the
mechanistic rules it is following. The system is free to choose only
in the compatibilist sense that it is free to accede to doing what the
mechanism dictates that it shall do.

N: Free will is the ability to make pragmatic choices. The options are
evaluated according to the system's purposes, and a choice is made in
accordance with those purposes. Note that there might be several
relevant purposes and some of them might be in conflict.
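
One more toy sketch of such pragmatic choice. The purposes, weights,
and options are arbitrary inventions; the point is only that evaluation
is relative to the system's own, possibly conflicting, purposes.

    # Toy pragmatic chooser: options are scored against several purposes,
    # which may pull in different directions.

    def choose(options, purposes):
        """Pick the option that best serves the purposes taken together.

        options:  {option_name: {purpose_name: how well it serves that purpose}}
        purposes: {purpose_name: weight (importance to the system)}
        """
        def score(evaluation):
            return sum(purposes[p] * evaluation.get(p, 0.0) for p in purposes)
        return max(options, key=lambda name: score(options[name]))

    # Two purposes in conflict: resting serves energy, exploring serves curiosity.
    options = {
        "rest":    {"conserve_energy": 1.0, "explore": 0.0},
        "explore": {"conserve_energy": 0.2, "explore": 1.0},
    }
    purposes = {"conserve_energy": 0.6, "explore": 0.4}
    print(choose(options, purposes))   # -> "rest" under these weights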

Regards,
Neil
