--- In Wittrs@yahoogroups.com, "iro3isdx" <xznwrjnk-evca@...> wrote:

> --- In Wittrs@yahoogroups.com, "SWM" <SWMirsky@> wrote:
>
> > On the use of "mechanism" I have invoked, the movement of the stream
> > involves some mechanism, too, in this case the way(s) in which the
> > molecular constituents of the stream operate at a deeper level. But
> > one could also speak of a stream's mechanism in more macro terms,
> > e.g., by referring to its behavioral tendencies.
>
> I think you are mostly confusing yourself here.
>
I suppose it wouldn't be the first time!
> The point is that we make our machines to follow our rules, and to
> resist external influence. And sure, the resistance is not unlimited,
> and a strong enough external influence can change it. So trains can
> derail, but not easily.
>
> Adaptive things are far more sensitive to small changes in the
> environment.
>
My point is that "adaptive" is not a basic function in the way a causal relation is. I think your confusion lies in somehow equating the two and then saying adaptation is the thing that's needed. But adaptation can probably be better explained as a function of causal complexity. If so, it cannot solve your problem.
> In some sense, we can be conscious to our world because we are
> sensitive to small changes in our world. The computer is unconscious,
> and essentially solipsistic, because it is largely oblivious to small
> changes in the world.
>
Again, the point of a Dennettian-type model is to say that complexity enables computers to rise to the level of behavior we find in living organisms. It remains to be implemented and tested, of course. But there's no sense arguing against the possibility on the grounds that living organisms operate differently from the current crop of computers. Of course they do! The issue is whether there is something about living organisms that can be replicated in computers.
>
> > If the homeostatic system's adaptive behavior is a function of
> > the operating mechanics of its constituents, which is hardly
> > an unreasonable supposition given what we know of chemistry and
> > physics, then there is no reason to presume that "adaptiveness"
> > is a stand-alone or otherwise basic competitor of "caused behaviors".
>
> I'm not sure what point you are making there. I have never suggested
> that homeostatic systems are exempt from causation.
>
>
If homeostasis leads to adaptation, as you suggest, and causality lies at the bottom of homeostasis, then there's no reason to suppose that causally driven computers cannot also achieve the kind of behavioral adaptation that living organisms achieve.
> > In keeping with what I've already said, it seems to me that the
> > distinction you are making is wrongheaded. Whatever is adaptive is
> > so because of its underlying mechanisms which are describable as
> > algorithms (sets of procedural steps).
>
> I challenge you to accurately describe the adaptiveness in terms of
> algorithms.
>
The issue is to develop algorithms that enable adaptation. You have a conception of this that seems to hold that a computer can only be built to do exactly what is programmed into it, and that does pretty much describe what we expect of most computers today. But the point of AI is to develop algorithms that do, in fact, adapt. Minsky offers a whole slew of proposals in The Emotion Machine. Hawkins says the way to do it is to implement a relatively simple algorithm in a chip and then combine those chips in a complex array, along the lines of how neurons are arrayed in brains. In either case, algorithms are at the bottom of what is meant to be achieved.
You are arguing that efforts like these are foredoomed because algorithmically driven processes lack the capacity to adapt. But the point is to look at Dennett's model and note that it hinges on complexity. ("Complexity matters," he writes.) A sufficiently complex system would have the modular tools to deal with unanticipated inputs in new ways. We have seen on the Analytic list (I forget the exact reference, unfortunately) how at least one writer argues that introducing parallelism (which Dennett envisions as the way to achieve the necessary level of complexity) introduces uncertainty, the possibility of new, unplanned-for outcomes.
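To make the point concrete: even a trivially small program can change its own behavior in response to its inputs while following nothing but fixed causal rules. What follows is a minimal, purely illustrative sketch (the classic perceptron learning rule, not Minsky's or Hawkins's actual designs), showing an algorithm whose rule never changes but whose behavior does:

```python
# A fixed rule that nonetheless "adapts": a single artificial
# neuron nudges its weights toward whatever reduces its error
# on each example it encounters. Illustrative sketch only.

def train(samples, epochs=20, lr=0.1):
    """Learn a 0/1 classification from (inputs, label) pairs
    by the perceptron update rule."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = label - pred        # how wrong was the rule?
            w[0] += lr * err * x1     # adjust behavior causally,
            w[1] += lr * err * x2     # in response to the input
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

# Trained on examples of the logical AND function, the very same
# code ends up classifying correctly without AND ever being
# written into it explicitly.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(data)
```

The deterministic update rule is the "mechanism"; the adaptation is just what that mechanism does over time, which is the sense in which adaptiveness can be a function of underlying causal operations rather than a competitor to them.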
If living systems are algorithmic at a genomic level too, then even their adaptational capacity is causally grounded.
>
> > Anyway, and in keeping with my question, is the breakdown of the
> > underlying relations, relative to how we get consciousness, that
> > you want to give the following then:
>
>
> > Homeostasis produces Pragmatic Selection produces Perception produces
> > Adaptiveness produces Consciousness?
>
> No, that's far too simplistic. Homeostasis provides a way of making
> pragmatic judgments, but is not necessarily pragmatic on its own
> account. Pragmatic judgment provides a way of making the decisions
> needed to construct a perceptual system, but pragmatic judgment does
> not necessarily lead to perception. Perception is a requirement for
> consciousness, but perceiving systems are not necessarily conscious.
>
> Regards,
> Neil
>
============
So what is the feature that produces what we recognize as consciousness? The last element you gave us was "adaptation". Is that the critical feature or step? If it isn't, what is, and how does it relate to the underlying importance of homeostasis, which is what you originally told me was the key?
SWM
Need Something? Check here:
http://ludwig.squarespace.com/wittrslinks/