[Wittrs] Re: On the Mechanism of Understanding

  • From: kirby urner <kirby.urner@xxxxxxxxx>
  • To: Wittrs@xxxxxxxxxxxxxxx
  • Date: Fri, 14 Aug 2009 16:35:27 -0700

On Fri, Aug 14, 2009 at 1:39 PM, Stuart W. Mirsky<SWMirsky@xxxxxxx> wrote:
>
>
> --- In Wittrs@xxxxxxxxxxxxxxx, kirby urner <kirby.urner@...> wrote:
>>
>> >> telepathy
>> >> (心靈感應, 傳心術) (getting this
>> >> Unicode OK?)
>> >>
>>
>> Yikes.

Yikes again!  Looks even worse on this next pass!

>> And I'm saying that whole line of approach is a dead end based on
>> philosophical confusions, that so-called "cognitive science" as
>> represented here by Edelman and Hawkins is little more than
>> refurbished mind-body dualism from the Victorian era, popular in some
>> ethnic neighborhoods, mostly anglophone.
>>
>
> So you are "saying" that there is no possibility that science could someday
> produce a machine that has consciousness, has a mind?
>

We're both ostensibly English speakers, but when you get two
philosophers talking, that's no guarantee of any kind of shared
understanding I'm afraid.

I have a long track record of doing little diagrams like this, going
back to East Side Bus Terminal days, where I specialized in security
and logistics:

[ mind principles angle ]
--
[ brain specialcases frequency ]

i.e. I've already got those words in play, and don't wanna force myself
onto new tracks with 'em, or obey rules made up by people I don't
consider authoritative or qualified.

> And this is because why?
>
> Minds are special and stand apart from what is physical (Chalmers,
> Strawson)?
>
> Brains aren't computers (Edelman, Searle)?
>
> Minds aren't based in physical processes (Searle, though he doesn't quite
> admit this because he acknowledges minds are produced by brains[!] though he
> is never quite willing to hazard a guess as to how)?
>

I'm not inclined (biased) to use "mind" in the plural very often.

I'm thinking of brains as one half of a phone conversation
(environment the other half) with the goal being to preserve some kind
of homeostatic / dynamic integrity, the doing of which requires
continual fine-tuning, upgrading of one's thinking, adjustments in
meaning.

"Brain in body, body in mind" is what some people say.

I'm sure that sounds like idealism to your ears, but for me it's more
Gregory Bateson in flavor (Steps to an Ecology of Mind) i.e. "mind =
ecosystem" (many brains).  [see below re ecology, forest floor etc. ]

"Holy Ghost" (mentioned in my lightning talk on the Mite) doesn't
sound scientific enough to most people (two strikes against it:
"holy" and "ghost") but the word "Zeitgeist" has street cred even in
Springer-Verlag circles.

A translation of telepathy from Chinese might be "attuned to the zeitgeist".

Various people around the globe were obsessed with inventing
television around the same time.

Mood swings take over a whole nation, a stadium, a mob.

These group or community phenomena have to do with "mind" in my
language whereas I said earlier I'd be OK with an individual
"consciousness" that a single brain is involved with.

I did some work to tease apart "mindful" versus "conscious in a
medical sense" -- work I'm going to build on, not undo.

One may be conscious, yet mindless (as when in a blind panic).

You might say that's not your point, that you're talking about "minds"
in a more conventional sense the way Edelman and Hawkins talk about
'em, i.e. in the (one true) scientific way, the only reasonable
sensible way, the way that any sane thinking person must accept if
wanting to participate in civil discourse with right-thinking others.

I'd counter that the "cog-sci way" of talking (about minds) is far
enough off the beaten path in one direction, that I feel I owe no
apology for branching equally far in another direction i.e. neither
cog-sci nor my stuff is all that "mainstream".

Nor am I willing to concede that Gregory Bateson was less of a
scientist than Edelman say, just because he talked about an ecology of
mind.

Regarding "conscious machines" I don't know for sure how future
generations will choose to talk about their inventions, so let's just
leave it at that:  I don't know.  Some subcultures or ethnicities may
already talk in these terms for all I know.

Even today though, I'd go ahead and grade qualities of consciousness,
like a butcher grades meats.

Some brands of consciousness have a lot of fat in them.
Wittgenstein's was lean and mean (yummy).

In this language game (which I just invented), it's OK to "eat" (as in
grok) another consciousness, but it's a kind of "copying" (not an
aggressive act, more like reading a book, not damaging to the
consciousness we consume (feed off, get a charge from)).

Will machines ever have minds?  No, because I don't really think of
humans as "having minds" either. Likewise they *are* conscious more
than they *have* consciousness.

Brains are not conscious, people are (other animals, sentient beings).

People are mindless versus mindful.  Maybe use that spectrum to grade
the level of consciousness they express.

Two axes then, with the mind axis (Y) reading out the level of
consciousness (X).  The medical axis for sleep to woozy to semi-awake
to sharp could be a 3rd axis yet (Z).

So picture an XYZ cube that's { awake * conscious * mindful } (each
edge like a ruler, dot moving around inside, changing position in this
"phase space", a very traditional way of thinking here).
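Being a Python geek, I'd sketch that cube as a toy data structure, something like the following (the axis names and the 0-to-1 grading along each edge are just my assumptions for illustration, not anyone's official model):

```python
from dataclasses import dataclass

@dataclass
class State:
    """A dot inside the { awake * conscious * mindful } cube.

    Each attribute is one axis, graded 0.0 to 1.0 like a ruler
    along one edge of the cube (a point in "phase space").
    """
    conscious: float  # X axis: level of consciousness
    mindful: float    # Y axis: mindless ... mindful
    awake: float      # Z axis (medical): sleep ... woozy ... sharp

    def clip(self):
        # keep the dot inside the cube (each coordinate in [0, 1])
        return State(*(min(1.0, max(0.0, v))
                       for v in (self.conscious, self.mindful, self.awake)))

# conscious, yet mindless -- as when in a blind panic:
panic = State(conscious=0.9, mindful=0.1, awake=1.0)
```

The point of the clip method is just that the dot moves around but never leaves the cube, the way a grade never goes off the ruler.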

Humans partake of a shared collective Unconscious (Jung) or super-Self
or zeitgeist (noosphere -- de Chardin), and their brain-hosted
consciousness may be graded on its ability to surf in that shared
cyberspace (steering space -- cyber = steer).

Really talented advertisers with a strong handle on the collective
psyche are like top grade "beef brains" whereas your average
cave-dweller in a cube is maybe low on the totem pole where
consciousness is concerned (an anemic metaphysician).

Machines already serve as consciousness amplifiers, i.e. brains not
assisted by prosthetic devices such as the Internet and television are
"dim bulbs" relatively speaking, vis-a-vis various measuring criteria
(which criteria?).

On the other hand, someone cosmopolitan and highly mindful, left on a
desert island, may break down completely in short order, start talking
to a soccer ball (Wilson!... http://www.imdb.com/title/tt0162222/ ).

> Do you want to say where the "dualism" is in a notion that minds are not
> synthesizable? Isn't the reverse view (for any of the reasons shown
> immediately above) more consistent with a claim of dualism?
>

I don't want to learn to use the word "mind" the way you've schooled
yourself to use it.  We're in different schools.  Life is
short.

That's what's fun about lists like this, we meet exotic people who
speak in quasi-unintelligible ways, or in ways I can't believe in.

>> Or put it a different way: there's a family resemblance in usage
>> patterns, among folks who deploy words from the English dictionary to
>> printed media, but if you do more than scratch the surface, you'll see
>> we're more like composers playing the guitar differently, even if
>> using the same chords.
>>
>
> Does that mean computers might be an appropriate platform for a mind?
>

In your book, maybe so.  I'm not going to back seat drive how you
talk, as I'm not in your mental car.

However I will keep insisting I have a right to share the road and
won't concede that Edelman or Hawkins are any more "scientific" just
because their brand of science fiction is more popular in some zip
code areas, with some ethnic minorities.

> There's certainly a role for interconnectedness but I think there's no
> reason to believe a person in isolation would cease to be able to think
> coherently after a certain period of time.
>
>> Most Robinson Crusoe types don't find a man Friday and become
>> dysfunctional within weeks, because not connected to other brains (the
>> only way they're able to work effectively). Untouched and unloved
>> babies die or fail to mature.
>>
>
> Babies are one thing as there are developmental issues. But adults quite
> another. What evidence do you have that most Robinson Crusoe types "become
> dysfunctional within weeks"?
>

I agree with you that an adult with many years of schooling is going
to decay (break down) less quickly than a bunch of boys might.

A self-disciplined adult might do fine for a long time, only lose it a
little, whereas the boys might end up... well, you read the book
probably, or saw the movie.

http://www.imdb.com/title/tt0100054/

http://www.youtube.com/watch?v=Pd_laqDfiOk
(one boy one brain, two boys half a brain, three boys no brain)

>> A single ant does not explain the "thinking" of the ant colony, any
>> more than a single brain explains the "stream of consciousness" we
>> associate with civilization -- it really does take a village, at
>> least.
>>
>> Gotta have TV to have Kirby-style or Katie-style consciousness, brains
>> alone won't ever do it.
>>
>
> That's too bad!
>

Oh I dunno.

The quality of one's conscious awareness is a product of the times
and civilization, twas always thus.

I'm a geek and need a lot of technology to sustain my stream of
consciousness at the level to which I'm accustomed (relates to living
standards).  You could transplant me to a small village with no
Internet, but I'd have to undergo a rather severe change in
consciousness in that case, might go banging on the shaman's door
asking for ER services (got DSL?).

I admit I have dependencies.  I'm not equally functional in all contexts.

Put me on a desert island and I'm probably dead meat pretty quickly,
unless there's a resort hotel in the vicinity.  Give me some relevant
training first, and I might last a lot longer.  I'm a pretty fast
learner, but "desert island living 101" hasn't percolated to the top
of my stack at any point.

>> There's a large body of mythology developed around this notion of
>> "consciousness" and how it relates to "the brain".
>>
>
> There's also a large body of theory. What makes one man's theory another's
> mythology? And how do we know when a theory is really a theory and not just
> a mythology and vice versa?

These are good questions worth a lifetime of study.  Have at it!

>> You think it's me with the confusions. That's OK with me. I think
>> you have little choice given your allegiance to those science fiction
>> writers you favor.
>>
>
> I don't read science fiction and haven't since high school which, for me,
> was a long time ago!

See, we operate in different namespaces!  Good example.  Great
dramatic foiling, great dialog.

Excellent!

<< applause machine >>

You've been quoting all this science fiction, insisting it's science.
Neither of us will "bend our own rules".  That's fun.

>> Sounds like, but I call it a bias, not a prejudice.
>>
>
> Bias is in favor of, prejudice the opposite.
>

For me, bias is more like Wittgenstein's "inclination".  I can be
biased for or against whatever.

Prejudice means "pre judge" as in "reflexively without thinking".

Remember how Wittgenstein says language requires, queer as this may
sound, agreement in judgments?

This is what he was talking about, right here!

>> I'd have to "suspend my disbelief" to get into the mood
>> for such thinking.
>>
>
> In the end it's a scientific question. Simply niggling over the language
> doesn't change that.
>

I'm not sure it's a scientific question.  Might be more a matter of
who manages to recruit the most talented students, a kind of
competition.

"Niggling" isn't what people customarily do around language when deep
schisms emerge (grammatical divides).

A contradictory discourse, in a small village, may result in
banishment, bodily harm to the witch in question.  That's not just
"niggling".

People go to war, spill blood, over whether God is a Trinity or just
the one Dude, ridiculous as that may seem to civilized thinkers.

Remember how different grammars boil down to different forms of life
in Wittgenstein, not justified by reasons because reasons come to an
end?  You remember that part of our lore yes?

Could there be a science that only makes sense within the confines of
a specific geographic area, such that outside that region there's
simply no point even practicing this science?

That sounds wrong at first, but then it's true of many lab sciences.
Take away the lab and you lose the quality of "consciousness" you'd
need to continue with the work (the tools are gone).  Same as with
depending on television.  Mindfulness would be the variable here
(you're still just as medically awake, but you've been disempowered,
like put in a zoo cage, imprisoned).

When Applewhite surrendered his vast filing system to Stanford, he
said it was like losing half his brain.  From a medical science point
of view, I'd say this was more than a mere metaphor (it's a figure of
speech with biological meaning -- take a scholar away from her books,
and you've committed quasi-bodily harm, hurt her consciousness, might
be experienced like a kick in the stomach (a knot of anxiety)).

It's not weakness to admit dependencies, but honesty.  Making "the
brain" carry all the weight is unfair.

To think the way we do, to have these trains of thought such as here
on this list, we need a sprawling civilization, vast libraries,
computers.  The individual brain should not be saddled with hosting in
the sense of sourcing an entire civilization!  It's a myth that we
each carry enough "mental DNA" to replicate what we have now, were it
to be taken away from us for some reason.

The brain may *use* what it takes in, but it didn't *create* the
language it's using (co-developing our language took more time than it
took to build the pyramids, the cathedrals) -- if we wanna talk that
way at all (kinda quirky, like since when did brains use language --
next thing you know we'll have "talking toes").

http://www.youtube.com/watch?v=6o7pk0DP9BY  (I really love Youtube, a
dream come true for me)

>> It'd be fun to write an AI-bot that spits out reams of "cog-sci" of
>> the type these guys write. Wouldn't be that hard.
>>
>
> You've read Edelman and Hawkins then?
>

Just the quotes here.  Sampling.  Lots of CDs in the CD store so can't
always do lotsa tracks.

>> Let me be an arrogant bastard from Princeton, Rorty my thesis adviser
>> (wrote on Wittgenstein): I understand Wittgenstein very well, much
>> better than most of my competition.
>>
>
> And yet, as you rightly point out, we are often not the best judges of our
> own competence. But I take it you are invoking your thesis adviser's
> authority here as certification.

In part that, but then I understand Wittgenstein better than Rorty did
(strut puff).

I'm not the best judge, you're right.

> Malcolm actually studied with Wittgenstein and was a good friend of his and
> yet I thought he got certain things wrong. Waismann worked directly with
> Wittgenstein and yet Wittgenstein thought he got him wrong, too.
>

Dr. Haack and I were making fun of these first generation
Wittgensteinians over dinner awhile back.  They'd all stroke their
chins and gesture the way he did, ape his way of talking (OK, not
all). She knew Rorty personally as well.

> Are you saying Wittgenstein would have claimed, as Searle does, that it is
> impossible to reduce mind to non-mindlike constituent processes as part of a
> scientific analysis of what brains do to produce minds?

There's no rule or law that compels me to render a judgment contrary to fact.

Given Wittgenstein had no control over the design of this cog-sci
language game, he shouldn't be required to run in that maze.

That'd be like making up a new board game on a 64-square board, then
phoning a chess master to break a tie.  Chess master:  "this isn't my
game, I don't have to decide."

That sounds like a cop out, but it's the best I can do under some time
pressure.  I'm on the clock.

>> You've dismissed responding re telepathy I notice.
>> I assume that was
>> intentional and not just an unthinking reflex or the result of mental
>> blinders.
>>
>
> Yes.
>

> Later that evening my wife took me aside and said the psychologist had
> confided to her that he was sorry to say that I really didn't know the first
> thing about Wittgenstein but he didn't want to embarrass me at the table. I
> guess we often aren't the best judges of our own competence, as you noted.
>

Funny story.

In my little world, everyone knows I'm a Wittgenstein authority,
that's never questioned.  That doesn't mean I know his bio as well as
I should.  When is the movie coming out so I can sneak in and back
fill some embarrassing holes in my data bank?

>> How we "get there" (reach some level of
>> understanding) is through all sorts of learning processes (flash
>> cards? books on tape?).
>>
>
> Again you are talking about something different. I am talking about how the
> brain works to produce mind whereas you are talking about the many ways
> human beings achieve and express understanding. These are not the same
> issues. Do you know my friend the clinical psychologist?

You're talking about something different again, missing my point as usual.

We're so good at playing "ships in the night" eh?

I'm talking about how pseudo-scientist wannabe-philosopher BS artists
are wasting a lot of bandwidth with their easy-breezy crappola, but it's
a free country and actually a lot of my Mafia friends would like cushy
jobs as AI specialists so in a way I'm hoping to boost that business
considerably, tie it into casinos somehow, make "thinking machines"
and their design a much bigger business (more in the sense of
amplifying human consciousness though, like the Internet already does
--- cite "As We May Think" by Vannevar Bush, 1945).

At CSN, we're working on new language games aimed at funding more
field work of a valuable constructive nature, less digging a hole and
filling it with body bags.  Think of videogames that pay Greenpeace
out the back, because that's what the player wants on her record on
Facebook (Identity 2.0 = showing evidence of philanthropy, your ticket
to that next dream job (we're getting back to medical ethics again)).

>> Teaching others is a great way to learn as you'll get more feedback
>> and reality checks. Our concept of "understand" is very intertwined
>> with our concept of "able to teach others".
>>
>
> Again, this is not about the concept of understanding in all its
> manifestations. It's about a particular application of the term in a very
> specific domain, that of how brains produce minds. That creatures like us

Funny domain!  Nothing to do with science (yet), all comic book
science fiction -- in my account.

The Singularity stuff is even worse, borders on infantile.

Brains don't produce minds, they host conscious awareness which in
turn tunes in the noospheric shared mind or Self (or doesn't, in the
case of truly mindless individuals).  We had a long ISEPP lecture on
this by some cognizant individuals who study this kind of thing.
Large civic auditorium.  I should dig up their names (had some
pleasant emails after).

We've not had any AI people through that I can remember, as a part of
this lecture series (Jane Goodall, Roger Penrose, Stephen Jay Gould,
Carl Sagan, Susan Haack, Frank Tipler...) as this is the Silicon
Forest (Torvalds in the neighborhood) and we don't tend to take that
stuff too seriously (too busy working on real computers).

>> So I see "understanding" is a social concept. Any approach that
>> focuses on a solo individual and/or something private going on inside
>> said individual (e.g. "in the head" or "in the gut"), is probably not
>> going to be very useful to me.
>>
>
> That's because you're talking about something different than I am talking
> about.
>

Inevitably, given I no speaka your English (brand of).

Doesn't mean we can't yak it up in a coffee shop context though, takes
all kinds.

>> Of course maybe there's a brand of behaviorism that's just fine with
>> people talking how they ordinarily talk, in which case we might say
>> Wittgenstein was a behaviorist of that kind even though he didn't
>> think so (because this brand hadn't been invented yet).
>>
>> That'd be kinda like calling Jesus a Christian -- he'd have had no
>> idea what that meant at first, might've gotten angry if someone
>> explained it to him (like he did with those money-changers).

>
> Well he didn't apply it to what I was saying, you did. So if anyone stands
> in need of forgiveness . . . well, it wouldn't be him!
>

Yeah, I was just picking up on his "Understanding is not a mental
process" thing, seemed highly relevant, as a counter to your preferred
authorities and their way of spinning it.

> I think Rorty may have been insufficiently rigorous with you! What process
> do you think is a "deep logical confusion"? Are you saying it is a logical
> confusion to suppose that there are processes going on in the functioning
> brain and that some of these may be implicated in the occurrence of
> consciousness (as constitued by features like understanding)? If so, how is

"constituted by features like"  "implicated in the occurrence of" ...
sounds bureaucratic at least, if not confused.  Hardly a model of
crystal clarity.  But I'm sure you'll be able to polish it and make it
less crufty.  You're a good writer.  What will the diagrams look like?
 Gotta have some visuals.  Web links?  Youtubes?

> it a logical confusion to say so? What is illogical about supposing that
> brains have processes and that these are causally implicated in the
> occurrence of consciousness in brains?

Somewhat ungrammatical, yes, to say "consciousness in brains".

Is the container metaphor really apropos?  Do we want to say brains
"contain" consciousness such that consciousness is a bounded
phenomenon between the eyes and the ears, like liquid in a cup?

Is that how we've "committed to source" hitherto (version control talk
is apropos), i.e. is our shared public resource, this language of
ours, going to take well to these proposed ways of talking?  I'm
skeptical.

Does a car engine have horsepower inside, or does it deliver
horsepower as a service?  I favor the latter.  Brains provide a
service, are servers in a network (wet circuits vs. dry -- but they're
already totally interconnected e.g. optic nerves and LCDs comprise a
"cell-silicon interface", ergo a way to speed up and synchronize our
civilizational processes, help us keep up with Dr. Z (Zeitgeist)).

I was going with brain-hosted consciousness (as well as
unconsciousness (medical axis)) but am far less happy with
"consciousness in brains" let alone with "understanding" as a "feature
of consciousness" and "a brain process" on top of that.

That looks pretty unworkable, don't think that'll go anywhere without
a lot of help from off screen players.  Maybe someone will rescue it?
It's like screechy violin music, going off the rails vis-a-vis our
best heritage.  Just my judgment.  Could be wrong.

Anyway, for now let's just put this stuff in a holding pattern, shelve
it as science fiction, check back in a while, see if it's matured at
all.

>
>> Trying to make "understanding" mean a "brain process" of any kind is
>> not a smart design is how I'd rate this proposal -- it's a dead end,
>> grammatically speaking, too at odds with work that concept is already
>> committed to doing for a day job.
>>
>
> Again. . . I will try again. Note that I am not saying anything like 'we
> understand by performing some mental process or other'. I am saying that our
> understanding may well best be explained as a function of certain processes
> going on in the brain and that Edelman and Hawkins have separately offered
> interesting ways to describe the said processes.

Interesting to you.  Somewhat silly to me.

No engineer I know is waiting for scientists to "explain"
understanding.  I already understand what understanding means.  It's a
matter of language, not a matter of neuroscience.

"Consciousness" in a medical sense seems bound up with synapses and
what chemicals go across versus get blocked.  You're able to change
the quality of consciousness by messing with these synapses.  We
already know a lot about that.  I have stuff in my blogs about it,
e.g. that psychiatrist (Dr. Joe) gave us a talk on how Xanax works
etc.

"Consciousness" in a mindfulness sense (recalling my cube) also seems
affected by chemicals, psychotropics.

So back to mind-as-ecosystem (Bateson): we need to include the forest
floor, plant life.

Many mind sciences already do that though, as do the anthropologists.

What impresses me about the cog-sci people is how little they seem to
know about drugs of any kind, but maybe I just haven't studied enough
of their literature.

>
>> These mind-body NeoVictorians can try to spin it (that concept) to
>> their own purposes but I'm thinking there's too much inertia already
>> built up and their house of cards therefore has a short half life.
>> Time will tell.
>>
>
> What mind-bodyism do you think you see? If anything, theirs is the reverse
> of any notion of mind-body dualism. But perhaps you can lay it out more
> clearly?

I've run out of time, but plan to get back to this.  Thanks for some
excellent dialog, per usual.

Kirby
