On Mon, Jun 30, 2008 at 6:01 AM, Robert Paul <rpaul@xxxxxxxx> wrote:
>
> None of this should surprise any child psychologist or, sans jargon,
> reasonably intelligent adult. The evidence for such mental processes isn't
> spelled out here, but I wonder if Klein isn't making some sort of
> transcendental argument (such arguments are more common in science than one
> might suppose) of the form 'This is what, given x....x-sub-n, people do; so
> there must be a string of intervening (mental?) processes that explain
> how/why they do it.' Nothing at all wrong with that, but it strikes me as
> highly theoretical and not especially compelling at this stage.

For the cursory nature of the account you must blame me, not Klein, whose book Sources of Power: How People Make Decisions is fascinating reading. One particularly appealing aspect of his discussion is that it is not at all a purely theoretical account of processes assumed to be hard-wired into the brain. It is, on the contrary, a compelling case for the importance of education, training, and experience in expanding the range and enriching the content of the models evoked during pattern recognition. It is precisely the sparseness and sketchiness of the models available to novices that makes their behavior dangerous.

The practical applications of the theory lie in research and development of training programs for emergency responders, military officers, bond traders, and other folks who are called upon to make decisions with serious consequences for others as well as themselves in stressful, time-pressured circumstances. The research in question is often occasioned by catastrophic mistakes that alert the relevant authorities to the need to modify training to prevent repeated catastrophes. One memorable example was the effort to figure out why a civilian Iranian airliner was taken to be a shore-launched ship-killer missile and, consequently, shot down by a U.S. Navy Aegis cruiser operating in the Persian Gulf (you may remember the case). This involved close attention to the image of the airliner's trajectory as it appeared on the cruiser's radar displays and the ways in which it did, in fact, appear to fit the ship-killer missile profile that the sailors manning the radar were trained to recognize in the blips moving across the screens.

Reflecting on this case, I noted to myself that the frequency of catastrophic mistakes by military organizations may be related to (1) frequent turnover in personnel, which increases the likelihood of novices watching the radar screens, (2) training regimes that assume hostile encounters, and (3) the need to make decisions under time pressure when the presumption of threat is strong and the consequences of wrong decisions can, indeed, be catastrophic, whatever choice is made. In this case, the decision to activate the cruiser's anti-missile defenses resulted in the shooting down of the civilian airliner. If, however, the airliner had, in fact, turned out to be a ship-killer missile attacking the cruiser and the missile-defense system had not been activated, the loss of life could have been equally high. Thus the frequency with which, as military folk are likely to put it, "Shit happens."

John

--
John McCreery
The Word Works, Ltd., Yokohama, JAPAN
Tel. +81-45-314-9324
http://www.wordworks.jp/