On May 23, 2012, National Public Radio reported on some new training for CIA analysts that should make our nation’s national security apparatus more effective and more just. For the past three years the agency has been helping its analysis staff to manage the cognitive biases they, being human after all, bring to their critical work of figuring out what the hell is going on in the world.
The training grows out of the work of Nobel laureate Daniel Kahneman and the late Amos Tversky, who identified the effects of inherent biases on human cognition. Kahneman’s latest book, Thinking, Fast and Slow, is not only very popular but also appears to be reaching a widening audience. The far-flung opinion empire that is Corridor Conversations has blogged on this subject before.
People who make consequential decisions, like docs, nurses, cops, and CIA analysts, owe it to us laypeople to be especially aware of the effects of cognitive bias on their decisions. Every industry where such decisions are made (consequential here meaning a decision with a large impact on life and/or liberty) should have its own Kent School.
Some passages from the NPR report:
“The post-Iraq changes at the CIA also involve new analytic techniques, highlighted in a “tradecraft primer” in use at the agency since 2009. The manual is now used at the Sherman Kent School, the agency’s in-house training institute for new analysts. The manual opens with a section on the “mind-set” challenge.
There definitely was an emphasis in years past to say, ‘It is most likely going to go this way.’ We still have to make those calls, but now we try to explain what factors would take it in a different direction. Could the future be 180 degrees in an opposite direction? – Maria, CIA officer.
“If you’re only looking at [an issue] through one narrow view of the world, you’re not looking at the whole picture,” says John, who teaches at the Kent School. He and other CIA analysts were willing to be quoted only if their last names were not revealed.
“Your biases will get you things like a confirmation bias: ‘I’ve seen it before, so it must be happening again.’ Or an anchoring bias: ‘We’ve come up with that conclusion, and I think it’s true, and it’s not going to change.'”
“One exercise now in use at the CIA is called ‘Analysis of Competing Hypotheses.’ Analysts who may be inclined toward one explanation for some notable development are forced to consider alternative explanations and to tally up all the evidence that is inconsistent with their favored hypothesis.”
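The tallying step described above can be sketched in a few lines of code. This is a toy illustration only, not the agency’s actual procedure: the hypotheses, evidence items, and judgments below are invented, and real ACH practice involves far more nuance (weighting evidence by reliability, refining hypotheses, and so on). The key idea it demonstrates is that you score hypotheses by how much evidence contradicts them, not by how much confirms them.

```python
# Toy sketch of an ACH-style tally (illustrative; all data invented).
# Each piece of evidence is judged "C" (consistent), "I" (inconsistent),
# or "N" (neutral) against each hypothesis.

evidence_matrix = {
    # evidence item:            judgments vs. (H1, H2, H3)
    "troop movements observed": ("C", "C", "I"),
    "economy deteriorating":    ("C", "I", "C"),
    "leadership reshuffle":     ("I", "C", "C"),
    "no mobilization orders":   ("I", "I", "C"),
}

hypotheses = ["H1: invasion", "H2: internal coup", "H3: status quo"]

def inconsistency_scores(matrix, n_hypotheses):
    """Count the 'I' (inconsistent) judgments against each hypothesis."""
    scores = [0] * n_hypotheses
    for judgments in matrix.values():
        for i, judgment in enumerate(judgments):
            if judgment == "I":
                scores[i] += 1
    return scores

scores = inconsistency_scores(evidence_matrix, len(hypotheses))
for hyp, score in zip(hypotheses, scores):
    print(f"{hyp}: {score} inconsistent item(s)")

# The surviving hypothesis is the one with the FEWEST inconsistencies --
# a direct counter to confirmation bias, which counts only the hits.
best = hypotheses[scores.index(min(scores))]
print("Least contradicted:", best)
```

Notice that H1 and H2 each have plenty of consistent evidence; a confirmation-biased analyst could happily defend either. The tally of inconsistencies is what separates them.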