In a famous passage from Ayn Rand’s novel Atlas Shrugged, Francisco tells Dagny, “Contradictions do not exist. Whenever you think that you are facing a contradiction, check your premises. You will find that one of them is wrong.” [1] Or as someone else once said, “My mind is made up; stop confusing me with facts.”

Both of these lines sum up nicely the dangers that our premises, assumptions, stereotypes and prejudices pose to our thinking and, by extension, our decision-making. One such cognitive trap is called Confirmation Bias: our well-documented tendency to favor information that supports our current beliefs, regardless of its accuracy. Research that I conducted shortly after 9/11 illustrates the principle.

Late in 2001, as the country began grappling with the deadly attacks in New York, Washington and Pennsylvania, policymakers began to frame the threat, its source and its underlying causes. Among the assertions made at the time were that most radical terrorists suffered from poverty, a lack of education, and ignorance of Western democracy. The implication: by improving economic conditions, education and awareness of democracy, the U.S. and its allies could undermine the deadly movements fomenting hatred and violence against the West in the Muslim world.

My research captured the backgrounds of over 100 known radicals, philosophers of terror and terrorists dating back to the founding of the Muslim Brotherhood in 1928. This offered a good opportunity to study the movement’s long political history before the 9/11 attacks and the West’s response transformed the situation. To avoid the biases of analysts with common education and training, I employed several experienced researchers who knew how to dig for information but were not experts on this particular subject. Most entries in our database were supported by multiple open sources. We then queried the data for a variety of factors, including education, culture, economics and exposure to the West.

In contrast to common assumptions, the economic data indicated that the vast majority of the subjects in our research came from wealthy or middle-class families (by their countries’ standards); their levels of education and exposure to the U.S. and the Western world are summarized in the charts below. Our research painted a different picture from the one offered by many Western analysts, security and policy experts. And these were not the only contradictions that raised concerns about the policies, security checks and other methodologies being promoted to identify and defeat the threats from radical Islamist organizations.

The response to our research was surprising—when we presented the data to a number of policymakers, analysts and academics, many of them rejected it outright. In retrospect, it seems that many of these well-educated, intelligent people were committed to a shared reality and were not open to discussing research that challenged their assumptions. It took years before a string of arrests and attempted attacks by middle-class, well-educated, committed members of various Islamist terrorist groups began to change those original perceptions.

The stubborn attachment to existing assumptions described above is a good example of Confirmation Bias. It’s a mindset that leads people to attach greater credibility to facts that support their current beliefs, while rejecting or filtering out information that contradicts them. Confirmation Bias has played a role in numerous accidents and failures across industries and government entities, from nuclear power plants to NASA flight operations [2]. It has been researched and discussed at length by scholars such as J. Edward Russo and Paul Schoemaker (Winning Decisions: Getting It Right the First Time, Doubleday, 2001) and Laura J. Kray and Adam D. Galinsky (“The Debiasing Effect of Counterfactual Mind-Sets: Increasing the Search for Disconfirmatory Information in Group Decisions,” Organizational Behavior and Human Decision Processes, 2003), among others.

Fortunately, it’s possible to mitigate the effects of Confirmation Bias on decision-making, and I’ll discuss a few proven techniques in an upcoming post.


Have your decisions ever been impacted by Confirmation Bias? Would you know if they had? Have you seen it happen to others? Please share your experiences in the comments below.


[1] Rand, Ayn. Atlas Shrugged. Part I, Chapter VII, Penguin Books (1957).

[2] Deal, Duane W. “Columbia Investigation & Lessons—Acceptance of Foam Shedding,” Part 2 of a presentation made at the 2006 Aerospace in the News Executive Symposium on Thursday, March 2, 2006 at the Omni Royal Orleans Hotel in New Orleans, Louisiana.

Copyright 2011 by Ozzie Paez.