Real Examples of Cognitive Bias in Action: How Systematic Thinking Errors Have Shaped Investments, Wars, Disasters, and Everyday Decisions

In 2015, Elizabeth Holmes appeared on the cover of Forbes magazine, the youngest self-made female billionaire in America. Her company, Theranos, claimed to have developed technology that could run hundreds of medical tests from a single drop of blood. The board of directors included Henry Kissinger, George Shultz, former Secretary of Defense William Perry, and retired General James Mattis, who would later serve as Secretary of Defense. Major investors had poured hundreds of millions of dollars into the company. Walgreens had signed a deal to install Theranos testing stations in thousands of stores.

The technology did not work. It had never worked. Employees who raised concerns internally were threatened with lawsuits or fired. The tests produced unreliable results that put patients at risk. In 2018, Holmes was indicted on multiple counts of wire fraud. In 2022, she was convicted.

The Theranos story is a case study in cognitive bias--the systematic patterns of deviation from rational judgment that affect human decision-making. The investors, board members, journalists, and partners who believed in Theranos were not stupid. Many were among the most accomplished, experienced, and intelligent people in American business and government. They were, however, subject to the same cognitive biases that affect all human thinking--biases that no amount of intelligence, experience, or credentials can eliminate.


Confirmation Bias: Seeing What You Want to See

The Mechanism

Confirmation bias is the tendency to search for, interpret, favor, and recall information that confirms one's existing beliefs while giving less attention to information that contradicts them. It is arguably the most pervasive and consequential cognitive bias, affecting everything from scientific research to political beliefs to investment decisions.

Confirmation bias operates through several mechanisms:

  • Selective attention: Noticing evidence that supports your belief while overlooking evidence that contradicts it
  • Selective interpretation: Interpreting ambiguous evidence as supporting your belief
  • Selective recall: Remembering confirming evidence more readily than disconfirming evidence
  • Selective search: Seeking out information sources that are likely to confirm your belief
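Individually these filters look small, but they compound. The toy simulation below is not drawn from any study; every number in it is an illustrative assumption. It models confirmation bias as Bayesian updating in which disconfirming evidence is simply given less weight, and shows how a belief can harden even when the evidence itself is evenly split.

```python
# Toy model of confirmation bias as asymmetric evidence weighting.
# Every number here is an illustrative assumption, not an empirical estimate.
import random

def update_belief(prior, likelihood_ratio, weight):
    """Bayesian update in odds form, with the evidence discounted by 'weight'."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio ** weight
    return posterior_odds / (1 + posterior_odds)

def simulate(n_pieces=50, disconfirm_weight=0.3, seed=0):
    """An agent starts 70% confident, then sees an even mix of pro and con evidence."""
    rng = random.Random(seed)            # same evidence stream for every agent
    belief = 0.70
    for _ in range(n_pieces):
        supports = rng.random() < 0.5    # the evidence itself is 50/50
        lr = 2.0 if supports else 0.5    # each piece is equally diagnostic
        w = 1.0 if supports else disconfirm_weight   # counter-evidence gets less weight
        belief = update_belief(belief, lr, w)
    return belief

print(f"even-handed agent: final belief = {simulate(disconfirm_weight=1.0):.2f}")
print(f"biased agent:      final belief = {simulate(disconfirm_weight=0.3):.2f}")
```

Under these assumptions the even-handed agent's belief drifts with the evidence, while the biased agent, seeing the same 50/50 stream, ends up nearly certain.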

How Confirmation Bias Affected Theranos Investors

The Theranos investors demonstrated confirmation bias at every stage. They wanted to believe in the revolutionary technology because it aligned with a compelling narrative: a young, brilliant Stanford dropout disrupting a stagnant healthcare industry, the "female Steve Jobs" transforming blood testing the way Apple transformed computing.

This narrative was so attractive that investors interpreted ambiguous signals as confirming it:

  • Holmes's secrecy about the technology was interpreted as prudent protection of trade secrets rather than concealment of non-functional technology
  • The prestigious board members were interpreted as evidence of legitimacy rather than as evidence that the board lacked relevant scientific expertise
  • The company's rapid growth in partnerships and media attention was interpreted as market validation rather than as the result of effective marketing of an unproven product
  • When journalist John Carreyrou began investigating Theranos in 2015, investors and board members dismissed his reporting as uninformed criticism rather than investigating his claims

The critical failure was that investors did not seek disconfirming evidence. They did not insist on independent validation of the technology. They did not consult with experts in blood diagnostics who could have identified the technological claims as implausible. They did not investigate the concerns of former employees. They looked for reasons to believe, found them, and stopped looking.

Other Cases of Confirmation Bias

Intelligence failures. The 2003 invasion of Iraq was justified in part by intelligence assessments that Saddam Hussein possessed weapons of mass destruction. Post-invasion investigations by the Iraq Survey Group found no WMD. The Senate Intelligence Committee's investigation concluded that intelligence analysts had exhibited confirmation bias: they interpreted ambiguous evidence as confirming the pre-existing belief that Iraq had WMD, and they dismissed contradictory evidence as Iraqi deception.

Medical diagnosis. Physicians who form an initial diagnosis early in a patient encounter are susceptible to confirmation bias: they interpret subsequent symptoms and test results as confirming the initial diagnosis while discounting evidence that suggests an alternative. Croskerry's work on cognitive errors in diagnosis puts diagnostic error rates in some clinical settings on the order of 10-15 percent and identifies confirmation bias, alongside related errors such as premature closure, as a recurring contributor.


Sunk Cost Fallacy: Throwing Good Money After Bad

The Mechanism

The sunk cost fallacy is the tendency to continue investing in a project, relationship, or course of action because of the resources already invested, even when continuing is irrational because those resources cannot be recovered.

Rational decision-making evaluates future costs and benefits regardless of past investment. If you have spent $100 on a non-refundable concert ticket but feel sick on the evening of the concert, the rational decision depends on whether attending the concert in your current state would produce more enjoyment than staying home. The $100 is irrelevant to this calculation because it is spent either way. But most people feel compelled to attend precisely because they paid $100--they do not want to "waste" the investment.
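A back-of-the-envelope version of the concert decision makes the point concrete. The enjoyment values below are invented for illustration; the only thing that matters is that the ticket price appears identically in both branches.

```python
# Sunk-cost illustration: the ticket price appears in both branches and cancels.
# The enjoyment values are invented for illustration.
TICKET_PRICE = 100               # already spent, non-refundable

enjoy_attend_sick = 20           # assumed value of going to the concert while ill
enjoy_stay_home = 50             # assumed value of resting at home

# Totals including the sunk cost (the same -100 appears in both branches):
attend_total = enjoy_attend_sick - TICKET_PRICE   # -80
stay_total = enjoy_stay_home - TICKET_PRICE       # -50

# The comparison is identical with or without the sunk cost:
print(attend_total > stay_total)                  # False: staying home wins
print(enjoy_attend_sick > enjoy_stay_home)        # False: same answer
```

Whatever numbers you substitute, the $100 subtracts equally from both options, so it can never change which one comes out ahead.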

The Concorde: Supersonic Sunk Costs

The Concorde supersonic airliner is such a classic example of the sunk cost fallacy that economists sometimes call it the "Concorde fallacy." The Anglo-French project to build a supersonic passenger aircraft began in 1962. By the late 1960s, it was clear that the project would be far more expensive than originally estimated and that the aircraft would have limited commercial viability. The sonic boom it produced was so loud that most countries banned supersonic flight over land, restricting the Concorde to transatlantic routes. Airlines were not ordering the aircraft in the numbers needed to justify the investment.

Despite these signals, the British and French governments continued funding the project for over a decade. The primary justification was the money already spent: canceling the project would "waste" the billions already invested, and the political embarrassment of admitting failure was too costly to contemplate. The Concorde eventually entered service in 1976, operated at a financial loss for most of its existence, and was retired in 2003.

The rational analysis was straightforward at multiple decision points: the future costs of continuing the project exceeded the future benefits, regardless of the past investment. But the sunk costs created a psychological momentum that made rational analysis politically and psychologically impossible.

The Vietnam War

The American escalation in Vietnam has been analyzed as a large-scale sunk cost fallacy. As the war progressed and the costs mounted--in lives, money, and political capital--the investment already made became an argument for continued investment. "We cannot withdraw because that would mean that those who died will have died in vain" is a sunk cost argument: it uses past costs to justify future costs rather than evaluating future costs and benefits independently.

Daniel Ellsberg, who leaked the Pentagon Papers in 1971, described the decision-making dynamic: each president inherited the war from his predecessor, and each faced the sunk cost pressure of having already committed American lives and resources. Withdrawing would mean acknowledging that the investment was wasted, which was politically and psychologically unbearable.


Availability Bias: Vivid Events Distort Risk Perception

The Mechanism

Availability bias (or the availability heuristic) is the tendency to judge the frequency or probability of events based on how easily examples come to mind. Events that are vivid, recent, emotionally charged, or heavily covered by media feel more common than they actually are, while events that are mundane, distant, or abstract feel less common than they are.

Flying vs. Driving After Plane Crashes

After a major plane crash, fear of flying increases measurably, and some people switch from flying to driving for trips they would normally fly. This switch is irrational from a risk perspective: per mile traveled, driving is approximately 100 times more dangerous than flying. But the vivid, emotional, heavily covered plane crash makes the risk of flying feel much larger than the statistical risk of driving, which kills tens of thousands of people annually in accidents that rarely make national news.
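The roughly 100-to-1 claim is easy to sanity-check with back-of-the-envelope arithmetic. The per-mile fatality rates below are illustrative round numbers chosen to match that claim, not official statistics.

```python
# Order-of-magnitude comparison of per-mile fatality risk.
# Both rates are illustrative round numbers, not official statistics.
driving_deaths_per_billion_miles = 7.0    # assumed rough figure for road travel
flying_deaths_per_billion_miles = 0.07    # assumed rough figure for commercial aviation

ratio = driving_deaths_per_billion_miles / flying_deaths_per_billion_miles
print(f"Driving is ~{ratio:.0f}x riskier per mile than flying (illustrative figures)")

# Expected fatalities for a 1,000-mile trip under each mode:
trip_miles = 1_000
for mode, rate in [("driving", driving_deaths_per_billion_miles),
                   ("flying", flying_deaths_per_billion_miles)]:
    print(f"{mode:>8}: {rate * trip_miles / 1e9:.1e} expected deaths per trip")
```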

Research by Gerd Gigerenzer estimated that in the twelve months following the September 11, 2001, terrorist attacks, roughly 1,595 additional Americans died in car accidents above the statistical baseline, likely because they chose to drive rather than fly. The fear of a statistically rare event (terrorism on aircraft) led to behavior that increased exposure to a statistically common risk (automobile fatality).

Shark Attacks and Lightning

Shark attacks receive enormous media coverage. Lightning strikes receive almost none. In the United States, lightning kills approximately 20 people per year while sharks kill approximately 1. But ask most people which they fear more, and they will say sharks. The availability of vivid shark attack imagery (reinforced by media coverage and films like Jaws) makes shark attacks feel more common and more threatening than the statistics warrant.

Availability Bias in Policy

Availability bias affects public policy when legislators and regulators respond to vivid, publicized events rather than statistical analysis. A single dramatic school shooting produces legislative action that a year of gun violence deaths distributed across thousands of less-visible incidents does not. A single dramatic food poisoning outbreak produces regulatory action that the daily toll of food-borne illness does not.

The problem is not that dramatic events do not deserve response--they often do. The problem is that availability bias can produce misallocation of resources: disproportionate resources directed toward vivid, publicized risks while more common but less visible risks remain underfunded and underaddressed.

The table below summarizes the biases discussed in this article.

Bias | Mechanism | Real-World Example | Consequence
Confirmation | Seeking/favoring confirming evidence | Theranos investors ignoring red flags | $700M+ invested in non-functional technology
Sunk cost | Continuing due to past investment | Concorde project continuing despite losses | Billions wasted on a commercially unviable aircraft
Availability | Vivid events distort probability estimates | Driving instead of flying after a crash | 1,595 excess driving deaths post-9/11
Anchoring | First number influences judgment | Initial salary offer shapes negotiation | Systematic under/overpayment from an arbitrary starting point
Overconfidence | Overestimating own knowledge/ability | LTCM assuming its models were complete | Near-collapse of the global financial system
Dunning-Kruger | Incompetence prevents recognizing incompetence | Novice investors trading confidently | Financial losses from uninformed trading
Survivorship | Studying only successes, not failures | Analyzing only successful startups | Misleading lessons about what causes success

Anchoring Bias: The Power of the First Number

The Mechanism

Anchoring bias is the tendency for the first piece of information encountered (the "anchor") to disproportionately influence subsequent judgments, even when the anchor is arbitrary or irrelevant.

In a famous experiment by Tversky and Kahneman, participants watched a wheel of fortune that was rigged to stop at either 10 or 65, then were asked to estimate the percentage of African countries in the United Nations. Participants who saw 10 gave a median estimate of 25 percent. Participants who saw 65 gave a median estimate of 45 percent. The number from the wheel--which had no relationship to the question--shifted estimates by approximately 20 percentage points.
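One common way to describe this pattern is an anchoring-and-adjustment model in which the reported estimate is a weighted blend of the anchor and the respondent's own unanchored guess. The sketch below is a toy version: the weight and the unanchored guess are chosen purely to reproduce the 25/45 split reported above, not fitted to the original data.

```python
# Toy anchoring-and-adjustment model: estimate = w * anchor + (1 - w) * own_guess.
# The weight (w) and the unanchored guess are illustrative choices that happen
# to reproduce the reported medians; they are not estimates from the study.
def anchored_estimate(anchor, own_guess=33.6, w=0.364):
    return w * anchor + (1 - w) * own_guess

for anchor in (10, 65):
    print(f"anchor = {anchor:2d}  ->  estimate ~ {anchored_estimate(anchor):.0f}%")
```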

Anchoring in Salary Negotiation

Anchoring bias is powerfully demonstrated in salary negotiation. The first number mentioned in a salary discussion becomes the anchor around which subsequent negotiation revolves. If an employer offers $70,000, the negotiation will revolve around $70,000--the candidate might counter at $80,000 or $85,000, but is unlikely to counter at $120,000. If the employer had offered $100,000, the same candidate might have countered at $110,000 or $115,000.

Research by Galinsky and Mussweiler found that the party who makes the first offer in a negotiation achieves better outcomes, because their offer establishes the anchor. This finding has practical implications: in salary negotiations, making the first offer (or at least ensuring the first number discussed is favorable) provides a significant advantage.

Anchoring in Real Estate

Real estate pricing demonstrates anchoring bias in high-stakes contexts. Northcraft and Neale's 1987 study asked professional real estate agents to evaluate a property after being given a listing price. The listing price was randomly varied across groups. Despite their expertise, the agents' property valuations were significantly influenced by the listing price--the anchor--even though professional agents should have been able to value the property based on comparable sales and property characteristics independent of the listing price.


Overconfidence Bias: When Certainty Exceeds Accuracy

The Mechanism

Overconfidence bias is the tendency to overestimate one's own knowledge, abilities, or the precision of one's predictions. It manifests in several forms: overestimation of actual performance, overplacement (believing you are better than others), and overprecision (excessive certainty in the accuracy of your beliefs).

How Overconfidence Harmed Long-Term Capital Management

Long-Term Capital Management (LTCM) was a hedge fund founded in 1994 by John Meriwether, a legendary bond trader, with a team that included Myron Scholes and Robert Merton, who would jointly receive the Nobel Prize in Economics in 1997 for their work on options pricing theory.

LTCM's strategy was based on sophisticated mathematical models that identified pricing discrepancies between related financial instruments and profited from their convergence. The fund was enormously successful in its first years, generating returns of over 40 percent annually.

The overconfidence was in the models. LTCM's partners believed their models captured the essential dynamics of financial markets with sufficient accuracy to justify enormous leverage--borrowing far more than their capital base to amplify returns. At its peak, LTCM had approximately $5 billion in equity and $125 billion in assets, a leverage ratio of 25:1.
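The danger of that leverage ratio is plain arithmetic. Using the figures above (roughly $5 billion of equity supporting $125 billion of assets), a small percentage decline in asset values consumes a large fraction of the capital base; the decline scenarios below are illustrative.

```python
# Leverage arithmetic using the figures quoted above (~$5B equity, ~$125B assets).
# The decline scenarios are illustrative, not LTCM's actual loss path.
equity = 5e9
assets = 125e9
print(f"Leverage: {assets / equity:.0f}:1")

for asset_decline in (0.01, 0.02, 0.04):
    loss = asset_decline * assets
    print(f"{asset_decline:.0%} fall in asset values -> "
          f"${loss / 1e9:.2f}B loss = {loss / equity:.0%} of equity")
```

At 25:1, a 4 percent fall in portfolio value is enough to wipe out the entire capital base, which is why the completeness of the models mattered so much.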

In August 1998, Russia defaulted on its government debt, triggering a flight to quality in global financial markets. LTCM's models had not anticipated this scenario because they were based on historical data that did not include such extreme events. The fund lost $4.6 billion in less than four months. The Federal Reserve Bank of New York organized a bailout by a consortium of major banks because LTCM's collapse threatened to destabilize the global financial system.

The LTCM failure illustrates overconfidence bias at the highest levels of expertise: Nobel Prize-winning economists overestimated the completeness and accuracy of their models, leading to risk-taking that nearly produced systemic financial collapse.


The Dunning-Kruger Effect: Not Knowing What You Do Not Know

The Mechanism

The Dunning-Kruger effect, documented by psychologists David Dunning and Justin Kruger in 1999, describes the pattern in which people with limited knowledge or competence in a domain tend to overestimate their ability, while people with high competence tend to slightly underestimate theirs.

The mechanism is that the skills needed to produce correct responses are the same skills needed to recognize what a correct response looks like. A person who lacks expertise in a domain lacks the very expertise needed to recognize their own lack of expertise.

Real-World Manifestations

The Dunning-Kruger effect is visible across domains:

Investing. Novice investors often trade more frequently and with more confidence than experienced investors, producing worse returns. Research by Barber and Odean found that the most active individual traders underperformed the market by an average of 6.5 percentage points annually, partly because overconfidence led them to trade too frequently, incurring transaction costs and making poor timing decisions.
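A 6.5-point annual drag sounds modest but compounds severely. The sketch below uses an assumed 10 percent market return, a 20-year horizon, and a $10,000 starting stake, all of which are illustrative.

```python
# Compounding cost of a 6.5-percentage-point annual performance drag.
# The 10% market return, 20-year horizon, and $10,000 stake are illustrative.
market_return = 0.10
drag = 0.065
years = 20
start = 10_000

market_value = start * (1 + market_return) ** years
trader_value = start * (1 + market_return - drag) ** years

print(f"Buy-and-hold after {years} years:   ${market_value:,.0f}")
print(f"Active trader after {years} years:  ${trader_value:,.0f}")
print(f"Shortfall: {1 - trader_value / market_value:.0%} of the final portfolio")
```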

Self-assessed driving ability. In surveys, approximately 80-90 percent of drivers rate themselves as "above average," far more than could possibly be better than the median driver. This overestimation is most pronounced among the least skilled drivers, who lack the driving expertise needed to recognize the gap between their performance and skilled performance.

Organizational decision-making. Managers who are newest to their roles often make the most confident decisions, because they have not yet developed the expertise needed to recognize the complexity and ambiguity of the problems they face. More experienced managers may appear less decisive because they understand--and appropriately respect--the difficulty of the decisions before them.


Survivorship Bias: The Missing Data Problem

The Mechanism

Survivorship bias is the error of focusing on entities that "survived" a selection process while overlooking those that did not, producing misleading conclusions about what caused success.

The WWII Aircraft Example

The most famous illustration of survivorship bias comes from World War II. The US military examined returning aircraft for damage patterns to determine where to add armor plating. The bombers that returned showed damage concentrated in certain areas--the fuselage, fuel system, and other non-critical areas.

The initial recommendation was to add armor to the areas with the most damage. Statistician Abraham Wald recognized that this analysis suffered from survivorship bias: the aircraft that returned were, by definition, the ones that survived hits to those areas. The aircraft that were hit in other areas--the engines, the cockpit--did not return. Wald recommended armoring the areas with the least damage on returning aircraft, because those were the areas where hits were fatal.
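Wald's reasoning is easy to reproduce in a small simulation. Everything below is invented for illustration: hits are spread evenly across five sections, and each section has an assumed probability that a hit there brings the aircraft down.

```python
# Survivorship-bias simulation in the spirit of the Wald example.
# Hit distribution and lethality figures are invented for illustration.
import random
from collections import Counter

rng = random.Random(1)

SECTIONS = ["fuselage", "wings", "fuel system", "engines", "cockpit"]
LETHALITY = {"fuselage": 0.1, "wings": 0.1, "fuel system": 0.2,
             "engines": 0.8, "cockpit": 0.9}   # assumed chance a hit downs the plane

all_hits = Counter()        # what actually happened (never observed in the field)
returned_hits = Counter()   # what the analysts saw on returning aircraft

for _ in range(10_000):
    section = rng.choice(SECTIONS)          # hits land evenly across sections
    all_hits[section] += 1
    if rng.random() > LETHALITY[section]:   # the plane returns only if it survives
        returned_hits[section] += 1

print(f"{'section':<12}{'all hits':>10}{'hits seen on returners':>26}")
for s in SECTIONS:
    print(f"{s:<12}{all_hits[s]:>10}{returned_hits[s]:>26}")
```

Even though hits are uniform in this simulation, the returning aircraft show very little damage around the engines and cockpit, exactly the areas Wald concluded needed armor.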

Survivorship Bias in Business

Business journalism and entrepreneurial culture are heavily influenced by survivorship bias. Books and articles about successful entrepreneurs study the habits, strategies, and characteristics of people who succeeded, then present these as causes of success. But without studying the people who had the same habits, strategies, and characteristics and failed, it is impossible to know whether these factors actually caused success or merely co-occurred with it.

Steve Jobs dropped out of college, was famously difficult to work with, and trusted his intuition over market research. These traits are often cited as reasons for his success. But there are thousands of college dropouts who were difficult to work with and trusted their intuition who failed completely. Without studying the failures, we cannot determine whether these traits caused success, were irrelevant to it, or actually reduced the probability of success while other factors (timing, talent, specific product decisions) were decisive.


Can Awareness of Bias Reduce Its Effects?

The most common recommendation for combating cognitive bias is awareness: if you know about confirmation bias, you can watch for it in your own thinking. This recommendation is well-intentioned but insufficient. Research consistently shows that knowing about cognitive biases does not reliably prevent them.

Bias is not primarily a knowledge problem. It is a processing problem. Biases arise from the automatic, intuitive cognitive processes that operate below conscious awareness. Knowing that confirmation bias exists does not prevent your brain from automatically favoring confirming information any more than knowing about optical illusions prevents your brain from perceiving them.

More effective approaches to reducing bias effects include:

External checks and processes. Checklists, structured decision processes, and formal evaluation criteria reduce the influence of intuitive biases by requiring systematic consideration of evidence. Atul Gawande's research on surgical checklists demonstrated that simple checklists significantly reduced surgical complications by ensuring that critical steps were not skipped due to overconfidence or inattention.

Diverse perspectives. Groups composed of people with different backgrounds, expertise, and perspectives are less susceptible to groupthink and confirmation bias than homogeneous groups. Diversity introduces disconfirming perspectives that a single individual or a homogeneous group might not generate.

Pre-mortems. Psychologist Gary Klein developed the "pre-mortem" technique: before making a decision, imagine that the decision has been implemented and has failed spectacularly, then generate explanations for why it failed. This technique leverages prospective hindsight to surface potential problems that optimism bias might otherwise conceal.

Accountability. When decision-makers know that they will be required to justify their reasoning to others, they are more likely to engage in careful, systematic analysis rather than relying on intuitive judgment. Accountability does not eliminate bias, but it increases the cognitive effort invested in decision-making.

Decision journals. Recording the reasoning behind important decisions and reviewing those records later allows individuals and organizations to identify patterns of biased thinking that are invisible in the moment but clear in retrospect.
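There is no standard format for such a journal; the sketch below is one possible minimal record, with field names chosen for illustration.

```python
# One possible shape for a decision-journal entry (field names are illustrative).
from dataclasses import dataclass
from datetime import date

@dataclass
class DecisionRecord:
    decision: str            # what was decided
    reasoning: str           # the argument at the time, in your own words
    expected_outcome: str    # a concrete, checkable prediction
    confidence: float        # 0-1, stated before the outcome is known
    review_date: date        # when to compare the prediction with reality
    actual_outcome: str = "" # filled in at review time

entry = DecisionRecord(
    decision="Expand into the enterprise segment next quarter",
    reasoning="Three inbound enterprise leads this month; sales team reports demand",
    expected_outcome="At least two signed enterprise contracts within six months",
    confidence=0.7,
    review_date=date(2026, 6, 30),
)
```

Recording the confidence and the review date up front is what makes hindsight bias harder to indulge when the record is revisited.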

The fundamental insight about cognitive bias is that it is a feature of human cognition, not a bug. Biases exist because they are usually useful: confirmation bias helps us maintain stable beliefs in a complex world, availability bias provides quick risk assessments, anchoring provides starting points for estimation. The problem is that these useful shortcuts produce systematic errors in predictable circumstances. The goal is not to eliminate biases--which is impossible--but to create decision-making environments that minimize their harmful effects while preserving the speed and efficiency that heuristic thinking provides.


References and Further Reading

  1. Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux. https://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow

  2. Tversky, A. & Kahneman, D. (1974). "Judgment under Uncertainty: Heuristics and Biases." Science, 185(4157), 1124-1131. https://doi.org/10.1126/science.185.4157.1124

  3. Dunning, D. & Kruger, J. (1999). "Unskilled and Unaware of It." Journal of Personality and Social Psychology, 77(6), 1121-1134. https://doi.org/10.1037/0022-3514.77.6.1121

  4. Carreyrou, J. (2018). Bad Blood: Secrets and Lies in a Silicon Valley Startup. Knopf. https://en.wikipedia.org/wiki/Bad_Blood_(book)

  5. Lowenstein, R. (2000). When Genius Failed: The Rise and Fall of Long-Term Capital Management. Random House. https://en.wikipedia.org/wiki/When_Genius_Failed

  6. Gigerenzer, G. (2004). "Dread Risk, September 11, and Fatal Traffic Accidents." Psychological Science, 15(4), 286-287. https://doi.org/10.1111/j.0956-7976.2004.00668.x

  7. Galinsky, A.D. & Mussweiler, T. (2001). "First Offers as Anchors." Journal of Personality and Social Psychology, 81(4), 657-669. https://doi.org/10.1037/0022-3514.81.4.657

  8. Barber, B.M. & Odean, T. (2000). "Trading Is Hazardous to Your Wealth." Journal of Finance, 55(2), 773-806. https://doi.org/10.1111/0022-1082.00226

  9. Klein, G. (2007). "Performing a Project Premortem." Harvard Business Review. https://hbr.org/2007/09/performing-a-project-premortem

  10. Croskerry, P. (2003). "The Importance of Cognitive Errors in Diagnosis." Academic Medicine, 78(8), 775-780. https://doi.org/10.1097/00001888-200308000-00003

  11. Northcraft, G.B. & Neale, M.A. (1987). "Experts, Amateurs, and Real Estate." Organizational Behavior and Human Decision Processes, 39(1), 84-97. https://doi.org/10.1016/0749-5978(87)90025-X

  12. Gawande, A. (2009). The Checklist Manifesto: How to Get Things Right. Metropolitan Books. https://en.wikipedia.org/wiki/The_Checklist_Manifesto

  13. Wald, A. (1943). "A Method of Estimating Plane Vulnerability Based on Damage of Survivors." Statistical Research Group, Columbia University. https://en.wikipedia.org/wiki/Abraham_Wald

  14. Nickerson, R.S. (1998). "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises." Review of General Psychology, 2(2), 175-220. https://doi.org/10.1037/1089-2680.2.2.175