In 1953, the CIA began a program it called MKULTRA. Over the next two decades, the agency secretly dosed American and Canadian citizens with LSD and other psychoactive substances without their knowledge or consent, in efforts to develop mind control techniques. It conducted experiments at hospitals, prisons, and universities. It paid doctors to participate without disclosing their government funding. It destroyed most of its records in 1973 when investigations threatened exposure. The program was only confirmed through Freedom of Information Act requests and congressional hearings in 1975 and 1977.
COINTELPRO, the FBI's counterintelligence program, ran from 1956 to 1971. It surveilled, infiltrated, discredited, and destabilized civil rights organizations, antiwar groups, feminist organizations, and socialist political parties. It forged letters designed to create conflict between civil rights leaders. It sent anonymous threatening letters to activists. J. Edgar Hoover authorized sending a letter to Martin Luther King Jr. in 1964 that appeared to urge him to commit suicide. None of this was a conspiracy theory. All of it was a conspiracy.
This history matters for understanding why conspiracy theories (properly speaking, the false or unsupported ones) remain difficult to address. The category of "conspiracy theory" includes both documented historical realities and entirely fabricated fictions, and the people who believe the latter frequently, and not entirely unreasonably, point to the former as justification. The psychological and sociological research on conspiracy belief must be understood against this background.
"The problem is not that people suspect conspiracies. They are often right to do so. The problem is a pattern of thinking that applies conspiratorial explanation indiscriminately, regardless of evidence." — Rob Brotherton, Suspicious Minds (2015)
Key Definitions
Conspiracy theory: An explanation that attributes events or circumstances to a secret, coordinated plot by a group of powerful actors, typically operating against the public interest. The term is value-neutral in that it can apply to theories that are well-evidenced (Watergate was, definitionally, a conspiracy before it was proven) or poorly evidenced, though popular usage typically implies the latter.
Proportionality bias: The cognitive tendency to expect that the size or significance of a cause should match the size or significance of its effect. Large events feel as though they should have large, intentional causes; the idea that a major historical event was caused by coincidence, individual failure, or an unremarkable chain of circumstances feels insufficient.
HADD (Hyperactive Agency Detection Device): Cognitive scientist Justin Barrett's term for the human tendency to perceive intentional agency behind ambiguous or random events. Evolved to detect predators and social threats, HADD generates false positives, perceiving intentional actors where there are none, in many modern contexts.
Epistemic needs: The cluster of psychological motivations related to knowledge, certainty, and explanatory closure. Conspiracy theories often fulfill epistemic needs by providing clear, coherent explanations for confusing or disturbing events.
Epistemic closure: The psychological need for definitive answers rather than comfort with uncertainty; closely related to what Arie Kruglanski termed the need for cognitive closure. Higher need for closure is associated with conspiracy belief in multiple studies.
Prebunking: Inoculating against conspiracy thinking by teaching people, in advance, about the rhetorical techniques conspiracy theories use, so that encountering those techniques triggers recognition rather than acceptance.
The Normal Psychology of Conspiracy Thinking
Rob Brotherton's 2015 book Suspicious Minds: Why We Believe Conspiracy Theories argues from the outset that conspiracy thinking is not a pathology of deficient minds but a consequence of entirely normal cognitive tendencies operating in particular conditions. The same mental machinery that makes humans effective at detecting threats, identifying patterns, and modeling the intentions of others also makes us susceptible to seeing threats, patterns, and intentions that are not there.
Three cognitive tendencies are particularly central.
Pattern perception is essential to human cognition. We are extraordinarily good at detecting regularities in noisy data, a capacity that enabled rapid learning throughout evolutionary history. The cost is that the same pattern-detection machinery produces false positives, finding patterns in genuinely random data. In social and political contexts, pattern perception generates the sense that events which are actually random or uncoordinated reflect a hidden organizing logic.
Agency detection, which Barrett termed the Hyperactive Agency Detection Device, refers to the tendency to perceive intentional agents behind events. In the ancestral environment, incorrectly perceiving agency where there was none (assuming the rustling in the bushes is a predator when it is only wind) was far less costly than incorrectly failing to perceive it when it was real. The result is a system biased toward agency attribution that, in modern contexts, generates the perception of intentional human actors behind events that were caused by impersonal forces, chance, or uncoordinated individual actions.
Proportionality bias is the intuitive sense that large effects must have large causes. The assassination of a president seems too significant to have been caused by a lone, mentally unstable individual acting without organizational support. A global pandemic seems too sweeping to have resulted from a single natural zoonotic transmission. These intuitions are psychologically powerful even when the evidence says otherwise, and conspiracy theories speak directly to them by providing large, organized causes for large events.
Karen Douglas and the Three Needs That Conspiracy Theories Fulfill
Karen Douglas at the University of Kent, with colleagues Robbie Sutton and Aleksandra Cichocka, published a highly influential 2017 review in Current Directions in Psychological Science titled "The Psychology of Conspiracy Theories." They identified three categories of psychological need that conspiracy theories characteristically fulfill.
Epistemic needs encompass the desire for knowledge, certainty, and explanatory closure. Conspiracy theories are explanatorily rich. They provide coherent, detailed accounts of why events occurred, who was responsible, and how the pieces fit together. In situations of genuine uncertainty, this explanatory richness is appealing even when the underlying evidence is weak. Douglas and colleagues found that epistemic anxiety, the discomfort of not knowing, is a significant predictor of conspiracy belief, and that conspiracy theories are particularly attractive in contexts where official explanations are absent, delayed, or perceived as inadequate.
Existential needs involve the desire to feel safe, to have a sense of control over one's environment, and to understand the forces that affect one's life. Paradoxically, conspiracy theories can make people feel more in control even when the conspiracies they posit are terrifying, because knowing who the enemy is provides the psychological comfort of understanding the threat. Studies have found that experimental inductions of powerlessness or lack of control increase conspiracy belief, while restoring a sense of control reduces it.
Social needs include the desire to maintain positive views of one's in-group and to identify clearly with that group against threatening out-groups. Conspiracy theories typically involve a clearly identified enemy who is persecuting, manipulating, or threatening the group with which the believer identifies. This narrative structure fulfills social identity needs powerfully, and research shows that in-group identity threats, feeling that your group is under attack or losing status, reliably increase conspiracy endorsement among group members.
The three-needs framework helps explain why simple factual correction of conspiracy beliefs is so often ineffective. Providing accurate information may address the epistemic component partially, but if the existential and social needs remain unmet, the conspiracy belief, or a different one that serves the same functions, tends to persist.
Jan-Willem van Prooijen: Uncertainty as the Root
Dutch psychologist Jan-Willem van Prooijen at VU Amsterdam has produced some of the most systematic experimental research on the psychological conditions that make conspiracy thinking more or less likely. His 2018 book The Psychology of Conspiracy Theories synthesizes this work.
Van Prooijen's central argument is that uncertainty is the primary situational trigger for conspiracy thinking. When people feel that they do not understand what is happening, that events are out of control, or that familiar institutions are unreliable, their minds search more actively for explanation, and conspiratorial explanations have structural advantages in this search: they are coherent, comprehensive, and free of the gaps and uncertainties that honest accounts of complex events necessarily contain.
His research has shown that societal upheaval (periods of economic crisis, rapid social change, political instability, and natural disasters) consistently increases conspiracy belief at the population level. Following the 2008 financial crisis, conspiracy theories about financial institutions and government complicity spread rapidly across the political spectrum in the US and Europe. Following COVID-19, the unprecedented combination of genuine uncertainty, expert disagreement, rapidly changing guidance, and global scope created ideal conditions for conspiracy theories to fill explanatory gaps.
Van Prooijen distinguishes between warranted and unwarranted suspicion, arguing that the same cognitive tendency produces both. A person who has experienced actual institutional betrayal (a community subjected to a real government surveillance program, a patient genuinely misled about medication) has evidence-based grounds for heightened institutional skepticism. The same skepticism, transferred to contexts where it is not evidence-based, becomes conspiracy thinking. This is one reason conspiracy belief rates are not randomly distributed but cluster in communities with histories of documented maltreatment by authorities.
Who Believes Conspiracy Theories (And Who Does Not)
Popular accounts of conspiracy belief typically emphasize its association with low education, political extremism, or social marginalization. The research presents a more complicated picture.
Joseph Uscinski and Joseph Parent's 2014 book American Conspiracy Theories, based on analysis of thousands of letters to the editor published over a century and several national surveys, found that conspiracy belief does not conform to a simple demographic profile. It appears across education levels, income levels, and political affiliations. About half of Americans believe at least one major conspiracy theory in any given survey, and that proportion has been relatively stable over time.
What does differentiate conspiracy believers from non-believers is not primarily demographic but dispositional and situational. Uscinski and Parent found that analytical thinking (which can be measured separately from general intelligence) is a modest negative predictor of conspiracy belief. Gordon Pennycook and colleagues have found that the tendency toward intuitive rather than reflective thinking is associated with higher conspiracy endorsement, an effect that holds after controlling for education. Epistemic closure, the need for definitive answers, is a consistent predictor.
Critically, education alone does not inoculate. Dan Kahan at Yale has demonstrated repeatedly that higher scientific literacy does not reduce conspiracy thinking about politically charged topics. On issues where conspiracy theories align with cultural identity, more educated partisans are better at constructing sophisticated justifications for unfounded beliefs, not better at evaluating evidence impartially.
The Slippage Between Healthy Skepticism and Conspiracism
One of the most difficult features of conspiracy thinking to address is the genuine continuity between healthy institutional skepticism and unfounded conspiracism. The history documented at the beginning of this article is not exceptional. The same record includes the Tuskegee syphilis study (1932-1972), in which Black American men with syphilis were deliberately left untreated while researchers observed disease progression; the tobacco industry's decades-long campaign to manufacture doubt about the link between smoking and cancer; and the National Football League's systematic suppression of concussion research.
These are not conspiracy theories. They are documented historical facts. And they provide a reasonable evidential basis for wariness toward institutional claims, corporate health science, and government assurances. The psychological challenge is calibrating that wariness appropriately, applying it where evidence supports it and not applying it reflexively where evidence does not.
Brotherton argues that the key distinction is not between trusting and distrusting institutions but between evidence-responsive skepticism and epistemic closure. Evidence-responsive skepticism updates when evidence warrants; it can be convinced by new information. Conspiracy thinking typically exhibits immunity to evidence: any evidence against the conspiracy is interpreted as evidence of how deep the conspiracy goes.
This immunizing logic, which philosopher Karl Popper identified as a marker of pseudoscience, is one of the most reliable distinguishing features of genuine conspiracy thinking as opposed to ordinary institutional suspicion. When a theory cannot, even in principle, be disconfirmed by any possible evidence, it has left the domain of reasoning and entered the domain of unfalsifiable belief.
Social Media Amplification
Online platforms have transformed the ecology of conspiracy theory spread in ways that are structurally significant. Several mechanisms operate simultaneously.
Recommendation algorithms optimize for engagement, and conspiratorial content tends to generate high engagement through emotional arousal, identity signaling, and the psychological satisfaction of "secret knowledge." Researchers at MIT and elsewhere have documented that recommendation systems on YouTube, Facebook, and TikTok create pathways from mainstream to increasingly extreme content, a phenomenon that has been contested at the margins but supported directionally by multiple independent studies.
Echo chambers reduce exposure to disconfirming information not through complete information siloing (most people do have some cross-partisan exposure) but through the asymmetric emotional salience of in-group versus out-group content. Within communities organized around shared conspiracy beliefs, the social costs of expressing doubt are high and the rewards for ever more elaborate theorizing are substantial.
Context collapse, the tendency of information originally produced for a specific audience to spread to audiences with very different interpretive frameworks, routinely strips context from claims in ways that make conspiratorial interpretations more plausible than they would be with full context intact.
Research by David Rand and colleagues at MIT found that the primary driver of sharing conspiratorial content on social media is not belief in its literal truth but sharing to signal group identity, demonstrate awareness, or achieve social recognition within communities where conspiracy content is valued. This means that much conspiracy content sharing is performative rather than epistemic, a finding with significant implications for how interventions are designed.
Real Conspiracies and Why They Matter for the Psychology
Any serious account of conspiracy theory psychology must grapple with the fact that the world contains real conspiracies. MKULTRA, COINTELPRO, Watergate, the tobacco industry's science manipulation, the fossil fuel industry's climate denial campaign, FIFA corruption, the Catholic Church's systematic cover-up of clerical abuse: these are documented conspiracies, confirmed by evidence, not by reflexive suspicion.
Michael Barkun at Syracuse University, whose work on conspiracy theory culture is among the most careful in the field, notes that the category of "conspiracy theory" emerged in its current derogatory usage partly through deliberate public relations strategy, specifically to discredit critics of the Warren Commission's findings about the Kennedy assassination. This does not validate Kennedy assassination conspiracy theories, but it does caution against using "conspiracy theory" as a category that automatically discredits rather than one that warrants evaluation.
The epistemically appropriate response to claims of conspiracy is neither automatic acceptance nor automatic rejection but calibrated evaluation based on evidence: Who would need to be involved? Is silence of that many people plausible over that time span? What is the evidence basis? What would the world look like if this were true, and does it match what we observe?
Historians have identified features of real conspiracies that distinguish them from fabricated ones: they typically involve smaller numbers of people than the claimed cover-up would require, they typically involve normal human organizational behavior rather than superhuman coordination, and they typically leave evidentiary traces that investigators eventually find.
Prebunking and Intervention Research
The most encouraging research on reducing conspiracy belief comes from the inoculation and prebunking literature, described in more detail in the companion article on misinformation.
Sander van der Linden at Cambridge, John Cook at George Mason, and colleagues have consistently found that teaching people the techniques of conspiracy thinking before they encounter specific conspiracy claims reduces susceptibility to those claims. The five techniques most commonly used in conspiracy narratives are: claiming that those who disagree are part of the conspiracy (anyone who challenges the theory is a shill), using fake experts or cherry-picked legitimate experts, misrepresenting scientific consensus, exploiting emotional appeals and fear, and using illogical reasoning that feels coherent.
When people learn to recognize these techniques as a category rather than fact-checking specific claims, they gain protection against new instances that they have never encountered before. This is the key advantage of prebunking over debunking.
Motivational interviewing techniques, originally developed for substance use counseling by William Miller and Stephen Rollnick, have shown promise when adapted to conspiracy belief. The approach avoids direct confrontation with the belief, which triggers identity defense, and instead asks genuine questions that help the person identify their own uncertainties and inconsistencies. This requires patience and genuine rather than strategic interest in the other person, which is both its strength and its limitation in mass application.
Practical Takeaways
Distinguish skepticism from immunization. Healthy institutional skepticism is evidence-responsive and specifiable: it can say what evidence would change its assessment. Conspiracy thinking is often immunized: any disconfirming evidence is incorporated as evidence of deeper conspiracy. Identify which mode you are in when evaluating institutional claims.
Apply proportionality bias awareness. When your intuition says an event is too significant to have been caused by uncoordinated or accidental factors, name that intuition. Then ask what the evidence actually shows about the cause, separately from what would feel satisfying.
Check the silence requirement. Most elaborate conspiracy theories require large numbers of people to maintain silence indefinitely. Ask how many people would need to know and not tell. Real conspiracies that have been exposed typically involved far smaller circles of knowledge than elaborate unconfirmed theories would require.
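The silence check can be made roughly quantitative. The sketch below is illustrative only (loosely in the spirit of David Grimes's 2016 viability model, but with an assumed leak rate rather than empirically derived parameters): if each participant has a small, independent chance of leaking in any given year, the probability of total secrecy decays exponentially with both headcount and duration.

```python
def p_secret(n_people: int, years: float, annual_leak_prob: float = 1e-3) -> float:
    """Probability that a plot stays fully secret, assuming each participant
    has an independent, constant chance of leaking (deliberately or
    accidentally) in any given year.

    The default leak rate is an assumption chosen for illustration,
    not an empirical estimate.
    """
    return (1.0 - annual_leak_prob) ** (n_people * years)

# A small circle is plausibly durable; a cast of thousands is not.
print(f"30 people, 10 years:     {p_secret(30, 10):.2f}")
print(f"10,000 people, 10 years: {p_secret(10_000, 10):.2e}")
```

Even with a very forgiving one-in-a-thousand annual leak rate, a thirty-person plot survives a decade with reasonable odds, while a ten-thousand-person plot is essentially certain to leak. The point is not the specific numbers but the shape of the curve: secrecy requirements grow multiplicatively with the size of the claimed conspiracy.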
Learn manipulation techniques rather than cataloguing claims. The five conspiracy rhetoric techniques above apply across thousands of specific claims. Recognizing them provides general protection; cataloguing specific false claims does not.
Engage with the need, not just the belief. If someone believes a conspiracy theory, asking what uncertainty, threat, or institutional grievance gave it appeal is more productive than arguing about the claim's specifics. The psychological need will find another host if you remove the current one without addressing what it is doing.
Maintain the relationship. People update beliefs within trusting relationships, not under adversarial pressure. The goal of engaging with someone's conspiracy beliefs should be maintaining a relationship of enough trust and openness that updating is possible over time, not winning a debate in a single conversation.
References
- Brotherton, R. (2015). Suspicious Minds: Why We Believe Conspiracy Theories. Bloomsbury Sigma.
- Douglas, K. M., Sutton, R. M., & Cichocka, A. (2017). The psychology of conspiracy theories. Current Directions in Psychological Science, 26(6), 538–542.
- Van Prooijen, J.-W. (2018). The Psychology of Conspiracy Theories. Routledge.
- Uscinski, J. E., & Parent, J. M. (2014). American Conspiracy Theories. Oxford University Press.
- Barkun, M. (2013). A Culture of Conspiracy: Apocalyptic Visions in Contemporary America (2nd ed.). University of California Press.
- Kahan, D. M. (2013). Ideology, motivated reasoning, and cognitive reflection. Judgment and Decision Making, 8(4), 407–424.
- Sunstein, C. R., & Vermeule, A. (2009). Conspiracy theories: Causes and cures. Journal of Political Philosophy, 17(2), 202–227.
- Van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Global Challenges, 1(2), 1600008.
- Barrett, J. L. (2000). Exploring the natural foundations of religion. Trends in Cognitive Sciences, 4(1), 29–34.
- Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39–50.
- Swami, V., et al. (2011). Conspiracist ideation in Britain and Austria. British Journal of Psychology, 102(3), 443–463.
- Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the post-truth era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369.
Related reading: how to spot misinformation, how to think more critically, why disinformation spreads
Frequently Asked Questions
Are people who believe conspiracy theories less intelligent?
The research does not support the stereotype. While lower analytical thinking correlates modestly with conspiracy belief in some studies, the relationship is far smaller than popular culture suggests, and conspiracy belief is widespread across education and income levels. Joseph Uscinski and Joseph Parent's analysis of American survey data found no simple demographic profile for conspiracy believers. As Rob Brotherton notes in Suspicious Minds (2015), the cognitive tendencies that underlie conspiracy thinking (agency detection, pattern recognition, and proportionality bias) are normal features of human cognition operating in certain conditions, not markers of intellectual deficiency. High intelligence can even make conspiracy beliefs more sophisticated and internally coherent.
What psychological needs do conspiracy theories fulfill?
Karen Douglas, Robbie Sutton, and Aleksandra Cichocka's 2017 review in Current Directions in Psychological Science identified three categories of need that conspiracy theories serve. Epistemic needs: the need for knowledge, certainty, and explanatory closure, particularly in situations of uncertainty or confusion. Existential needs: the need to feel safe and to have some sense of control over one's environment, which conspiracy theories provide by identifying clear agents responsible for bad events. Social needs: the need to maintain positive views of one's own group and to distinguish the in-group from threatening out-groups. Conspiracy theories typically fulfill all three simultaneously, which makes them psychologically compelling even when the evidence for them is absent.
Why do smart people believe conspiracy theories?
Because the psychological mechanisms that underlie conspiracy thinking are not deficiencies but features of normal cognition operating in particular conditions. Agency detection, the tendency to perceive intentional agents behind ambiguous events, is adaptive in most contexts and produces false positives in others. Proportionality bias, the intuition that large events should have large causes, is a reasonable heuristic that misfires when the actual cause is small and accidental. Motivated reasoning means that people are better at constructing coherent defenses of conclusions they are attracted to, and high intelligence can make these defenses more sophisticated. Gordon Pennycook and colleagues have found that reliance on intuitive rather than reflective thinking predicts what they term bullshit receptivity, the tendency to rate vacuous claims as profound, an effect that education alone does not eliminate.
What makes someone more likely to believe conspiracy theories?
Jan-Willem van Prooijen's research identifies uncertainty and perceived threat as the primary situational triggers: people are more drawn to conspiracy explanations when they feel that events are out of control, that authorities are untrustworthy, or that their group is under threat. Disposition toward intuitive rather than analytical thinking correlates modestly with conspiracy belief. A history of actual exposure to real conspiracies or institutional betrayal increases warranted suspicion that can generalize to unwarranted cases. Social environments that normalize conspiracy explanations increase individual susceptibility. And epistemic closure, the need for definite answers rather than comfort with uncertainty, predicts conspiracy belief independently of education.
How do conspiracy theories spread online?
Social media platforms amplify conspiracy theories through multiple mechanisms. Recommendation algorithms optimize for engagement, and conspiratorial content tends to generate high engagement through emotional arousal and in-group identity signaling. Echo chambers reduce exposure to disconfirming information. The low friction of sharing means that emotionally compelling content spreads before fact-checking occurs. Cass Sunstein and Adrian Vermeule's analysis found that conspiracy theories spread particularly well when they fulfill both epistemic needs (explaining something confusing) and social needs (identifying a common enemy). Research by David Rand and colleagues found that the main driver of sharing conspiratorial content is not belief in its truth but sharing it to signal group identity.
What interventions actually reduce conspiracy thinking?
Prebunking, also called inoculation, is the most robustly supported intervention: teaching people about the rhetorical techniques used in conspiracy theories before they encounter them significantly reduces susceptibility. Epistemic humility interventions that help people become more comfortable with uncertainty reduce the appeal of conspiracy theories that offer false certainty. Motivational interviewing techniques, which involve asking questions that lead the person to identify their own inconsistencies rather than directly contradicting their beliefs, are more effective than direct correction. Sander van der Linden and colleagues' research found that highlighting the scientific consensus on contested claims, combined with prebunking against the techniques used to manufacture doubt, produced lasting reductions in conspiracy belief.
How do you talk someone out of a conspiracy theory?
Direct contradiction is generally counterproductive, particularly for beliefs connected to identity or group membership. The evidence from persuasion research and clinical psychology suggests several more effective approaches. Ask questions rather than making assertions: what evidence would change your mind? How did you first come to believe this? What would the world look like if this were not true? Acknowledge legitimate grievances that may have made the conspiracy theory appealing, because many conspiracy theories emerge from real historical experiences of institutional dishonesty. Provide a satisfying alternative explanation that fills the same explanatory and psychological function as the conspiracy theory. And maintain the relationship: people update beliefs within relationships of trust, not under adversarial pressure.