On the morning of November 22, 1963, President John F. Kennedy was shot in Dallas. Within hours, before any official investigation had concluded, alternative explanations were circulating. Within days, they had proliferated into a self-sustaining ecosystem of competing narratives that continues to generate new variants six decades later. The Warren Commission's 1964 conclusion that a single gunman acted alone has never managed to displace the conviction, held by a majority of Americans in most polling conducted since, that a larger conspiracy was responsible. This is striking not because the conspiracy believers are necessarily wrong, but because of what it reveals about how the human mind approaches unexplained events involving powerful actors.
Conspiracy theories are not a marginal pathology. They are a predictable output of cognitive systems that evolved to detect intentional agency, find meaningful patterns, and protect against social betrayal. When those systems operate in high-uncertainty environments with incomplete information and historically justified distrust of authority, conspiracy theorising is not a failure of reason. It is an overdetermined output of normal reasoning processes applied in conditions they were not designed to handle.
What researchers have learned over the past two decades is that the question is not really 'why do people believe conspiracy theories', since that question has a relatively clear set of answers. The harder and more important question is what it would take to design information environments and educational frameworks that allow people to maintain appropriate scepticism toward authority without becoming susceptible to the substitution of evidence-free narratives for genuinely uncertain truths.
"Conspiracy theories are not simply false beliefs held by ignorant people. They are the product of motivated reasoning, social identity, and genuine uncertainty, and they cannot be countered by facts alone." -- Jan-Willem van Prooijen, 'The Psychology of Conspiracy Theories', 2018
Key Definitions
Proportionality bias: The intuitive assumption that events of large significance must have causes of comparable magnitude, leading to resistance against simple or contingent explanations for major events.
Epistemic anxiety: Psychological distress arising from uncertainty and the inability to explain important events, which conspiracy theories can temporarily relieve by providing clear, if false, causal frameworks.
Pattern recognition overdrive: The tendency to perceive meaningful, intentional patterns in random or coincidental data, linked to what cognitive scientists call 'apophenia', the experience of seeing meaningful connections between unrelated things.
Gateway belief: A moderately unusual belief that, once accepted, increases susceptibility to progressively more extreme conspiratorial claims by normalising the conspiratorial reasoning framework.
Inoculation theory: The psychological approach of pre-emptively exposing people to weakened misinformation and the rhetorical techniques used to spread it, to build resistance before full-strength exposure occurs.
The Evolutionary Roots of Conspiratorial Thinking
The cognitive architecture underlying conspiracy belief is not aberrant. It is the product of evolutionary pressures that made pattern detection and agency attribution survival-critical. In ancestral environments, the cost of failing to detect a predator or a hostile social actor was death. The cost of falsely detecting one where none existed was, at most, wasted energy and minor social embarrassment. This asymmetry strongly favoured the evolution of what cognitive scientist Michael Shermer calls 'patternicity', the tendency to find meaningful patterns even in random data.
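The error-management logic behind patternicity can be made concrete with a toy calculation. The sketch below is invented for illustration (the numbers and the linear detection model are assumptions, not from the source): when a missed threat costs vastly more than a false alarm, the detector with the lowest expected cost is the jumpy one that fires on weak evidence.

```python
# Illustrative sketch of error-management asymmetry (all numbers invented):
# compare detectors that signal 'agent present' at different evidence thresholds.

def expected_cost(threshold, p_threat, cost_miss, cost_false_alarm):
    """Expected cost per event for a detector that signals 'agent' whenever
    perceived evidence exceeds `threshold` (0..1). Assumes, for simplicity,
    that the firing rate falls linearly as the threshold rises."""
    p_fire = max(0.0, 1.0 - threshold)             # lower bar -> fires more often
    p_miss = p_threat * (1.0 - p_fire)             # real threat, no alarm
    p_false_alarm = p_fire * (1.0 - p_threat)      # no threat, alarm anyway
    return p_miss * cost_miss + p_false_alarm * cost_false_alarm

# Ancestral-style payoffs: a miss is catastrophic, a false alarm is cheap.
costs = {t: expected_cost(t, p_threat=0.05, cost_miss=1000, cost_false_alarm=1)
         for t in (0.1, 0.5, 0.9)}
best = min(costs, key=costs.get)   # the low-threshold, 'paranoid' detector wins
```

Under these assumed payoffs the threshold of 0.1 has the lowest expected cost, despite generating far more false alarms: exactly the calibration that produces pattern detection in random noise.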
Alongside patternicity operates what Shermer calls 'agenticity', the tendency to attribute intentional agency to events. Unexplained noises in the darkness, sudden illness, unexpected crop failure: in environments where human and animal agents frequently were responsible for adverse events, the habit of immediately seeking an agent behind every effect was adaptive. These tendencies are not cognitive flaws to be corrected. They are the default settings of a system that worked well for most of human history.
The problem is that these settings are poorly calibrated for a world in which genuinely random processes, complex emergent phenomena, and the uncoordinated actions of millions of independent actors produce large-scale consequences that superficially resemble the fingerprints of intentional planning. Global financial crises, pandemics, and climate change are not produced by conspiracies, but they have many of the surface features that agency-detecting systems associate with intentional coordination.
Jan-Willem van Prooijen and the Conspiracy Mentality
Dutch psychologist Jan-Willem van Prooijen has conducted some of the most rigorous empirical research on the psychology of conspiracy belief. His work, summarised in his 2018 book 'The Psychology of Conspiracy Theories', identifies several consistent predictors of conspiracy ideation that cut across specific belief content.
Van Prooijen has found that conspiracy ideation is not primarily a function of specific beliefs but of a generalised 'conspiracy mentality', a stable tendency to explain events through the lens of hidden malevolent actors regardless of evidence quality. This mentality predicts belief in mutually contradictory conspiracy theories simultaneously, which is one of the most diagnostic features of conspiracy ideation as opposed to genuine sceptical inquiry. In one frequently cited study, participants who endorsed one conspiracy theory were significantly more likely to endorse a second theory that was logically incompatible with the first.
His research identifies two primary psychological drivers. The first is epistemic: conspiracy beliefs respond to the need for cognitive closure, the uncomfortable feeling of not knowing. Events that are ambiguous, complex, or genuinely uncertain create pressure toward any explanation that resolves the ambiguity, even a disturbing one. The second driver is existential: conspiracy beliefs respond to feelings of powerlessness and threat by identifying a responsible agent. If bad things are happening because of a conspiracy, the world is at least explicable, and potentially the conspiracy could be stopped.
Van Prooijen and colleagues have found in experimental studies that inducing feelings of anxiety, unpredictability, or low personal control reliably increases conspiracy ideation, even when the conspiracy theory being measured is unrelated to the induced stress. This suggests that conspiracy belief functions partly as a coping mechanism for existential distress.
Proportionality Bias: When Scale Misleads
One of the most well-documented contributors to conspiracy belief is what psychologist Rob Brotherton, author of 'Suspicious Minds' (2015), calls proportionality bias. The Kennedy assassination is its canonical example. At a cognitive level, an event of the magnitude of a presidential assassination demands an explanation of corresponding scale. A single, politically marginal gunman is not, for most people, a satisfying explanation. The disproportion feels wrong.
This bias is compounded by a broader failure to appreciate how often small or contingent causes produce large consequences. The assassination of Archduke Franz Ferdinand, which triggered World War I, succeeded only by chance: an earlier attempt on the Archduke's motorcade had failed that same morning, and Gavrilo Princip encountered the car a second time only because the driver took a wrong turn. The stock market crash of 1929 was produced not by a master plan but by the cascading interactions of thousands of individual panics. The human cognitive system is poorly equipped to accept these kinds of causal structures.
Research by Brotherton and Christopher French (2014) found that individuals who scored higher on a generalised conspiracy belief scale were also more susceptible to the conjunction fallacy, overestimating the probability that multiple events co-occur, suggesting that conspiracy ideation is not merely a situational response to specific events but part of a broader cognitive style.
Social Identity and the Group Function of Conspiracy Beliefs
Beyond individual cognition, conspiracy theories perform important social functions. Research by Karen Douglas, Robbie Sutton, and Aleksandra Cichocka has found that conspiracy belief is strongly linked to social identity, specifically to the conviction that one's in-group is under threat from a powerful out-group. Conspiracy theories provide a coherent narrative for that threat and, critically, a flattering account of the in-group as perceptive truth-seekers surrounded by the deceived masses.
This social function explains a pattern that purely epistemic accounts struggle to address: the fact that conspiracy beliefs are often highly resistant to evidence, including evidence that conspiracy believers themselves demanded. If conspiracy belief were purely a function of seeking information under uncertainty, it should diminish as information becomes available. In practice, new evidence against a conspiracy theory is frequently incorporated into the theory as further evidence of the conspiracy's reach.
Sociologist Kathleen Blee, who has conducted long-term interviews with members of extremist movements, has documented how conspiracy beliefs function as social glue within communities, creating shared enemies, shared knowledge, and shared identity. Leaving the conspiracy framework means, in practice, leaving the community, a cost that most people's epistemic commitments cannot match.
Gateway Beliefs and the Escalation Dynamic
Researchers have identified what they call 'gateway beliefs', moderately unusual claims that serve as entry points into a more encompassing conspiratorial worldview. A 2013 study by Stephan Lewandowsky and colleagues found that belief in one conspiracy theory is a strong predictor of belief in others, including theories that contradict each other, suggesting that the conspiracy framework itself, rather than any specific claim, is what spreads.
Gateway beliefs tend to share certain features. They invoke verifiable instances of real institutional wrongdoing, they are difficult to definitively refute without specialised knowledge, and they carry low social cost, since they are widely enough held that believing them does not mark someone as extreme. Once someone has accepted a gateway belief, the reasoning structures used to arrive at it (pattern detection, institutional distrust, resistance to official explanations) apply equally to more extreme claims.
This escalation dynamic is particularly relevant to understanding online radicalisation pathways. Research by Moonshot CVE and others has documented how recommendation algorithms on video platforms routinely direct users from moderately sceptical political content toward progressively more extreme conspiratorial material, because the engagement signals generated by emotionally provocative content are interpreted as interest signals by recommendation systems.
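The drift dynamic described above can be sketched with a toy model. This is an invented illustration, not any real platform's algorithm: the catalogue, engagement numbers, and the greedy nearest-step policy are all assumptions. The point is structural: if engagement correlates with provocation and the recommender greedily maximises engagement among adjacent content, repeated recommendations walk step by step toward the most extreme item.

```python
# Toy sketch (invented for illustration, not a real recommender): greedy
# engagement-maximising recommendations drift toward provocative content
# when provocation and engagement are correlated.

from dataclasses import dataclass

@dataclass
class Item:
    title: str
    extremeness: float   # 0 = mainstream, 1 = fringe (assumed scale)
    engagement: float    # clicks/watch-time proxy, assumed to rise with provocation

catalogue = [
    Item("mainstream news recap", 0.10, 0.30),
    Item("sceptical commentary", 0.40, 0.55),
    Item("'what they aren't telling you'", 0.70, 0.80),
    Item("full conspiracy narrative", 0.95, 0.92),
]

def recommend_next(current: Item, items: list[Item]) -> Item:
    """Greedy policy: among items 'near' the current one, pick the
    highest-engagement candidate. Nearness keeps each step incremental."""
    candidates = [i for i in items if i is not current
                  and abs(i.extremeness - current.extremeness) <= 0.35]
    return max(candidates, key=lambda i: i.engagement)

# Follow the recommendations for three steps, starting from mainstream content.
path = [catalogue[0]]
for _ in range(3):
    path.append(recommend_next(path[-1], catalogue))
```

Each individual step looks like a small, reasonable suggestion, yet the path ends at the fringe: no component of the system needs to intend radicalisation for the trajectory to occur.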
Inoculation Theory: Building Resistance
The most promising intervention research involves what psychologists call 'prebunking' or inoculation. Drawing on William McGuire's 1960s work on resistance to persuasion, Sander van der Linden at Cambridge University and colleagues have developed and tested inoculation-based approaches to conspiracy misinformation.
The approach works by exposing people, before they encounter specific misinformation, to the general rhetorical and logical techniques used to spread it. Rather than attempting to correct specific false beliefs, inoculation builds meta-level awareness of manipulation techniques. A 2017 study by van der Linden, Lewandowsky, and colleagues found that a brief 'inoculation' message describing the techniques used to misrepresent scientific consensus significantly reduced the persuasive impact of a subsequent misleading message about climate science.
The Bad News game, developed by van der Linden's group, puts players in the role of a fake news producer, requiring them to use the techniques of conspiracy-style content creation. Studies of this game found measurable increases in participants' ability to identify misinformation techniques in novel contexts.
The critical limitation of inoculation is timing. Once beliefs are formed, attempting to counter them directly tends to trigger psychological reactance, a motivated increase in belief intensity. Inoculation works before or during belief formation, not after. This is why researchers in this field have increasingly focused on educational interventions for young people rather than on persuading adults who already hold entrenched beliefs.
The Role of Genuine Institutional Failures
One factor that is often underweighted in discussions of conspiracy theories is the role of genuine institutional failures in creating fertile ground for them. Governments have, historically, conducted genuine conspiracies. The Tuskegee study ran for forty years. The CIA's MKULTRA program was real. The tobacco industry's decades-long campaign to obscure the evidence linking smoking to cancer was a sophisticated, coordinated deception. These are not conspiracy theories. They are documented historical facts.
Psychologist Karen Douglas has argued that dismissing conspiracy theories wholesale as irrational, without acknowledging the genuine failures of transparency that create justified scepticism, is itself epistemically dishonest and counterproductive. It conflates the valid insight that authorities sometimes lie with the invalid inference that authorities always lie.
Building genuine institutional transparency, including rigorous whistleblower protections, independent investigative journalism, and accessible scientific communication, is not merely a governance issue but a public health measure. The information ecosystem in which conspiracy theories thrive is partly a consequence of institutions that have, repeatedly and sometimes deliberately, failed to deserve the trust they demand.
Extremism and the Conspiracy Pathway
Research has increasingly documented the relationship between conspiracy belief and political extremism. While conspiracy beliefs do not automatically lead to radicalisation, they appear to lower the psychological barriers to accepting extreme positions and extreme actions by establishing a framework in which conventional political and social norms are already understood as corrupt tools of a malevolent elite.
John Horgan at the University of Massachusetts Lowell, one of the leading researchers on terrorist radicalisation, has found that conspiracy beliefs are a consistent feature of radicalisation pathways across ideologically diverse extremist groups. The mechanism is not that conspiracy beliefs cause radicalisation but that they perform a specific preparatory function: they establish that the stakes are existentially high, that normal political processes are compromised or useless, that the in-group is engaged in a fight for survival against a deceptive, powerful enemy, and that special knowledge and decisive action are required. These elements are consistent prerequisites for the psychological move from discontent to violence.
The online radicalisation research of J.M. Berger, documented in his 2018 book 'Extremism', found that conspiracy content functioned as a reliable recruiting and retention mechanism for extremist communities precisely because of the social identity dynamics it activates: the shared knowledge creates in-group solidarity, the shared enemy creates focus and urgency, and the belief that mainstream culture is systematically deceiving ordinary people creates a justification for seeking alternative communities.
Research on deradicalisation consistently emphasises that simply countering false claims is insufficient once this level of psychological investment has been made. The social and identity needs that extremist communities meet must be addressed through alternative communities and sources of meaning. This parallels the cult research finding that confrontational debunking is less effective than maintaining relationships and addressing underlying needs.
Practical Takeaways
Understanding the psychology of conspiracy belief does not require abandoning scepticism. It requires calibrating scepticism to evidence. Some questions to apply to any claim: Does the evidence come from multiple independent sources? Does the explanation account for the known evidence more simply than alternatives? Does the theory require an implausibly large, leak-proof network of actors? Are the people promoting the theory financially or socially incentivised to promote it?
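The four screening questions above can be expressed as a simple checklist. This is a hedged sketch only: the verdict thresholds and wording are invented, and real claim evaluation is a judgement call, not arithmetic.

```python
# Hedged sketch: the article's four screening questions as a checklist.
# Thresholds and verdict wording are invented for illustration.

CHECKS = [
    "evidence comes from multiple independent sources",
    "explanation accounts for known evidence more simply than alternatives",
    "does NOT require an implausibly large, leak-proof network of actors",
    "promoters are NOT financially or socially incentivised to promote it",
]

def screen_claim(answers: dict) -> str:
    """`answers` maps each check to True (passes) or False (fails)."""
    passed = sum(bool(answers[c]) for c in CHECKS)
    if passed == len(CHECKS):
        return "plausible: proportion belief to the strength of the evidence"
    if passed >= 2:
        return "uncertain: suspend judgement and look for better sources"
    return "weak: treat as unsupported until independent evidence appears"

# Example: a claim resting on a single motivated source and a vast silent network.
example = screen_claim({CHECKS[0]: False, CHECKS[1]: False,
                        CHECKS[2]: False, CHECKS[3]: True})
```

The value of writing the questions down, even informally, is that they are applied as a reflex rather than only when a claim already feels suspicious.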
The goal is not credulity toward institutions but epistemic proportionality: treating uncertain things as uncertain, improbable things as improbable, and confirmed things as confirmed. This is harder than it sounds in an information environment explicitly designed to generate outrage, and it requires ongoing effort and deliberate practice.
The social dimension matters too. Research consistently shows that conspiracy beliefs are more responsive to social influence than to factual argument. Conversations that express genuine curiosity, ask questions rather than presenting counter-evidence, and acknowledge the legitimate concerns that motivate conspiratorial thinking tend to be more effective at shifting beliefs than direct confrontation. This is the same principle that works in cult recovery: maintaining the relationship and the conversation is more valuable than winning any individual argument.
For those working in education, journalism, or public communication, the research clearly supports investment in inoculation-style approaches that build meta-level awareness of manipulation techniques before specific misinformation is encountered. This is the most effective intervention available for populations that have not yet formed strong conspiratorial commitments, and it is considerably more scalable than attempting to correct individual false beliefs after they have formed.
None of this makes individual resistance to conspiracy thinking easy. Human cognition is well-equipped to generate conspiratorial patterns in ambiguous data and poorly equipped to recognise when it is doing so. The most durable protection comes not from a once-acquired critical thinking skill but from ongoing epistemic habits: checking sources as a reflex, maintaining genuine uncertainty when evidence is genuinely uncertain, and remaining curious about why a particular claim is emotionally compelling rather than simply whether it is true. These habits are cultivated over time and require both individual practice and information environments that reward rather than penalise their use.
References
- van Prooijen, J.-W. (2018). 'The Psychology of Conspiracy Theories'. Routledge.
- Brotherton, R. (2015). 'Suspicious Minds: Why We Believe Conspiracy Theories'. Bloomsbury Sigma.
- Shermer, M. (2011). 'The Believing Brain'. Times Books.
- Lewandowsky, S., Oberauer, K., & Gignac, G. E. (2013). NASA faked the moon landing -- therefore (climate) science is a hoax. 'Psychological Science', 24(5), 622-633.
- van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. 'Global Challenges', 1(2), 1600008.
- Douglas, K. M., Sutton, R. M., & Cichocka, A. (2017). The psychology of conspiracy theories. 'Current Directions in Psychological Science', 26(6), 538-542.
- Brotherton, R., & French, C. C. (2014). Belief in conspiracy theories and susceptibility to the conjunction fallacy. 'Applied Cognitive Psychology', 28(2), 238-248.
- Blee, K. M. (2002). 'Inside Organized Racism: Women in the Hate Movement'. University of California Press.
- McGuire, W. J. (1964). Inducing resistance to persuasion. 'Advances in Experimental Social Psychology', 1, 191-229.
- Sunstein, C. R., & Vermeule, A. (2009). Conspiracy theories: Causes and cures. 'Journal of Political Philosophy', 17(2), 202-227.
- Antonetti, P., & Ankers, A. (2016). Exploring conspiracy theory belief as a dispositional trait. 'Personality and Individual Differences', 105, 137-145.
- Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. 'Cognition', 188, 39-50.
Frequently Asked Questions
Are conspiracy theories always wrong?
Not necessarily. Genuine conspiracies exist: Watergate, the Tuskegee syphilis study, and various corporate cover-ups were real conspiracies later verified by evidence. The distinction psychologists draw is between conspiracy theorising as a reasoning process and specific conspiracy theories as claims. The problem is not suspecting that powerful actors sometimes coordinate in secret, which is historically accurate, but the application of a conspiracy framework to every unexplained event regardless of evidence quality. Psychologist Jan-Willem van Prooijen distinguishes between 'healthy scepticism', which is proportional to available evidence, and 'conspiracy mentality', which is a generalised tendency to attribute events to hidden malevolent actors regardless of context.
What is proportionality bias in conspiracy thinking?
Proportionality bias is the intuitive tendency to assume that large events must have large causes. When a significant figure like a president is assassinated, the psychological pressure is toward an explanation of comparable magnitude, a vast conspiracy, rather than a lone, disturbed individual. This bias is not irrational in evolutionary terms: most significant events in ancestral environments were caused by significant agents. But it leads to systematic error in complex modern contexts where large-scale consequences often emerge from small, contingent causes. Researchers have found that proportionality bias is measurably stronger in individuals scoring higher on other measures of conspiracy ideation.
What is 'epistemic anxiety' and how does it drive conspiracy belief?
Epistemic anxiety refers to the distress caused by uncertainty and the inability to explain important events. Psychologists have found that conspiracy beliefs often serve an epistemic function: they replace uncomfortable uncertainty with a clear, if disturbing, explanation. Studies by van Prooijen and colleagues found that anxiety, powerlessness, and unpredictability reliably increased conspiracy ideation in experimental conditions. The paradox is that a disturbing explanation with an identifiable villain can be psychologically preferable to genuine uncertainty, because it implies the world is at least explicable and potentially controllable, even if malevolently so.
What is inoculation theory and does it work against conspiracy theories?
Inoculation theory, developed by William McGuire in the 1960s and extended by Sander van der Linden and colleagues, proposes that pre-emptively exposing people to weakened forms of misinformation and the rhetorical techniques used to spread it can reduce susceptibility to the full-strength version. Research published in 2017 by van der Linden and colleagues found that 'inoculating' participants by explaining manipulation techniques before exposure to conspiracy-style misinformation significantly reduced its persuasive impact. The approach works best before beliefs are formed. Attempting to inoculate people against beliefs they already hold tends to trigger reactance, a defensive doubling down.
Do conspiracy theories cause real harm?
Research documents several categories of measurable harm. Belief in vaccine conspiracy theories is associated with reduced vaccination rates and measurable increases in vaccine-preventable disease outbreaks. Climate conspiracy beliefs reduce support for environmental policy. Medical conspiracy beliefs lead to delays in cancer screening and reduced adherence to treatment protocols. Beyond health outcomes, conspiracy beliefs predict social distrust, reduced civic participation, and in extreme cases have been linked to radicalisation and political violence. Psychologist Karen Douglas has found that exposure to conspiracy theories reduces people's willingness to engage in pro-social behaviours, including voting, even when the conspiracy theory is unrelated to the behaviour in question.