On the morning of November 22, 1963, President John F. Kennedy was shot in Dallas. Within hours, before any official investigation had concluded, alternative explanations were circulating. Within days, they had proliferated into a self-sustaining ecosystem of competing narratives that continues to generate new variants six decades later. The Warren Commission's 1964 conclusion that a single gunman acted alone has never managed to displace the conviction, held by a majority of Americans in most polling conducted since, that a larger conspiracy was responsible. This is striking not because the conspiracy believers are necessarily wrong, but because of what it reveals about how the human mind approaches unexplained events involving powerful actors.

Conspiracy theories are not a marginal pathology. They are a predictable output of cognitive systems that evolved to detect intentional agency, find meaningful patterns, and protect against social betrayal. When those systems operate in high-uncertainty environments with incomplete information and historically justified distrust of authority, conspiracy theorising is not a failure of reason. It is an overdetermined output of normal reasoning processes applied in conditions they were not designed to handle.

The scale of the phenomenon warrants attention. A 2020 study by Cambridge University's Conspiracy and Democracy project, drawing on polling data across seventeen countries, found that belief in at least one conspiracy theory was held by the majority of the adult population in every country surveyed. A 2019 Reuters/Ipsos poll found that 25 percent of Americans believed that "a group of Satan-worshipping elites who run a child sex ring are trying to control our politics and media" — a belief that later became identifiable as a variant of the QAnon conspiracy framework. Conspiracy theories are not fringe phenomena awaiting correction. They are mainstream cognitive events requiring structural explanation.

What researchers have learned over the past two decades is that the question is not really 'why do people believe conspiracy theories'; that question has a relatively clear set of answers. The harder and more important question is what it would take to design information environments and educational frameworks that let people maintain appropriate scepticism toward authority without substituting evidence-free narratives for genuinely uncertain truths.

"Conspiracy theories are not simply false beliefs held by ignorant people. They are the product of motivated reasoning, social identity, and genuine uncertainty, and they cannot be countered by facts alone." — Jan-Willem van Prooijen, 'The Psychology of Conspiracy Theories', 2018


| Psychological Need | How Conspiracy Theories Address It | Research |
| --- | --- | --- |
| Epistemic (understanding) | Simple causal explanation for complex events | Imhoff & Bruder (2014) |
| Existential (security) | Sense of control; identifying threat | Whitson & Galinsky (2008) |
| Social (belonging) | In-group of "those who know the truth" | Cichocka et al. (2016) |
| Narcissistic uniqueness | Feeling special for seeing what others miss | Imhoff & Lamberty (2020) |
| Proportionality bias | Big events must have big causes | Leman & Cinnirella (2007) |
| Distrust of institutions | Conspiracies fill gap when institutions fail | Levitan & Visser (2015) |

Key Definitions

Proportionality bias: The intuitive assumption that events of large significance must have causes of comparable magnitude, leading to resistance against simple or contingent explanations for major events.

Epistemic anxiety: Psychological distress arising from uncertainty and the inability to explain important events, which conspiracy theories can temporarily relieve by providing clear, if false, causal frameworks.

Pattern recognition overdrive: The tendency to perceive meaningful, intentional patterns in random or coincidental data, linked to what cognitive scientists call 'apophenia', the experience of seeing meaningful connections between unrelated things.

Gateway belief: A moderately unusual belief that, once accepted, increases susceptibility to progressively more extreme conspiratorial claims by normalising the conspiratorial reasoning framework.

Inoculation theory: The psychological approach of pre-emptively exposing people to weakened misinformation and the rhetorical techniques used to spread it, to build resistance before full-strength exposure occurs.

Conspiracy mentality: Distinct from belief in any specific conspiracy theory, the conspiracy mentality describes a stable dispositional tendency to interpret events through the lens of hidden malevolent actors, regardless of the specific content of the claim.


The Evolutionary Roots of Conspiratorial Thinking

The cognitive architecture underlying conspiracy belief is not aberrant. It is the product of evolutionary pressures that made pattern detection and agency attribution survival-critical. In ancestral environments, the cost of failing to detect a predator or a hostile social actor was death. The cost of falsely detecting one where none existed was, at most, wasted energy and minor social embarrassment. This asymmetry strongly favoured the evolution of what cognitive scientist Michael Shermer calls 'patternicity', the tendency to find meaningful patterns even in random data.

Alongside patternicity operates what Shermer calls 'agenticity', the tendency to attribute intentional agency to events. Unexplained noises in the darkness, sudden illness, unexpected crop failure: in environments where human and animal agents frequently were responsible for adverse events, the habit of immediately seeking an agent behind every effect was adaptive. These tendencies are not cognitive flaws to be corrected. They are the default settings of a system that worked well for most of human history.
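The cost asymmetry behind this argument can be made concrete with a toy expected-cost calculation. All numbers below are illustrative assumptions, not empirical estimates: a rustle in the grass is a predator with probability p, fleeing wastes a little energy, and ignoring a real predator is catastrophic.

```python
# Toy error-management model. Costs are illustrative assumptions only.
COST_FALSE_ALARM = 1      # fled, but nothing was there
COST_MISS = 1000          # stayed, and the predator was real

def expected_cost(p_predator: float, flee: bool) -> float:
    """Expected cost of one response policy given the predator probability."""
    if flee:
        return (1 - p_predator) * COST_FALSE_ALARM  # only false alarms cost
    return p_predator * COST_MISS                   # only misses cost

# Even when a predator is wildly unlikely (1 in 500), always fleeing wins:
p = 0.002
print(round(expected_cost(p, flee=True), 3))   # 0.998
print(round(expected_cost(p, flee=False), 3))  # 2.0

# The rational trigger threshold is tiny: flee whenever
# p > COST_FALSE_ALARM / (COST_FALSE_ALARM + COST_MISS)
threshold = COST_FALSE_ALARM / (COST_FALSE_ALARM + COST_MISS)
print(round(threshold, 4))                     # 0.001
```

Under any cost ratio this steep, the optimal detector fires on the faintest evidence, which is exactly a system tuned to over-detect patterns and agents.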

The problem is that these settings are poorly calibrated for a world in which genuinely random processes, complex emergent phenomena, and the uncoordinated actions of millions of independent actors produce large-scale consequences that superficially resemble the fingerprints of intentional planning. Global financial crises, pandemics, and climate change are not produced by conspiracies, but they have many of the surface features that agency-detecting systems associate with intentional coordination.

Research by Stewart, Plotkin, and Mann (2021) attempted to model the evolutionary dynamics of conspiracy belief directly. Their simulation found that in social environments with moderate levels of real deception by powerful actors, heuristics that produce false positives (seeing conspiracies where none exist) conferred fitness advantages over more calibrated belief-updating strategies. The implication is sobering: conspiracy ideation is not merely a bug in the cognitive system; in certain social environments, it may have been a feature.

Jan-Willem van Prooijen and the Conspiracy Mentality

Dutch psychologist Jan-Willem van Prooijen has conducted some of the most rigorous empirical research on the psychology of conspiracy belief. His work, summarised in his 2018 book 'The Psychology of Conspiracy Theories', identifies several consistent predictors of conspiracy ideation that cut across specific belief content.

Van Prooijen has found that conspiracy ideation is not primarily a function of specific beliefs but of a generalised 'conspiracy mentality', a stable tendency to explain events through the lens of hidden malevolent actors regardless of evidence quality. This mentality predicts belief in mutually contradictory conspiracy theories simultaneously, which is one of the most diagnostic features of conspiracy ideation as opposed to genuine sceptical inquiry. In one frequently cited study, participants who endorsed one conspiracy theory were significantly more likely to endorse a second theory that was logically incompatible with the first.

His research identifies two primary psychological drivers. The first is epistemic: conspiracy beliefs respond to the need for cognitive closure, the uncomfortable feeling of not knowing. Events that are ambiguous, complex, or genuinely uncertain create pressure toward any explanation that resolves the ambiguity, even a disturbing one. The second driver is existential: conspiracy beliefs respond to feelings of powerlessness and threat by identifying a responsible agent. If bad things are happening because of a conspiracy, the world is at least explicable, and potentially the conspiracy could be stopped.

Van Prooijen and colleagues have found in experimental studies that inducing feelings of anxiety, unpredictability, or low personal control reliably increases conspiracy ideation, even when the conspiracy theory being measured is unrelated to the induced stress. This suggests that conspiracy belief functions partly as a coping mechanism for existential distress.

A particularly striking finding from van Prooijen's research is the relationship between conspiracy belief and societal upheaval. In studies conducted across multiple countries, conspiracy beliefs increased measurably following major collective stressors — economic crises, political instability, public health emergencies. This pattern was observed in real-time during the COVID-19 pandemic: Roozenbeek et al. (2020) found a rapid proliferation of COVID-related conspiracy theories in the first three months of the pandemic, with belief rates significantly higher in populations experiencing greater economic disruption.

Proportionality Bias: When Scale Misleads

One of the most well-documented contributors to conspiracy belief is what psychologist Rob Brotherton, author of 'Suspicious Minds' (2015), calls proportionality bias. The Kennedy assassination is its canonical example. At a cognitive level, an event of the magnitude of a presidential assassination demands an explanation of corresponding scale. A single, politically marginal gunman is not, for most people, a satisfying explanation. The disproportion feels wrong.

This bias is compounded by a general neglect of how often small or contingent causes produce large effects. History is full of large consequences produced by small causes. The assassination of Archduke Franz Ferdinand, which triggered World War I, was attempted and failed earlier that same morning before succeeding by chance. The stock market crash of 1929 was produced not by a master plan but by the cascading interactions of thousands of individual panics. The human cognitive system is poorly equipped to accept these kinds of causal structures.

Research by Brotherton and Christopher French (2014) found that proportionality bias was measurably stronger in individuals who scored higher on a generalised conspiracy belief scale, suggesting that it is not merely a situational response to specific events but part of a broader cognitive style.

Leman and Cinnirella (2007) demonstrated proportionality bias experimentally by presenting participants with scenarios in which a large cause (a conspiracy) or a small cause (an individual acting alone) produced either a large or small outcome. Participants rated the large-cause explanation as more credible when the outcome was large, regardless of any additional evidence — a direct measure of the bias operating independently of evidence evaluation.

Social Identity and the Group Function of Conspiracy Beliefs

Beyond individual cognition, conspiracy theories perform important social functions. Research by Manos Antonakis and colleagues has found that conspiracy belief is strongly linked to social identity, specifically to the conviction that one's in-group is under threat from a powerful out-group. Conspiracy theories provide a coherent narrative for that threat and, critically, a flattering account of the in-group as perceptive truth-seekers surrounded by the deceived masses.

This social function explains a pattern that purely epistemic accounts struggle to address: the fact that conspiracy beliefs are often highly resistant to evidence, including evidence that conspiracy believers themselves demanded. If conspiracy belief were purely a function of seeking information under uncertainty, it should diminish as information becomes available. In practice, new evidence against a conspiracy theory is frequently incorporated into the theory as further evidence of the conspiracy's reach.

Sociologist Kathleen Blee, who has conducted long-term interviews with members of extremist movements, has documented how conspiracy beliefs function as social glue within communities, creating shared enemies, shared knowledge, and shared identity. Leaving the conspiracy framework means, in practice, leaving the community, a cost that most people's epistemic commitments cannot match.

Cichocka, Marchlewska, and Golec de Zavala (2016) found that collective narcissism — the belief that one's group is exceptional and insufficiently recognised — was a strong predictor of conspiracy belief across multiple national samples. Groups that feel simultaneously superior and victimised are particularly receptive to conspiracy narratives, which provide an explanation for the gap between their perceived greatness and their experienced status. This finding has implications for the political contexts in which conspiracy beliefs flourish: populist movements that combine claims of exceptional group identity with narratives of elite persecution consistently show elevated conspiracy belief among their supporters.

Gateway Beliefs and the Escalation Dynamic

Researchers have identified what they call 'gateway beliefs', moderately unusual claims that serve as entry points into a more encompassing conspiratorial worldview. A 2013 study by Stephan Lewandowsky and colleagues found that belief in one conspiracy theory is a strong predictor of belief in others, including theories that contradict each other, suggesting that the conspiracy framework itself, rather than any specific claim, is what spreads.

Gateway beliefs tend to share certain features. They invoke verifiable instances of real institutional wrongdoing, they are difficult to definitively refute without specialised knowledge, and they carry low social cost, since they are widely enough held that believing them does not mark someone as extreme. Once someone has accepted a gateway belief, the reasoning structures used to arrive at it (pattern detection, institutional distrust, resistance to official explanations) apply equally to more extreme claims.

This escalation dynamic is particularly relevant to understanding online radicalisation pathways. Research by Moonshot CVE and others has documented how recommendation algorithms on video platforms routinely direct users from moderately sceptical political content toward progressively more extreme conspiratorial material, because the engagement signals generated by emotionally provocative content are interpreted as interest signals by recommendation systems.

Ribeiro et al. (2020), auditing YouTube's recommendation system through large-scale analysis of channels, videos, and recommendation data, found that the platform's algorithm systematically linked moderate political content to more extreme content in its recommendation queue. Users who began with mainstream political commentary were algorithmically guided toward increasingly extreme material across sessions. This is not evidence of intentional radicalisation by platform designers; it is evidence that engagement-maximisation algorithms, operating without explicit goals about belief formation, produce radicalisation as an emergent side effect.
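The emergent-drift mechanism can be sketched with a toy greedy recommender. Everything below is a hypothetical model, not any real platform's system: items carry an invented "extremity" score, and the click model simply assumes that more provocative content draws more engagement.

```python
import random

random.seed(0)

# Hypothetical catalogue: five items with an "extremity" score in [0, 1].
ITEMS = [0.1, 0.3, 0.5, 0.7, 0.9]

def click(extremity: float) -> bool:
    # Modelling assumption: provocative content draws more engagement.
    return random.random() < 0.2 + 0.6 * extremity

shows = [0] * len(ITEMS)
clicks = [0] * len(ITEMS)

for step in range(5000):
    if step < 100 * len(ITEMS):
        i = step % len(ITEMS)  # brief uniform exploration phase
    else:
        # Pure engagement maximisation: recommend the item with the best
        # observed click rate. The objective never mentions content.
        i = max(range(len(ITEMS)), key=lambda j: clicks[j] / shows[j])
    shows[i] += 1
    clicks[i] += click(ITEMS[i])

most_shown = max(range(len(ITEMS)), key=lambda j: shows[j])
print("most-recommended extremity:", ITEMS[most_shown])
```

Nothing in the loop encodes a preference for extreme material; the recommender converges on the most provocative items purely because engagement is read as interest, which is the structural point the audit research makes.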

Inoculation Theory: Building Resistance

The most promising intervention research involves what psychologists call 'prebunking' or inoculation. Drawing on William McGuire's 1960s work on resistance to persuasion, Sander van der Linden at Cambridge University and colleagues have developed and tested inoculation-based approaches to conspiracy misinformation.

The approach works by exposing people, before they encounter specific misinformation, to the general rhetorical and logical techniques used to spread it. Rather than attempting to correct specific false beliefs, inoculation builds meta-level awareness of manipulation techniques. A 2017 study by van der Linden, Lewandowsky, and colleagues found that a brief 'inoculation' message describing the techniques used to misrepresent scientific consensus significantly reduced the persuasive impact of a subsequent misleading message about climate science.

The Bad News game, developed by van der Linden's group, puts players in the role of a fake news producer, requiring them to use the techniques of conspiracy-style content creation. Studies of this game found measurable increases in participants' ability to identify misinformation techniques in novel contexts.

A 2022 collaboration between van der Linden's group and Google resulted in the deployment of pre-bunking video advertisements shown before YouTube videos. The study, published in Science Advances, found that exposure to short inoculation videos (between 1 and 2 minutes) increased viewers' ability to identify manipulation techniques in subsequent unrelated content. The effect was observed across partisan lines and persisted in follow-up testing three months later — unusually durable for a brief intervention.

The critical limitation of inoculation is timing. Once beliefs are formed, attempting to counter them directly tends to trigger psychological reactance, a motivated increase in belief intensity. Inoculation works before or during belief formation, not after. This is why researchers in this field have increasingly focused on educational interventions for young people rather than on persuading adults who already hold entrenched beliefs.

The Role of Genuine Institutional Failure

One factor that is often underweighted in discussions of conspiracy theories is the role of genuine institutional failures in creating fertile ground for them. Governments have, historically, conducted genuine conspiracies. The Tuskegee study ran for forty years. The CIA's MKULTRA program was real. The tobacco industry's decades-long campaign to obscure the evidence linking smoking to cancer was a sophisticated, coordinated deception. These are not conspiracy theories. They are documented historical facts.

Psychologist Karen Douglas has argued that dismissing conspiracy theories wholesale as irrational, without acknowledging the genuine failures of transparency that create justified scepticism, is itself epistemically dishonest and counterproductive. It conflates the valid insight that authorities sometimes lie with the invalid inference that authorities always lie.

Uscinski and Parent's 2014 research on historical patterns of conspiracy belief, drawing on a database of letters to the New York Times editor spanning 1890 to 2010, found that conspiracy theory prevalence rises in periods of political crisis and declines in periods of stable, effective governance. This historical pattern suggests that institutional performance is not merely a correlate of conspiracy belief but a genuine cause: institutions that routinely fail their stated purposes create justified grounds for scepticism that get expressed through conspiratorial frameworks when specific explanations are not available.

Building genuine institutional transparency, including rigorous whistleblower protections, independent investigative journalism, and accessible scientific communication, is not merely a governance issue but a public health measure. The information ecosystem in which conspiracy theories thrive is partly a consequence of institutions that have, repeatedly and sometimes deliberately, failed to deserve the trust they demand.

Extremism and the Conspiracy Pathway

Research has increasingly documented the relationship between conspiracy belief and political extremism. While conspiracy beliefs do not automatically lead to radicalisation, they appear to lower the psychological barriers to accepting extreme positions and extreme actions by establishing a framework in which conventional political and social norms are already understood as corrupt tools of a malevolent elite.

John Horgan at the University of Massachusetts Lowell, one of the leading researchers on terrorist radicalisation, has found that conspiracy beliefs are a consistent feature of radicalisation pathways across ideologically diverse extremist groups. The mechanism is not that conspiracy beliefs cause radicalisation but that they perform a specific preparatory function: they establish that the stakes are existentially high, that normal political processes are compromised or useless, that the in-group is engaged in a fight for survival against a deceptive, powerful enemy, and that special knowledge and decisive action are required. These elements are consistent prerequisites for the psychological move from discontent to violence.

The online radicalisation research of J.M. Berger, documented in his 2018 book 'Extremism', found that conspiracy content functioned as a reliable recruiting and retention mechanism for extremist communities precisely because of the social identity dynamics it activates: the shared knowledge creates in-group solidarity, the shared enemy creates focus and urgency, and the belief that mainstream culture is systematically deceiving ordinary people creates a justification for seeking alternative communities.

The January 6, 2021 US Capitol riot provided a real-world case study in the extremism-conspiracy connection. Research by the Program on Extremism at George Washington University, examining the social media activity and stated motivations of arrested participants, found that adherence to QAnon-related beliefs was present in approximately 20 percent of those charged. More broadly, belief that the 2020 election was stolen, a claim promoted through channels that shared structural features with conspiracy communication, was essentially universal among participants, demonstrating how conspiracy frameworks can catalyse collective action without formal organisational structures.

Research on deradicalisation consistently emphasises that simply countering false claims is insufficient once this level of psychological investment has been made. The social and identity needs that extremist communities meet must be addressed through alternative communities and sources of meaning. This parallels the cult research finding that confrontational debunking is less effective than maintaining relationships and addressing underlying needs.

The Digital Information Environment and Conspiracy Acceleration

The structural features of contemporary digital media have created conditions that systematically amplify conspiracy content. Attention-based business models that optimise for emotional engagement, recommendation systems that exploit pattern-seeking tendencies, and the zero marginal cost of producing and distributing misinformation have combined to produce an environment qualitatively different from any previous media ecosystem.

Vosoughi, Roy, and Aral's landmark 2018 study in Science, analysing 126,000 news stories shared on Twitter over ten years, found that false news spread faster, deeper, and more broadly than true news across all categories examined. False political news showed the most dramatic differential, spreading approximately three times faster than true political news. The mechanism was emotional novelty: false stories were more likely to generate responses of surprise and disgust, which are high-arousal states associated with increased sharing behaviour. The platform did not design this outcome, but its architecture selected for it.
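Why a modest edge in shareability produces a disproportionate edge in reach can be illustrated with a back-of-envelope branching-process model. The reshare intensities below are invented for illustration; only the formula E[size] = 1 / (1 - m) for a subcritical cascade is standard.

```python
def expected_cascade_size(m: float) -> float:
    """Expected total reach of a cascade in which each viewer triggers m
    further views on average (standard subcritical branching-process result)."""
    if not 0.0 <= m < 1.0:
        raise ValueError("m must be in [0, 1) for a finite expected cascade")
    return 1.0 / (1.0 - m)

# Assumed reshare intensities -- illustrative, not measured values.
m_true, m_false = 0.6, 0.9

size_true = expected_cascade_size(m_true)    # ~2.5 people reached
size_false = expected_cascade_size(m_false)  # ~10 people reached

# A 1.5x edge in per-exposure resharing yields roughly a 4x edge in reach.
print(round(size_false / size_true, 2))      # 4.0
```

The nonlinearity near m = 1 is the key design insight: small per-exposure advantages in arousal-driven sharing compound into large differences in total spread, without any coordination.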

Pennycook and Rand (2019), in their work on susceptibility to partisan fake news, found that analytical thinking — measured by Cognitive Reflection Test scores — was associated with lower susceptibility to false news across partisan lines. Importantly, this was not because analytical thinkers were sceptical of all news, but because they were better at discriminating true from false content. The implication: interventions that increase analytical engagement with news content, rather than simply increasing scepticism, are more likely to improve information quality without producing generalised cynicism.

Practical Takeaways

Understanding the psychology of conspiracy belief does not require abandoning scepticism. It requires calibrating scepticism to evidence. Some questions to apply to any claim: Does the evidence come from multiple independent sources? Does the explanation account for the known evidence more simply than alternatives? Does the theory require an implausibly large, leak-proof network of actors? Are the people promoting the theory financially or socially incentivised to promote it?

The goal is not credulity toward institutions but epistemic proportionality: treating uncertain things as uncertain, improbable things as improbable, and confirmed things as confirmed. This is harder than it sounds in an information environment explicitly designed to generate outrage, and it requires ongoing effort and deliberate practice.

The social dimension matters too. Research consistently shows that conspiracy beliefs are more responsive to social influence than to factual argument. Conversations that express genuine curiosity, ask questions rather than presenting counter-evidence, and acknowledge the legitimate concerns that motivate conspiratorial thinking tend to be more effective at shifting beliefs than direct confrontation. This is the same principle that works in cult recovery: maintaining the relationship and the conversation is more valuable than winning any individual argument.

For those working in education, journalism, or public communication, the research clearly supports investment in inoculation-style approaches that build meta-level awareness of manipulation techniques before specific misinformation is encountered. This is the most effective intervention available for populations that have not yet formed strong conspiratorial commitments, and it is considerably more scalable than attempting to correct individual false beliefs after they have formed.

None of this makes individual resistance to conspiracy thinking easy. Human cognition is well-equipped to generate conspiratorial patterns in ambiguous data and poorly equipped to recognise when it is doing so. The most durable protection comes not from a once-acquired critical thinking skill but from ongoing epistemic habits: checking sources as a reflex, maintaining genuine uncertainty when evidence is genuinely uncertain, and remaining curious about why a particular claim is emotionally compelling rather than simply whether it is true. These habits are cultivated over time and require both individual practice and information environments that reward rather than penalise their use.


References

  1. van Prooijen, J.-W. (2018). 'The Psychology of Conspiracy Theories'. Routledge.
  2. Brotherton, R. (2015). 'Suspicious Minds: Why We Believe Conspiracy Theories'. Bloomsbury Sigma.
  3. Shermer, M. (2011). 'The Believing Brain'. Times Books.
  4. Lewandowsky, S., Oberauer, K., & Gignac, G. E. (2013). NASA faked the moon landing — therefore (climate) science is a hoax. 'Psychological Science', 24(5), 622-633.
  5. van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. 'Global Challenges', 1(2), 1600008.
  6. Douglas, K. M., Sutton, R. M., & Cichocka, A. (2017). The psychology of conspiracy theories. 'Current Directions in Psychological Science', 26(6), 538-542.
  7. Brotherton, R., & French, C. C. (2014). Belief in conspiracy theories and susceptibility to the conjunction fallacy. 'Applied Cognitive Psychology', 28(2), 238-248.
  8. Blee, K. M. (2002). 'Inside Organized Racism: Women in the Hate Movement'. University of California Press.
  9. McGuire, W. J. (1964). Inducing resistance to persuasion. 'Advances in Experimental Social Psychology', 1, 191-229.
  10. Sunstein, C. R., & Vermeule, A. (2009). Conspiracy theories: Causes and cures. 'Journal of Political Philosophy', 17(2), 202-227.
  11. Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. 'Cognition', 188, 39-50.
  12. Cichocka, A., Marchlewska, M., & Golec de Zavala, A. (2016). Does self-love or self-hate predict conspiracy beliefs? Narcissism, self-esteem, and the endorsement of conspiracy theories. 'Social Psychological and Personality Science', 7(2), 157-166.
  13. Leman, P. J., & Cinnirella, M. (2007). A major event has a major cause: Evidence for the role of heuristics in reasoning about conspiracy theories. 'Social Psychological Review', 9(2), 18-28.
  14. Ribeiro, M. H., et al. (2020). Auditing radicalization pathways on YouTube. 'Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency', 131-141.
  15. Roozenbeek, J., et al. (2020). Susceptibility to misinformation about COVID-19 across 26 countries. 'Royal Society Open Science', 7(10), 201199.
  16. Stewart, A. J., Plotkin, J. B., & Mann, R. P. (2021). Conspiracy theories as quasi-religious ideation. 'Psychological Review', 128(1), 1-19.
  17. Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. 'Science', 359(6380), 1146-1151.
  18. Uscinski, J. E., & Parent, J. M. (2014). 'American Conspiracy Theories'. Oxford University Press.
  19. Berger, J. M. (2018). 'Extremism'. MIT Press.
  20. Imhoff, R., & Lamberty, P. (2020). A conspiracy mind: A comparison of conspiracy believers and non-believers. 'European Journal of Social Psychology', 50(3), 540-552.

Frequently Asked Questions

Are conspiracy theories always wrong?

Not necessarily. Genuine conspiracies exist: Watergate, the Tuskegee syphilis study, and various corporate cover-ups were real conspiracies later verified by evidence. The distinction psychologists draw is between conspiracy theorising as a reasoning process and specific conspiracy theories as claims. The problem is not suspecting that powerful actors sometimes coordinate in secret, which is historically accurate, but the application of a conspiracy framework to every unexplained event regardless of evidence quality. Psychologist Jan-Willem van Prooijen distinguishes between 'healthy scepticism', which is proportional to available evidence, and 'conspiracy mentality', which is a generalised tendency to attribute events to hidden malevolent actors regardless of context.

What is proportionality bias in conspiracy thinking?

Proportionality bias is the intuitive tendency to assume that large events must have large causes. When a significant figure like a president is assassinated, the psychological pressure is toward an explanation of comparable magnitude, a vast conspiracy, rather than a lone, disturbed individual. This bias is not irrational in evolutionary terms: most significant events in ancestral environments were caused by significant agents. But it leads to systematic error in complex modern contexts where large-scale consequences often emerge from small, contingent causes. Researchers have found that proportionality bias is measurably stronger in individuals scoring higher on other measures of conspiracy ideation.

What is 'epistemic anxiety' and how does it drive conspiracy belief?

Epistemic anxiety refers to the distress caused by uncertainty and the inability to explain important events. Psychologists have found that conspiracy beliefs often serve an epistemic function: they replace uncomfortable uncertainty with a clear, if disturbing, explanation. Studies by van Prooijen and colleagues found that anxiety, powerlessness, and unpredictability reliably increased conspiracy ideation in experimental conditions. The paradox is that a disturbing explanation with an identifiable villain can be psychologically preferable to genuine uncertainty, because it implies the world is at least explicable and potentially controllable, even if malevolently so.

What is inoculation theory and does it work against conspiracy theories?

Inoculation theory, developed by William McGuire in the 1960s and extended by Sander van der Linden and colleagues, proposes that pre-emptively exposing people to weakened forms of misinformation and the rhetorical techniques used to spread it can reduce susceptibility to the full-strength version. Research published in 2017 by van der Linden and colleagues found that 'inoculating' participants by explaining manipulation techniques before exposure to conspiracy-style misinformation significantly reduced its persuasive impact. The approach works best before beliefs are formed. Attempting to inoculate people against beliefs they already hold tends to trigger reactance, a defensive doubling down.

Do conspiracy theories cause real harm?

Research documents several categories of measurable harm. Belief in vaccine conspiracy theories is associated with reduced vaccination rates and measurable increases in vaccine-preventable disease outbreaks. Climate conspiracy beliefs reduce support for environmental policy. Medical conspiracy beliefs lead to delays in cancer screening and reduced adherence to treatment protocols. Beyond health outcomes, conspiracy beliefs predict social distrust, reduced civic participation, and in extreme cases have been linked to radicalisation and political violence. Psychologist Karen Douglas has found that exposure to conspiracy theories reduces people's willingness to engage in pro-social behaviours, including voting, even when the conspiracy theory is unrelated to the behaviour in question.