In April 1917, the United States entered World War I. Eleven years later, in 1928, a public relations counselor named Edward Bernays published a book called Propaganda in which he reflected on the work his uncle — Sigmund Freud — had done in understanding the unconscious, and what it meant for mass communication.
Bernays had been 25 when the US entered the war, working for the US Committee on Public Information — the propaganda bureau created by President Wilson and chaired by journalist George Creel. The CPI's task was transforming a reluctant, isolationist American public into enthusiastic supporters of a European war. In less than two years, it succeeded. Liberty Loan drives were oversubscribed. Volunteer enlistment ran beyond projections. German-Americans were subjected to harassment, their language banned from schools, their sauerkraut renamed "liberty cabbage." Four Minute Men delivered pre-scripted speeches in theaters across the country. Posters showed German soldiers bayoneting babies.
Bernays drew a disturbing conclusion from his experience: if propaganda could do this — if it could transform a population's beliefs and behaviors in two years — then what it had accomplished was not an exception. It was a template. "The conscious and intelligent manipulation of the organized habits and opinions of the masses," he wrote, "is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power."
His observation was not a warning. It was a business proposal. Bernays became the founder of modern public relations, applying wartime propaganda techniques to consumer products and political causes. The century that followed demonstrated how right he was.
"Propaganda does not deceive people; it merely helps them to deceive themselves." — Eric Hoffer, The True Believer (1951)
Key Definitions
Propaganda — Systematic communication designed to shape attitudes, beliefs, and behaviors toward predetermined ends through the manipulation of emotional responses, cognitive shortcuts, and social identities, typically while concealing the manipulative intent. Propaganda bypasses rational deliberation rather than engaging it.
Persuasion — Legitimate influence that provides accurate information, valid arguments, and engages the recipient's rational faculties. The distinction from propaganda: persuasion works by enabling better-informed choice; propaganda works by distorting the conditions for choice.
Manufacturing consent — Walter Lippmann's phrase (later developed by Noam Chomsky and Edward Herman) describing how mass media, controlled or shaped by powerful interests, creates the appearance of public agreement with policies that serve elite interests. The public "consents" but does so on the basis of filtered information.
Disinformation — Intentionally false or misleading information disseminated to deceive. Distinct from misinformation (unintentionally false information). The modern "infodemic" involves both disinformation campaigns (deliberate state and non-state actors) and organic spread of misinformation through social networks.
Active measures — Soviet (and now Russian) terminology for covert operations designed to influence foreign countries' internal political processes: disinformation campaigns, agent networks, front organizations, and strategic leaks. The Internet Research Agency's 2016 US election interference was a contemporary active measures operation.
Big Lie (große Lüge) — A propaganda technique attributed to Hitler in Mein Kampf: a lie so audacious that people would not believe that anyone could "so infamously misrepresent the truth." The psychological insight: people who tell small lies assume others also lie; a massive lie may actually be more credible because it seems too outrageous to fabricate.
Astroturfing — Creating the appearance of grassroots popular support for a position or cause that is actually funded and coordinated by a centralized actor (typically a corporation, government, or political campaign). Named for AstroTurf artificial grass — fake grassroots. Social media enables sophisticated astroturfing at scale: networks of fake accounts create the illusion of widespread organic opinion.
Illusory truth effect — Repeated exposure to a claim increases its perceived truthfulness, regardless of whether the claim was originally flagged as false. The mechanism: familiarity (cognitive fluency) feels like truth — familiar things are easier to process, and ease of processing is mistaken for evidence of truth. Propaganda relies heavily on repetition.
Emotional framing — Presenting information in ways that prime particular emotional responses, directing interpretation before analytical processing begins. The same statistics presented as "90% survival rate" vs. "10% mortality rate" activate different emotional responses and produce different decisions (framing effect). Propaganda systematically exploits emotional framing.
Dehumanization — Representing a group as less than fully human — as animals, vermin, disease, or subhuman categories. Dehumanization removes the psychological inhibitions against harming members of the target group. Documented in genocides from the Nazis to Rwanda (Tutsis called "cockroaches" on Radio Télévision Libre des Mille Collines before and during the genocide).
Tribal epistemology — Evaluating the truth of claims based on the source's group identity rather than the content's evidence. In a tribal epistemological environment, the same fact is accepted or rejected based on whether it seems to serve one's in-group. Effective propaganda exploits tribal epistemology by signaling group membership and attacking out-group credibility.
The History of Modern Propaganda
World War I: The Industrial Birth
Before World War I, mass persuasion was limited by communication technology — pamphlets, newspapers, limited mass media. The war created both the motivation (mobilizing industrial-scale populations for industrial-scale killing) and the infrastructure (new print media, early film, centralized information agencies) for systematic propaganda.
Britain's Wellington House (1914): A secret propaganda bureau, deliberately obscuring its governmental nature, that produced books, pamphlets, and press releases targeting neutral opinion — especially American. When the Lusitania was sunk in 1915 with 1,198 passengers and crew killed, Wellington House's amplification of the story helped shift American opinion. Atrocity stories — German soldiers bayoneting Belgian babies — were published even when they could not be confirmed.
The Creel Committee (US, 1917): The Committee on Public Information, established a week after Congress declared war, was the first American propaganda agency. Its roughly 75,000 volunteer speakers, millions of pamphlets, and poster campaigns transformed American public opinion within months. Its chairman, George Creel, later wrote a celebratory account: How We Advertised America.
The techniques standardized in WWI — emotional appeals over facts, simple enemy images, appeals to patriotism and fear, suppression of dissent — became the template for all subsequent large-scale propaganda operations.
Nazi Germany: Total Media Control
Joseph Goebbels, appointed Reich Minister of Public Enlightenment and Propaganda in 1933, oversaw the most extensive and intensively studied propaganda apparatus in history. Goebbels understood that propaganda must be felt, not merely understood.
Key features of the Nazi propaganda system:
Total media control: All newspapers, radio, film, and theater required Reich Chamber of Culture membership (denied to Jews and political opponents). Content could not diverge from official messages.
Repetition and simplicity: Complex political realities were reduced to simple slogans and images repeated endlessly. "Ein Volk, Ein Reich, Ein Führer" could be understood by anyone and printed on anything.
Spectacle: The Nuremberg rallies were theatrical productions — choreographed light shows, coordinated crowd movements, architecture scaled to overwhelm individual identity. Leni Riefenstahl's Triumph of the Will (1935) is studied as both propaganda and filmmaking.
Emotional saturation: Every medium was directed at emotional response, not rational persuasion. Fear of enemies (Jews, Communists, foreign powers), pride in German identity, and the visceral excitement of belonging to a mass movement were the emotional substrates.
Dehumanization: Jewish people were systematically represented as a disease, a plague, vermin — imagery present in children's textbooks, newspapers, and films. This dehumanization was a prerequisite for the Holocaust; people will not murder their neighbors without first being taught to see them as not fully human.
The Cold War: Competing Information Regimes
Both superpowers built massive propaganda apparatuses, though they differed in form.
American soft power: Voice of America (1942), Radio Free Europe (1949), and Radio Liberty broadcast news and Western culture into communist bloc countries. The US Information Agency operated libraries and cultural centers worldwide. American consumer culture — Hollywood films, blue jeans, rock music — served as organic propaganda for the American way of life.
Soviet active measures: The KGB's Service A specialized in "active measures" — a more offensive propaganda strategy involving document forgeries, front organizations, agents of influence, and strategic disinformation. The "AIDS was created by the CIA" story originated as Soviet disinformation in 1983, as documented in the Mitrokhin Archive. The AIDS disinformation spread through African and developing-world media and influenced public health responses.
The two approaches differed in scale and methods, but both operated on the principle that controlling what populations believe about international affairs is a form of power.
The Psychological Mechanisms
Fear Appeals
Fear is the most effective single emotion for behavior change — and for suppressing analytical processing. When people are afraid, they seek certainty, authority, and in-group protection. Ambiguity becomes intolerable. Critical evaluation of claims requires cognitive resources that fear deploys elsewhere.
Effective fear appeals follow a specific structure (Witte's Extended Parallel Process Model): identify a threat (your children are in danger), establish its severity and personal relevance (this is happening now, to people like you), then offer a specific, feasible response (do this to be safe). If fear is aroused without a clear, achievable response, it produces defensive avoidance — people avoid thinking about the threat at all.
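The model's core prediction can be sketched as a small decision function. This is a toy encoding for illustration only — the threshold value and 0–1 scales are invented, not drawn from Witte's papers:

```python
# Toy encoding of Witte's Extended Parallel Process Model (EPPM).
# Scales and the 0.5 threshold are illustrative assumptions.

def eppm_response(perceived_threat: float, perceived_efficacy: float,
                  threshold: float = 0.5) -> str:
    """Predict the dominant process triggered by a fear appeal.

    perceived_threat: combined severity + susceptibility, 0..1
    perceived_efficacy: combined response efficacy + self-efficacy, 0..1
    """
    if perceived_threat < threshold:
        return "no response"      # threat too weak to motivate any processing
    if perceived_efficacy >= threshold:
        return "danger control"   # adaptive: adopt the recommended response
    return "fear control"         # maladaptive: denial, defensive avoidance

print(eppm_response(0.9, 0.8))  # high threat + clear response -> danger control
print(eppm_response(0.9, 0.2))  # high threat, no workable response -> fear control
```

The propagandist's move, in these terms, is to inflate `perceived_threat` while making their own preferred action the only response with high `perceived_efficacy`.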
Propaganda routinely manufactures or exaggerates threats while offering the propagandist's preferred response as the solution.
Us vs. Them Framing
Henri Tajfel's Social Identity Theory demonstrates that humans categorize themselves into groups and derive self-esteem from their group's comparative status. This in-group/out-group psychology is near-universal. Once a group distinction is established, people:
- Favor in-group members even when group assignment is arbitrary (minimal group paradigm)
- Perceive out-group members as more similar to each other (out-group homogeneity)
- Judge identical behaviors more harshly when performed by out-group members
- Accept negative information about out-groups more uncritically
Propaganda systematically exploits this psychology: establish a clear in-group identity (true Americans, real Germans, the people), define an out-group enemy, and frame all events as reflecting on the in-group/out-group conflict. Once the tribal frame is in place, information is processed through it rather than evaluated independently.
The Illusory Truth Effect
In a 1977 study by Hasher, Goldstein, and Toppino, people rated statements they had seen before as more likely to be true than new statements. Subsequent research has refined this: repeated exposure to false claims increases their perceived truth, even when the claims were flagged as false on first exposure, and particularly in low-knowledge domains where people cannot easily check them.
The implication for propaganda: lie repeatedly and confidently, and a proportion of the audience will gradually shift toward believing the lie — not through persuasion, but through the psychological mechanism of fluency mistaken for truth.
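The repetition dynamic can be caricatured in a few lines of code. The functional form and constants below are invented purely to illustrate the qualitative shape — ratings rise with exposure and saturate — not to model any published data:

```python
# Toy illustration of the illusory truth effect: perceived truth modeled
# as rising with processing fluency, which grows with repeated exposure.
# All constants are illustrative assumptions, not empirical estimates.
import math

def perceived_truth(exposures: int, base_rating: float = 0.4,
                    fluency_gain: float = 0.15) -> float:
    """Return a 0..1 'truth rating' that rises with repetition and saturates."""
    fluency = 1 - math.exp(-fluency_gain * exposures)  # saturating familiarity
    return min(1.0, base_rating + (1 - base_rating) * 0.5 * fluency)

for n in (0, 1, 5, 20):
    print(n, round(perceived_truth(n), 2))  # rating climbs, then plateaus
```

The qualitative point is the monotone, saturating curve: each repetition nudges the rating upward regardless of the claim's actual truth value, which is why mere message volume is itself a propaganda asset.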
This is the mechanism behind "firehose of falsehood" disinformation strategies (associated with contemporary Russian information operations): flooding the information environment with contradictory claims produces not belief in any specific claim, but generalized epistemic confusion — if everything might be propaganda, nothing can be trusted.
Motivated Reasoning and Tribal Epistemology
Propaganda is most effective when it aligns with existing values, identities, and interests. People are not passive recipients of information; they are active processors with strong motivated reasoning tendencies.
Research by Kahan, Peters, and colleagues (Cultural Cognition project) demonstrates that people evaluate scientific evidence through the lens of their cultural values: the same data on climate change, gun control, or nuclear power produces different evaluations depending on whether the conclusion would threaten or confirm the person's cultural worldview. Crucially, this effect is stronger among more scientifically literate individuals — they are better at generating reasons to discount unwelcome evidence.
Propaganda that aligns with this dynamic — that frames issues in cultural identity terms, that signals tribal membership — is more effective than propaganda that tries to argue against tribal identity.
Modern Propaganda: Digital Amplification
Social Media Affordances
Contemporary information manipulation operates in an environment of unprecedented scale, targeting precision, and organic amplification:
Micro-targeting: Cambridge Analytica's use of Facebook data to build psychographic profiles (based on the OCEAN personality model) for targeted political advertising aimed at a new level of persuasion precision — different messages for different psychological profiles, at scale. How much the profiling actually improved persuasion remains contested.
Troll farms and astroturfing: Russia's Internet Research Agency employed hundreds of people creating fake American social media accounts, Facebook groups, and Twitter profiles that posed as domestic American activists — Black Lives Matter groups, pro-gun groups, anti-immigration groups — amplifying social divisions. Some fake groups organized real-world rallies attended by real Americans who had no idea they were acting on foreign direction.
Algorithmic amplification: Social media algorithms' tendency to amplify emotionally engaging content means that disinformation — typically more emotionally provocative than careful reporting — receives organic amplification without any coordination. Bad actors need only create content that exploits the algorithm's engagement-maximizing logic.
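The amplification logic can be sketched as a minimal feed ranker. The scoring rule and weights below are hypothetical — real platforms use far more complex, undisclosed models — but the structural point holds: if ranking maximizes predicted engagement, and provocative content earns more engagement, provocative content wins placement with no coordination required:

```python
# Minimal sketch of an engagement-maximizing feed ranker.
# The weights and prediction numbers are hypothetical assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float    # model estimates, illustrative values
    predicted_shares: float
    predicted_comments: float

def engagement_score(p: Post) -> float:
    # Hypothetical weights: shares and comments valued above clicks,
    # since they drive further distribution.
    return 1.0 * p.predicted_clicks + 3.0 * p.predicted_shares + 2.0 * p.predicted_comments

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Careful policy analysis", 0.10, 0.01, 0.02),
    Post("Outrage-bait falsehood", 0.30, 0.20, 0.25),
])
print([p.text for p in feed])  # the outrage-bait post ranks first
```

Nothing in the ranker inspects truthfulness; accuracy is simply absent from the objective, which is the vulnerability bad actors exploit.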
Deepfakes and synthetic media: Advances in AI-generated video and audio are reducing the cost and improving the quality of fabricated media. A convincing video of a political figure saying something they never said — increasingly difficult to distinguish from authentic footage — was technically and financially prohibitive in 2015; it is broadly accessible in 2024.
The Firehose Strategy
Traditional propaganda aimed to establish specific false beliefs. Russian information operations documented by the Oxford Internet Institute, the Senate Intelligence Committee, and others have employed a different strategy: not establishing specific alternative beliefs, but creating generalized epistemic uncertainty.
By flooding the information environment with contradictory claims, denying obvious facts, and spreading absurd conspiracy theories alongside plausible ones, this strategy leaves audiences unable to distinguish true from false — not because they believe the specific falsehoods, but because they lose confidence in their ability to know anything. Political exhaustion, disengagement, and cynicism are outcomes, even when specific propaganda claims are rejected.
Recognizing and Resisting Propaganda
Intelligence does not protect against propaganda — it may amplify motivated reasoning. The most effective resistance is metacognitive and habitual rather than relying on case-by-case critical evaluation.
Slow down on emotional content: Strong emotional responses — especially anger, fear, and outrage — are often indicators that persuasive pressure is being applied. Before sharing or acting, identify the emotion and ask what it is designed to do.
Ask about the source: Who made this? Who benefits? What is their track record? This is not reflexive cynicism — it is appropriate calibration. Different sources have different incentives and reliability records.
Check for what is missing: Propaganda typically presents only one side. What information would complicate or contradict the message?
The inoculation approach: Research by Sander van der Linden and colleagues shows that exposing people to examples of manipulation techniques, with explanation, before they encounter full-strength propaganda increases resistance. Learning that emotional appeals bypass reasoning makes you more likely to pause when you feel emotional about a claim.
Distinguish the information from the messenger: Tribal epistemology processes information based on source identity. Deliberately evaluating claims from sources you distrust as carefully as claims from sources you trust reduces this vulnerability — though it requires deliberate effort.
For related concepts, see why conspiracy theories spread, how social media algorithms work, and confirmation bias explained.
References
- Bernays, E. L. (1928). Propaganda. Horace Liveright.
- Goebbels, J. (1948). The Goebbels Diaries, 1942–1943 (Lochner, L. P., Ed. & Trans.). Doubleday.
- Ellul, J. (1965). Propaganda: The Formation of Men's Attitudes. Vintage Books.
- Chomsky, N., & Herman, E. S. (1988). Manufacturing Consent: The Political Economy of the Mass Media. Pantheon Books.
- Tajfel, H., & Turner, J. C. (1979). An Integrative Theory of Intergroup Conflict. In Austin, W. G., & Worchel, S. (Eds.), The Social Psychology of Intergroup Relations (pp. 33–47). Brooks/Cole.
- Hasher, L., Goldstein, D., & Toppino, T. (1977). Frequency and the Conference of Referential Validity. Journal of Verbal Learning and Verbal Behavior, 16(1), 107–112. https://doi.org/10.1016/S0022-5371(77)80012-1
- Kahan, D. M., et al. (2012). The Polarizing Impact of Science Literacy and Numeracy on Perceived Climate Change Risks. Nature Climate Change, 2(10), 732–735. https://doi.org/10.1038/nclimate1547
- van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the Public against Misinformation about Climate Change. Global Challenges, 1(2). https://doi.org/10.1002/gch2.201600008
- US Senate Select Committee on Intelligence. (2019). Report on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election (Vol. 2). US Government Publishing Office.
- Lippmann, W. (1922). Public Opinion. Harcourt, Brace.
Frequently Asked Questions
What is propaganda and how does it differ from persuasion?
Propaganda is systematic communication designed to shape attitudes, beliefs, and behaviors toward predetermined ends, typically bypassing rational deliberation. It differs from legitimate persuasion in several ways: propaganda exploits psychological vulnerabilities (fear, tribalism, motivated reasoning) rather than engaging reason; it is typically one-sided and suppresses contrary evidence; it aims to create involuntary attitude change rather than informed consent; and it often conceals its source or manipulative intent. The line between sophisticated advertising, political communication, and propaganda is blurry, but the key diagnostic features are concealment of intent and exploitation of psychological vulnerabilities.
What are the main psychological techniques of propaganda?
Key techniques: Fear appeals (exaggerate threat to trigger protective behavior — effective because fear bypasses analytical thinking); Us vs. Them framing (activates in-group/out-group psychology, dehumanizing enemies); Repetition (the 'illusory truth effect' — false claims become more believable with repetition, even when audiences initially knew they were false); Emotional appeals over facts (emotional arousal reduces analytical processing); Appeal to authority or consensus ('everyone knows...', false expert citations); Transfer (associating a cause with respected symbols — flag, religion); and the Big Lie technique (a false claim so audacious that people can't believe someone would fabricate it so boldly).
How was propaganda used historically?
WWI was the first modern industrial propaganda war: the British War Propaganda Bureau (Wellington House) and the US Committee on Public Information (Creel Committee) used posters, films, pamphlets, and staged atrocity stories to maintain wartime morale and recruit allies. Goebbels' Nazi Ministry of Public Enlightenment and Propaganda (1933-45) became the most studied example: total media control, repetitive messaging, spectacle (Nuremberg rallies), and systematic dehumanization of Jews. The Cold War produced massive propaganda apparatuses on both sides — Radio Free Europe, Voice of America, Soviet active measures. Soviet 'active measures' — disinformation campaigns designed to exacerbate Western divisions — are a direct precursor to contemporary Russian information operations.
Why don't intelligent people recognize propaganda?
Intelligence does not protect against propaganda — it may make some forms worse. 'Motivated reasoning' (using analytical ability to rationalize predetermined conclusions) means smart people can generate better justifications for beliefs they're already motivated to hold. Effective propaganda works on pre-existing values and identities, not against them. The illusory truth effect works even when people know a claim was originally labeled false. Tribal epistemology — evaluating information based on source identity (us vs. them) rather than content — is not correlated with intelligence. The most effective defense is metacognitive: actively asking 'why am I being shown this now, what does the source want from me, what is the emotional charge designed to do?'
How does modern social media enable new forms of propaganda?
Social media enables unprecedented precision targeting (Cambridge Analytica psychographic profiling), speed of spread, apparent grassroots authenticity (astroturfing, troll farms mimicking organic opinion), and algorithmic amplification of emotionally engaging (often outrage-based) content. The Internet Research Agency (Russia's St. Petersburg troll farm) created fake American activist groups that organized real physical rallies, demonstrating how social media enables foreign actors to activate domestic divisions at low cost. Deepfakes and AI-generated content are rapidly reducing the cost and increasing the quality of fabricated media.
What is the difference between propaganda and journalism?
Professional journalism ideally discloses its sources, presents contrary evidence, distinguishes fact from opinion, corrects errors, and operates with editorial independence from the subjects it covers. Propaganda conceals its source, suppresses contrary evidence, treats opinion as fact, exploits errors without correcting them, and serves the interests of whoever controls it. In practice, the distinction is a matter of degree — commercial pressures, political leanings, source relationships, and access journalism all push professional journalism toward propaganda features. Media literacy requires evaluating each piece of content on its specific features rather than trusting institutional labels.
How can you recognize and resist propaganda?
Key practices: Slow down when content produces strong emotional reactions — emotional arousal is a mechanism of manipulation, not evidence of truth; check the source (who made this, what is their interest, who funds them?); look for what is absent — propaganda typically tells only one side; look for the specific techniques (dehumanization, false urgency, in-group signaling); seek primary sources rather than relying on summaries; cross-check across outlets with different editorial perspectives. Inoculation research shows that learning about specific manipulation techniques before encountering them builds resistance — understanding that fear appeals bypass analytical thinking makes you more likely to pause when a piece of content makes you afraid.