In 1951, Solomon Asch ran an experiment so simple and so disturbing that its results have been replicated, extended, and debated for more than seventy years. He gathered groups of eight people in a room, showed them a target line, and asked them to identify which of three comparison lines matched it in length. The correct answer was obvious, the kind of perceptual judgment you could make without hesitation. But seven of the eight participants were actors. And when those seven actors unanimously announced the wrong answer, the real participant agreed with them approximately a third of the time.
The participants who conformed were not confused. They were not poorly educated or cognitively impaired. Many of them, when interviewed afterward, said they had known the group was wrong. They had looked at the lines, seen that the answer was clear, and gone along anyway, to avoid the social discomfort of being the one person in the room who disagreed. A few reported something more unsettling: they had genuinely begun to see the lines as the group described them. Under sustained social pressure, perception itself had bent.
This result captures something that most of us spend significant effort denying: that we are not, in practice, the independent rational agents we imagine ourselves to be. Our beliefs, judgments, and behaviours are continuously shaped by the implicit social norms of the groups we belong to, often without our awareness and often in direct conflict with what we would privately conclude if we could somehow reason in complete social isolation. Understanding this dynamic is not an argument for cynicism about human nature. It is a prerequisite for building institutions and personal habits that protect good thinking from social contamination.
"That we have found the tendency to conformity in our society so strong that reasonably intelligent and well-meaning young people are willing to call white black is a matter of concern. It raises questions about our ways of education and about the values that guide our conduct." -- Solomon Asch, 1955
Key Definitions
Normative social influence: Conformity driven by the desire to avoid social rejection, disapproval, or exclusion, which persists even when the individual privately believes the group is wrong.
Informational social influence: Conformity driven by genuine belief updating, using the group's apparent consensus as evidence about the correct answer in ambiguous situations.
| Conformity Type | Driver | Example | Reversibility |
|---|---|---|---|
| Informational conformity | Uncertainty — others seem to know better | Adopting a new social norm when unsure what is correct | Reverses when better information arrives |
| Normative conformity | Social pressure — avoiding rejection or punishment | Agreeing with a group opinion you privately doubt | May persist even when disagreement is possible |
| Internalization | Full belief change — the group's view becomes your own | Genuinely adopting the group's values over time | Difficult to reverse; becomes part of identity |
Groupthink: Irving Janis's 1972 term for the deterioration of rational decision-making in cohesive groups striving for unanimity, producing systematic failures to critically evaluate alternatives.
Pluralistic ignorance: The situation in which most members of a group privately disagree with a norm but publicly comply because they incorrectly assume that others genuinely accept it.
Obedience to authority: The specific form of conformity studied by Stanley Milgram, in which individuals comply with the instructions of a perceived authority figure, overriding their own ethical judgments.
Asch's Experiments: The Architecture of Social Pressure
Solomon Asch's 1951 and 1955 studies were initially designed to refute what he saw as excessively pessimistic conclusions from earlier conformity research by Muzafer Sherif. Sherif had used an ambiguous visual phenomenon, the autokinetic effect, in which a stationary light appears to move in a dark room, to study how people converge on shared judgments under uncertainty. Asch argued that this was too easy: of course people defer to others when they are genuinely uncertain. He wanted to test conformity where the correct answer was unambiguous.
His finding that 37 percent of all responses, across all participants, went along with the obviously incorrect majority was therefore a genuine surprise even to him. The range across individuals was large: about 25 percent of participants never conformed, while about 5 percent conformed on every trial. Most participants fell somewhere between these poles, resisting sometimes and yielding sometimes, with resistance generally stronger in earlier trials and weakening as the pressure continued.
Asch ran numerous variations that illuminate the mechanisms of conformity. When the size of the unanimous majority was varied, conformity increased sharply from one to three actors and then plateaued: a majority of three produced nearly as much conformity as a majority of sixteen. The critical variable was not size but unanimity. When a single confederate gave the correct answer, conformity dropped by roughly three-quarters even if all other confederates continued to give the wrong answer. Social unanimity, not numerical dominance, is what creates maximal conformity pressure.
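The qualitative pattern from these variations can be captured in a toy model. The functional form and constants below are this sketch's own assumptions, not quantities fitted to Asch's data; the model only reproduces the shape he reported: pressure saturates quickly with majority size, and a single dissenter collapses it.

```python
import math

def conformity_pressure(majority_size: int, unanimous: bool) -> float:
    """Return an illustrative 0-1 'pressure' score for a majority of the given size."""
    if majority_size <= 0:
        return 0.0
    # Saturating growth: rises steeply up to about three confederates,
    # then plateaus -- a majority of sixteen adds almost nothing.
    base = 1.0 - math.exp(-float(majority_size))
    # A single visible dissenter cuts pressure to roughly a quarter,
    # echoing the drop Asch observed when one confederate answered correctly.
    return base if unanimous else base * 0.25

for n in (1, 3, 8, 16):
    print(n, round(conformity_pressure(n, True), 3))
```

The key design choice, mirroring Asch's finding, is that unanimity enters as a multiplicative switch rather than as one more voice in the count: breaking consensus matters far more than shrinking the majority.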
Post-experimental interviews revealed two distinct psychological routes to conformity. Most participants who conformed reported knowing they were wrong but experiencing intense discomfort at being visibly deviant. A minority reported genuine perceptual distortion. Both routes produce the same behaviour, but they involve different underlying processes. The first is normative influence: yielding to social pressure while maintaining private disagreement. The second is informational influence: updating beliefs based on others' apparent consensus.
Milgram and the Limits of Disobedience
If Asch's experiments revealed how far ordinary people will defer on matters of perception and judgment, Stanley Milgram's obedience studies, conducted at Yale University in the early 1960s, revealed something still more troubling: how far they will defer on matters of ethics.
Milgram's participants were told they were taking part in a study of learning. An actor posing as a fellow participant was strapped into a chair, and the real participant was instructed to administer increasingly powerful electric shocks whenever the actor answered a quiz question incorrectly. The shocks were not real, but participants did not know this. The shock generator included labels ranging from 'Slight Shock' to 'Danger: Severe Shock' and finally 'XXX'. An actor playing an authority figure, a stern experimenter in a lab coat, instructed participants who expressed reluctance to continue.
In Milgram's baseline conditions, approximately 65 percent of participants administered the maximum apparent shock despite the actor's screams and, eventually, ominous silence. This result was consistent across multiple replications in different settings, with different participant populations, and different administrators. When the experiment was replicated at a commercial office building rather than a prestigious Yale laboratory, compliance dropped somewhat but remained high. When two confederates posing as fellow participants refused to continue, compliance dropped dramatically.
Milgram drew several conclusions from these results. The degree of physical proximity to the victim affected compliance: participants who could see the actor showed somewhat lower compliance than those who could only hear screams, who showed lower compliance than those who received no feedback at all. The perceived legitimacy and proximity of the authority figure also mattered. But the overarching finding, reproduced across conditions, was that the majority of ordinary people, when placed in an institutional structure with a clear authority hierarchy and incremental escalation, will participate in acts they privately believe are harmful.
Milgram's interpretive framework, which he called the 'agentic state', proposed that people shift into a mode of psychological operation in which they understand themselves as instruments of another's will rather than autonomous agents. In this state, responsibility feels as though it has been transferred upward to the authority, and individual ethical judgment is suspended.
Groupthink: When Cohesion Becomes Catastrophic
In 1972, social psychologist Irving Janis published 'Victims of Groupthink', his analysis of catastrophic foreign policy decisions including the 1961 Bay of Pigs invasion and the 1941 failure to anticipate the attack on Pearl Harbor. Janis argued that these failures shared a common structure: highly cohesive, intelligent groups making systematically irrational decisions because their drive for consensus overrode their capacity for critical analysis.
Janis identified eight symptoms of groupthink:
- Illusion of invulnerability: the group indulges in excessive optimism and risk-taking.
- Collective rationalisation: members discount warnings that challenge group assumptions.
- Belief in the group's inherent morality: members ignore the ethical dimensions of their decisions.
- Stereotyped views of out-groups: opponents are underestimated.
- Direct pressure on dissenters: members who question the consensus are challenged or marginalised.
- Self-censorship: members withhold doubts to preserve apparent unity.
- Illusion of unanimity: silence is read as agreement, so consensus looks stronger than it actually is.
- Self-appointed 'mindguards': certain members actively protect the group from information that might disturb its cohesion.
Janis also identified structural antecedents to groupthink: high group cohesion, insulation from outside input, directive leadership, and time pressure. His prescription was specific: leaders should deliberately withhold their own views in initial discussions, 'devil's advocate' roles should be formally assigned, groups should be periodically subdivided to develop independent analyses, and outside experts should be invited to challenge group conclusions.
Subsequent research has both extended and challenged Janis's framework. Miron-Spektor and colleagues' 2011 study of conformist and attentive-to-detail team members complicated any simple equation of conformity with dysfunction, finding that team innovation depends on the mix of member types. And researchers including Paul 't Hart have argued that Janis overestimated the role of cohesion per se, finding that cohesive groups with appropriate norms for dissent do not show groupthink symptoms.
Pluralistic Ignorance: The Invisible Norm
One of the most practically significant conformity phenomena is also one of the least well-known. Pluralistic ignorance describes situations in which nearly everyone privately disagrees with a norm or belief that is publicly maintained, because each person observes others' compliance and infers their genuine agreement.
The concept was developed by Floyd Allport in the 1930s and subsequently refined by Daniel Katz, Richard Schanck, and more recently by Deborah Prentice and Dale Miller. Prentice and Miller's 1993 study at Princeton is particularly instructive. They surveyed undergraduates about their own comfort with drinking and their estimate of peers' comfort. Students systematically underestimated the prevalence of private discomfort with the drinking culture: the private reality was that most students were somewhat uncomfortable, but the public norm of casual ease with heavy drinking was maintained by each individual's assumption that others were genuinely comfortable.
This creates self-fulfilling social dynamics. Each person performs comfort they do not feel, observes others' performed comfort, concludes that their private discomfort is a personal deficiency, and continues to perform comfort. The norm persists without anyone genuinely holding it. Changing pluralistic ignorance requires making private reality visible, which is why anonymous surveys, confidential voting, and protected dissent mechanisms are among the most effective tools for unlocking groups trapped in false consensus.
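The gap between performed and private attitudes can be made concrete with a small simulation. All parameters here (the 0-10 comfort scale, the distribution means, the group size) are illustrative assumptions, not empirical estimates; the point is only that estimates formed from public behaviour diverge sharply from what an anonymous survey recovers.

```python
import random

random.seed(0)

N = 100
# Private comfort with the norm on a 0-10 scale: most agents are
# privately uncomfortable (clustered around 3).
private_comfort = [random.gauss(3.0, 1.5) for _ in range(N)]

# Public performance: everyone displays high comfort to avoid deviance.
public_display = [8.0 for _ in range(N)]

# Each agent can only observe others' public displays, so the perceived
# group norm is computed from performances, not private states.
perceived_norm = sum(public_display) / N
actual_norm = sum(private_comfort) / N

print(f"perceived comfort (from public behaviour): {perceived_norm:.1f}")
print(f"actual comfort (anonymous survey):         {actual_norm:.1f}")
# The gap between the two numbers is the pluralistic-ignorance effect;
# an anonymous survey makes the private reality visible.
```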
Pluralistic ignorance is particularly consequential in professional settings. Employees who privately believe a strategy is failing keep quiet because they assume colleagues see something they are missing. Students who do not understand a lecture do not ask questions because no one else appears confused. Medical teams do not challenge a senior physician's judgment because no one else appears to question it. These cascades of silence can have catastrophic consequences, as research on aviation accidents, surgical errors, and corporate fraud has extensively documented.
The Neuroscience of Social Pressure
Research using functional neuroimaging has provided a mechanistic picture of what happens in the brain during conformity. A 2009 study by Vasily Klucharev and colleagues found that disagreement with a group consensus activated the rostral cingulate zone, a region associated with conflict monitoring and error detection, and modulated activity in the ventral striatum, a region involved in reward learning. The magnitude of this neural 'conflict signal' predicted subsequent behavioural adjustment toward the group.
In other words, social disagreement is encoded by the brain as a kind of error signal, and conformity is processed as a reward or error-correction. This finding suggests that the discomfort of social deviance is not merely culturally constructed but has a direct neurological basis, consistent with evolutionary models in which social exclusion was genuinely life-threatening and conformity was therefore strongly selected for.
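This error-correction framing can be sketched as a simple learning rule. The update below is an illustrative delta-rule in the spirit of the reinforcement-learning interpretation, not the authors' actual model; the function name and learning rate are this sketch's own.

```python
def conform_step(opinion: float, group_opinion: float, alpha: float = 0.3) -> float:
    """Shift an opinion toward the group by a fraction of the mismatch.

    The mismatch (group_opinion - opinion) plays the role of the neural
    'conflict signal'; alpha controls how strongly it drives adjustment.
    """
    prediction_error = group_opinion - opinion
    return opinion + alpha * prediction_error

# Repeated exposure to a discrepant group gradually pulls the opinion in.
opinion = 2.0
for _ in range(10):
    opinion = conform_step(opinion, group_opinion=8.0)
print(round(opinion, 2))  # → 7.83, close to the group position
```

When the mismatch is zero, no adjustment occurs, which matches the intuition that agreement generates no conflict signal.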
Research by Gregory Berns and colleagues in 2005 found that when participants changed their answers to match the group in a mental rotation task, activity in the visual and parietal cortex actually changed, suggesting that social pressure can produce genuine perceptual modification rather than mere verbal compliance. This is consistent with Asch's own participants who reported genuinely seeing the lines differently after sustained group pressure.
How to Resist Conformity: What Actually Works
The most reliable protection against conformity pressure, according to both Asch's own findings and subsequent research, is the presence of a social ally. Even a single person who dissents, regardless of whether they give the correct answer, dramatically reduces conformity. This suggests that organisations concerned about groupthink should focus less on individual courage and more on structural guarantees that dissent will be heard.
Commitment effects also matter. Research by William Crano and colleagues has found that publicly or privately committing to a position before group discussion reduces subsequent conformity to group pressure. Pre-discussion commitment mechanisms, including written position statements before meetings, can reduce the anchoring effect of early-speaker positions in group decisions.
Awareness of conformity mechanisms provides modest protection. Studies have found that participants briefed on Asch's findings show slightly reduced conformity in similar paradigms, though the effect is considerably smaller than people expect. The subjective experience of independence from social pressure is not a reliable indicator of actual independence.
Structural interventions, including anonymous voting, devil's advocate roles, pre-mortem exercises, and explicit protection of dissenting voices, show more robust effects than individual-level interventions alone. The architecture of decision processes matters more than the individual virtue of participants.
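The advantage of anonymous over sequential public voting can be illustrated with a toy simulation. The 70 percent private lean toward one option, the 2-vote conformity threshold, and the random seed below are all illustrative assumptions; the sketch shows how a visible early lead can cascade a sequential poll into near unanimity while an anonymous poll preserves the underlying split.

```python
import random

random.seed(1)
N = 20

def private_vote() -> str:
    # Each member privately leans toward option A with probability 0.7.
    return "A" if random.random() < 0.7 else "B"

def sequential_public(n: int) -> list[str]:
    """Members vote aloud in turn, deferring to any visible 2-vote lead."""
    votes: list[str] = []
    for _ in range(n):
        lead = votes.count("A") - votes.count("B")
        if lead >= 2:
            votes.append("A")   # conform to the visible majority
        elif lead <= -2:
            votes.append("B")
        else:
            votes.append(private_vote())
    return votes

def simultaneous_anonymous(n: int) -> list[str]:
    """Everyone reports their private lean with no view of others' votes."""
    return [private_vote() for _ in range(n)]

seq = sequential_public(N)
anon = simultaneous_anonymous(N)
print("sequential:", seq.count("A"), "A /", seq.count("B"), "B")
print("anonymous: ", anon.count("A"), "A /", anon.count("B"), "B")
# Once a 2-vote lead appears, every later sequential vote conforms;
# the anonymous poll instead reflects the mixed private distribution.
```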
Conformity in Organisational Life and Professional Settings
The practical stakes of conformity research extend well beyond laboratory experiments with line-matching tasks. In professional settings, the costs of groupthink and normative silence can be catastrophic. Research on aviation accidents and crew resource management has repeatedly found that many crashes involve a junior crew member who noticed a problem but did not speak up, or spoke up insufficiently forcefully, because of hierarchical deference. The 1977 Tenerife airport disaster, the deadliest aviation accident in history, involved flight-deck crew members who raised only mild concerns about a takeoff clearance, concerns that the captain's authority and time pressure combined to dismiss.
Medical errors follow a similar pattern. Research by Lucian Leape and colleagues at Harvard Medical School found that nurses who believed a medication order was wrong often failed to challenge it directly. Anesthesiologists and surgeons studying communication failures in the operating theatre have found that hierarchical norms reliably suppress the lateral communication that would catch errors before they become harmful events.
The UK's National Health Service and the US Veterans Affairs system have both implemented structured communication protocols specifically designed to overcome hierarchy-induced conformity in clinical settings. The SBAR (Situation, Background, Assessment, Recommendation) protocol, which gives junior staff a scripted format for raising concerns with senior colleagues, is one such intervention. Studies of its implementation have found reduced rates of adverse events in clinical units that use it consistently compared to those that do not.
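As an illustration only, the four SBAR fields can be captured as a structured record. The class name, field layout, and example content below are this sketch's own invention, not part of any NHS or VA implementation; the sketch only shows how the protocol turns a diffuse concern into a complete, explicit message.

```python
from dataclasses import dataclass

@dataclass
class SBARReport:
    """A concern expressed in the SBAR format's four parts."""
    situation: str        # what is happening right now
    background: str       # relevant clinical context
    assessment: str       # what the speaker thinks the problem is
    recommendation: str   # what the speaker wants to happen

    def as_handoff(self) -> str:
        return (f"Situation: {self.situation}\n"
                f"Background: {self.background}\n"
                f"Assessment: {self.assessment}\n"
                f"Recommendation: {self.recommendation}")

report = SBARReport(
    situation="Patient's blood pressure has dropped sharply",
    background="Post-operative, started a new medication an hour ago",
    assessment="Possible adverse reaction to the new medication",
    recommendation="Hold the next dose and review the order now",
)
print(report.as_handoff())
```

The design point is that the script obliges the speaker to state an assessment and a recommendation, the two elements that hierarchy-induced deference most often suppresses.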
The financial industry provides a different and grimly instructive example. Investigations into the 2008 financial crisis have consistently found that risk managers who identified problems with mortgage-backed securities and complex derivative instruments were routinely overruled or marginalised by traders and executives whose status and authority generated normative pressure toward consensus. Janis's groupthink model, developed from political case studies in the 1970s, describes with uncomfortable precision the dynamics of financial institutions that collectively convinced themselves that unprecedented asset price appreciation was sustainable.
These examples suggest that the most important application of conformity research is organisational design: creating structures in which dissent is not merely permitted but actively solicited, in which the costs of speaking up are minimised, and in which the decision-making process itself incorporates structural resistance to the pressure toward premature consensus. Individual moral courage is real and valuable, but it is an insufficient substitute for institutional architecture that makes speaking up safe, expected, and normal. The research is unambiguous on this point: the best time to address conformity pressure is before the decision is being made, not in the moment when social unanimity is already in place and the psychological costs of deviance are already at their peak.
References
- Asch, S. E. (1951). Effects of group pressure upon the modification and distortion of judgments. In H. Guetzkow (Ed.), 'Groups, Leadership and Men'. Carnegie Press.
- Asch, S. E. (1955). Opinions and social pressure. 'Scientific American', 193(5), 31-35.
- Milgram, S. (1963). Behavioral study of obedience. 'Journal of Abnormal and Social Psychology', 67(4), 371-378.
- Milgram, S. (1974). 'Obedience to Authority: An Experimental View'. Harper & Row.
- Janis, I. L. (1972). 'Victims of Groupthink'. Houghton Mifflin.
- Deutsch, M., & Gerard, H. B. (1955). A study of normative and informational social influences upon individual judgment. 'Journal of Abnormal and Social Psychology', 51(3), 629-636.
- Prentice, D. A., & Miller, D. T. (1993). Pluralistic ignorance and alcohol use on campus. 'Journal of Personality and Social Psychology', 64(2), 243-256.
- Klucharev, V., Hytonen, K., Rijpkema, M., Smidts, A., & Fernandez, G. (2009). Reinforcement learning signal predicts social conformity. 'Neuron', 61(1), 140-151.
- Berns, G. S., Chappelow, J., Zink, C. F., Pagnoni, G., Martin-Skurski, M. E., & Richards, J. (2005). Neurobiological correlates of social conformity and independence during mental rotation. 'Biological Psychiatry', 58(3), 245-253.
- 't Hart, P. (1991). Irving L. Janis' victims of groupthink. 'Political Psychology', 12(2), 247-278.
- Miron-Spektor, E., Erez, M., & Naveh, E. (2011). The effect of conformist and attentive-to-detail members on team innovation: Reconciling the innovation paradox. 'Academy of Management Journal', 54(4), 740-760.
- Cialdini, R. B., & Goldstein, N. J. (2004). Social influence: Compliance and conformity. 'Annual Review of Psychology', 55, 591-621.
Frequently Asked Questions
What did Solomon Asch's conformity experiments actually find?
In his 1951 experiments, Solomon Asch showed participants a target line alongside three comparison lines of clearly different lengths, then had them state which comparison line matched the target. When actors posing as other participants unanimously gave an obviously wrong answer, real participants agreed with the wrong answer approximately 37 percent of the time. Crucially, interviews afterward revealed that many participants who conformed knew the group was wrong but went along to avoid social friction. Others experienced genuine perceptual distortion, actually seeing the lines as the group claimed. When a single dissenting confederate was introduced, the conformity rate dropped dramatically, demonstrating that social unanimity, not majority size per se, is the critical variable.
What is the difference between normative and informational social influence?
Morton Deutsch and Harold Gerard proposed this distinction in 1955. Normative influence involves conforming to avoid social rejection, disapproval, or punishment regardless of what you privately believe. Informational influence involves conforming because you genuinely update your beliefs based on what others believe, treating their consensus as evidence about reality. Both operate simultaneously in most social situations, making them difficult to disentangle. High-ambiguity situations, where the correct answer is genuinely unclear, increase informational influence. High-stakes social situations with clear correct answers, like Asch's line tasks, tend to produce primarily normative conformity, meaning people know they are wrong but comply anyway.
What is groupthink and how can organisations avoid it?
Psychologist Irving Janis coined the term in 1972 to describe the deterioration of mental efficiency, reality testing, and moral judgment in group members striving for unanimity. He identified it in post-mortems of catastrophic policy failures including the Bay of Pigs invasion and the Pearl Harbor response failure. Symptoms include the illusion of invulnerability, collective rationalisation, stereotyped views of out-groups, self-censorship, and direct pressure on dissenters. Janis recommended specific structural countermeasures including the appointment of a 'devil's advocate' role, leader impartiality during discussions, subdivision of groups for independent analysis, and 'second-chance' meetings at which tentative decisions are reconsidered before being finalised.
What is pluralistic ignorance?
Pluralistic ignorance occurs when most members of a group privately reject a norm or belief but publicly comply with it because they incorrectly assume that others genuinely hold it. Each person observes others' public compliance and infers private agreement, when in fact nearly everyone is privately dissenting while publicly conforming. Classic examples include students who privately find a lecture confusing but do not ask questions because no one else appears confused, or employees who privately disagree with a decision but assume their colleagues support it. Research by Deborah Prentice and Dale Miller at Princeton found that pluralistic ignorance about alcohol norms on campuses contributed to excessive drinking because students overestimated peers' comfort with drinking.
How can individuals resist conformity pressure?
Asch's own research provides the clearest answer: the presence of a single ally dramatically reduces conformity. Having a dissenting partner appears to free individuals to act on their private beliefs. Beyond social support, research suggests that commitment to a position before group discussion, explicit awareness of conformity mechanisms, and practice in expressing disagreement respectfully all reduce conformity under pressure. However, research also shows that the desire to resist conformity is not sufficient on its own: people consistently underestimate how strongly social pressure affects them, believing they would behave more independently than they actually do. Structural approaches, such as anonymous voting, written pre-discussion positions, and protected dissent roles, are often more reliable than individual resolve.