Communication Breakdowns: Real-World Examples of How Miscommunication Caused Disasters, Destroyed Spacecraft, and Cost Lives
On January 28, 1986, the Space Shuttle Challenger broke apart 73 seconds after launch, killing all seven crew members. The technical cause was a failure of the O-ring seals in the right solid rocket booster, which allowed hot gases to escape and breach the adjacent external fuel tank. But the deeper cause--the cause that has been studied in organizational behavior, engineering ethics, and communication courses for decades--was a communication breakdown between the engineers who knew the launch was dangerous and the managers who decided to proceed.
The Challenger disaster is the most studied communication failure in modern history because it demonstrates, with tragic clarity, how organizational structures, power dynamics, cultural norms, and communication practices can prevent critical information from reaching the people who need it. The engineers at Morton Thiokol, the contractor that manufactured the solid rocket boosters, knew that the O-ring seals performed poorly at low temperatures. They recommended against launching in the cold conditions forecast for January 28. Their recommendation was overridden through a decision-making process in which communication failed at multiple levels.
Communication failures are not limited to space shuttles. They occur in hospitals, cockpits, boardrooms, construction sites, and ordinary workplaces every day. When communication breaks down, the consequences range from minor misunderstandings to catastrophic disasters--and the patterns that produce these failures are remarkably consistent across domains.
The Challenger Disaster: When Engineers Couldn't Be Heard
What Communication Failure Caused the Challenger Disaster?
The communication failure that led to the Challenger disaster operated at multiple levels:
Technical information was not clearly communicated. The Morton Thiokol engineers had data showing that O-ring erosion increased at lower temperatures. However, their presentation of this data to NASA managers was incomplete and confusing. The key chart they presented showed O-ring damage at various launch temperatures but included only launches where damage occurred, not launches where no damage occurred. Including all launches would have shown a much clearer temperature-damage relationship, as the short data sketch after this list illustrates.
Engineers' concerns were not clearly escalated. When Thiokol engineers recommended against launch, NASA managers pushed back. The managers asked Thiokol to "reconsider" their recommendation--a request that, in the context of the organizational relationship between NASA (the customer) and Thiokol (the contractor), carried implicit pressure to change the answer. The engineers maintained their objection, but when Thiokol's management asked for a recess to discuss internally, the engineering recommendation was overridden by management within Thiokol itself.
Organizational hierarchy blocked critical information. The decision-making structure placed the burden of proof on those arguing against launch rather than on those arguing for it. Instead of requiring proof that the launch was safe, the system required proof that it was unsafe--a much higher evidentiary bar. The engineers knew the launch was risky but could not prove with absolute certainty that it would fail. This asymmetry meant that the default action (launch) prevailed over the cautionary recommendation (delay).
Cultural norms suppressed dissent. NASA's organizational culture in the 1980s, shaped by years of successful launches and intense schedule pressure, had developed a norm of treating launch delays as failures. Managers who recommended delays faced career consequences. This cultural pressure created an environment where raising safety concerns was professionally risky, systematically biasing communication toward optimism and away from caution.
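The effect of the incomplete chart can be shown with a minimal data sketch. The flight records below are illustrative placeholders, not the actual pre-Challenger launch history; the point is only that filtering out the zero-damage flights hides the temperature pattern that the full record makes obvious.

```python
# Illustrative sketch only: the flight records below are hypothetical placeholders,
# not the actual pre-Challenger launch history.
# Each tuple is (launch temperature in degrees F, number of O-ring incidents).
flights = [
    (53, 3), (57, 1), (58, 1), (63, 1), (70, 1), (70, 1), (75, 2),   # damage observed
    (66, 0), (67, 0), (67, 0), (68, 0), (69, 0), (70, 0), (72, 0),   # no damage observed
    (73, 0), (76, 0), (76, 0), (78, 0), (79, 0), (81, 0),
]

def damage_rate(records):
    """Fraction of flights in `records` that showed any O-ring damage."""
    return sum(1 for _, damage in records if damage > 0) / len(records)

# The chart shown before launch: only the flights where damage occurred.
damaged_only = [(t, d) for t, d in flights if d > 0]
# The full record: every flight, split by launch temperature.
cold = [r for r in flights if r[0] < 65]
warm = [r for r in flights if r[0] >= 65]

# Looking only at damaged flights, temperature seems unrelated: damage appears
# across the whole range of launch temperatures.
print("temperatures of damaged flights:", sorted(t for t, _ in damaged_only))
# Looking at all flights, the pattern is stark.
print(f"damage rate below 65 F: {damage_rate(cold):.2f}")   # 1.00 in this sketch
print(f"damage rate at 65 F+:   {damage_rate(warm):.2f}")   # ~0.19 in this sketch
```

The filtered view and the complete record tell two very different stories about the same launches.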
The Rogers Commission, which investigated the disaster, concluded that the decision to launch was "flawed" and that "failures in communication" resulted in a decision that did not adequately consider the engineering data. Sociologist Diane Vaughan later termed the phenomenon the "normalization of deviance"--the process by which an organization gradually comes to accept a previously unacceptable level of risk through incremental steps, each of which seems like a small deviation from the norm but which collectively produce dangerous complacency.
The Mars Climate Orbiter: A $125 Million Unit Conversion Error
How the Mars Climate Orbiter Failed Due to Communication
On September 23, 1999, NASA's Mars Climate Orbiter was lost as it attempted to enter orbit around Mars. The spacecraft, which had cost $125 million, approached Mars at an altitude of 57 kilometers instead of the planned 226 kilometers and was destroyed by atmospheric friction.
The cause was a unit conversion error. The spacecraft's thruster performance data was produced by Lockheed Martin in English units (pound-force seconds), while NASA's Jet Propulsion Laboratory expected the data in metric units (newton-seconds). The mismatch meant that navigation calculations were systematically wrong throughout the nine-month journey to Mars, with the error accumulating over time.
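The size of the mismatch is easy to see in a short worked sketch. One pound-force second is about 4.45 newton-seconds, so an impulse reported in pound-force seconds but read as newton-seconds is understated by roughly a factor of 4.45. The impulse value below is a placeholder for illustration, not an actual Mars Climate Orbiter thruster figure.

```python
# Illustrative sketch of the unit mismatch; the impulse value is a placeholder,
# not an actual Mars Climate Orbiter thruster figure.
LBF_S_TO_N_S = 4.44822  # 1 pound-force second expressed in newton-seconds

reported_value = 100.0                               # impulse produced in pound-force seconds
true_impulse_n_s = reported_value * LBF_S_TO_N_S     # what that number actually represents
misread_impulse_n_s = reported_value                 # the same number read as newton-seconds

print(f"true impulse:    {true_impulse_n_s:.1f} N*s")
print(f"misread impulse: {misread_impulse_n_s:.1f} N*s")
print(f"each firing was modeled {true_impulse_n_s / misread_impulse_n_s:.2f}x weaker than it really was")
```

Repeated over the many small thruster firings of the nine-month cruise, an error of that size compounds into exactly the kind of trajectory drift described above.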
The communication failure was not that one team used English units and another used metric. The failure was that the assumption of shared standards was never verified. The interface between the two organizations--the point where data produced by one was consumed by the other--lacked explicit specification of units. Each team assumed the other was using the same unit system. Nobody checked.
This type of communication failure--failure to verify assumptions about shared understanding--is among the most common in technical and organizational settings. It occurs whenever two parties believe they are talking about the same thing when they are actually talking about different things, and neither party has a mechanism for detecting the discrepancy.
The Mars Climate Orbiter failure led to significant changes in NASA's communication protocols, including mandatory unit specification in all data interfaces, independent verification of data formats between organizations, and formal interface control documents that explicitly specify every aspect of data exchange between teams.
Healthcare Communication Failures: The 80 Percent Problem
What Healthcare Errors Stem from Poor Communication?
Healthcare communication failures are pervasive and deadly. The Joint Commission, the organization that accredits American hospitals, has consistently found that communication failures are the root cause of approximately 80 percent of serious medical errors. These failures include:
Handoff failures. When care of a patient is transferred from one provider to another--during shift changes, transfers between units, or transitions between hospital and home--critical information is frequently lost or distorted. A study by Arora and colleagues found that 25 percent of patients experienced at least one communication error during hospital handoffs, and these errors resulted in adverse events in 9 percent of cases.
Wrong-site surgery. Wrong-site surgery--operating on the wrong body part, the wrong patient, or performing the wrong procedure--occurs an estimated 40 times per week in US hospitals, according to the Joint Commission. These errors almost always involve communication failures: the surgical team fails to verify the patient, the site, or the procedure through the communication protocols designed to prevent exactly this type of error.
Medication errors. Miscommunication about medication names, dosages, routes of administration, and patient allergies is a leading cause of preventable harm in healthcare. Similar drug names (Celebrex vs. Celexa, hydroxyzine vs. hydralazine) create confusion that verbal orders and handwritten prescriptions amplify.
Failure to communicate test results. Critical test results that are not communicated to the ordering physician in a timely manner--or are communicated but not acknowledged--have been identified as a significant source of diagnostic delays and patient harm.
| Communication Failure Type | Domain | Example | Consequence |
|---|---|---|---|
| Hierarchy suppressing dissent | Aviation, space | Challenger O-ring warnings | 7 crew deaths |
| Unverified assumptions | Engineering | Mars Orbiter unit mismatch | $125M spacecraft lost |
| Handoff errors | Healthcare | Shift change information loss | Patient harm, deaths |
| Cultural communication barriers | Aviation | Korean Air hierarchy issues | Multiple crashes |
| Ambiguous messaging | Marketing | Pepsi Points Harrier jet ad | Lawsuit, brand damage |
| Jargon barrier | Cross-functional teams | Technical-business misalignment | Project failures, delays |
Korean Air: When Cultural Hierarchy Prevented Safe Communication
Why Did Korean Air Have High Accident Rates?
In the 1990s, Korean Air had a crash rate seventeen times higher than that of United Airlines. Between 1988 and 1998, the airline experienced multiple fatal accidents at a rate that placed it among the most dangerous major airlines in the world.
The investigation into these accidents, detailed in Malcolm Gladwell's Outliers and in subsequent aviation safety research, identified cultural communication patterns as a significant contributing factor. Korean culture places strong emphasis on hierarchical relationships and differential respect for authority, expressed through linguistic markers (formal vs. informal speech levels) and behavioral norms (deference to seniors, indirect communication of disagreement).
In the cockpit, these cultural norms created dangerous communication dynamics:
Co-pilots could not directly challenge captains. In several Korean Air accidents, co-pilots and flight engineers recognized that the captain was making an error but communicated their concerns using indirect, deferential language that the captain did not interpret as urgent warnings. In the 1997 crash of Korean Air Flight 801 in Guam, the first officer and flight engineer made oblique references to poor weather conditions and the captain's approach, but neither directly stated that the approach was unsafe and should be aborted.
Indirect communication obscured urgency. Saying "Captain, the weather radar is showing precipitation along the approach path" communicates differently than "Captain, we need to go around--this approach is not safe." The first is an observation that the captain can acknowledge and ignore. The second is a direct challenge that demands action. Korean cultural norms made the second form of communication--direct, blunt, challenging authority--extremely difficult for subordinates.
The fix was cultural, not technical. Korean Air's transformation in the 2000s addressed the communication problem directly. The airline adopted English as the language of the cockpit, removing the hierarchical speech levels embedded in the Korean language. Crew Resource Management (CRM) training was implemented aggressively, teaching crew members to communicate safety concerns clearly and directly regardless of rank. First officers were trained and empowered to take control of the aircraft if they believed the captain was making a dangerous error.
The transformation was dramatic. Korean Air's safety record improved to match international standards, and the airline became one of the highest-rated carriers in the world. The change demonstrated that cultural communication patterns, while deeply embedded, can be modified through deliberate training and institutional redesign.
The Pepsi Points Harrier Jet: When Marketing Communication Goes Literal
What Was the Pepsi Points Jet Miscommunication?
In 1996, Pepsi ran a promotional campaign offering merchandise in exchange for "Pepsi Points" earned by purchasing Pepsi products. A television commercial featured increasingly valuable items--a t-shirt for 75 points, sunglasses for 175 points, a leather jacket for 1,450 points--culminating in a Harrier jet valued at 7 million points, with the text "HARRIER FIGHTER 7,000,000 PEPSI POINTS" displayed on screen.
John Leonard, a 21-year-old business student, did the math. Pepsi also sold points for 10 cents each. Seven million points at 10 cents each was $700,000. A Harrier jet was worth approximately $23 million. Leonard raised $700,008.50 from investors and submitted an order form requesting the jet.
Pepsi refused. Leonard sued. The case (Leonard v. PepsiCo) went to court, where Pepsi argued that the commercial was obviously a joke--no reasonable person would expect Pepsi to deliver a military fighter jet. The court agreed, ruling that the commercial did not constitute a legally binding offer.
The case illustrates the gap between intended meaning and received meaning in communication. Pepsi intended the Harrier jet as a humorous exaggeration. Leonard (or at least his legal team) interpreted it as a literal offer. The communication failure was not in the words themselves but in the assumption that the intended meaning was self-evident.
How Does Jargon Cause Communication Breakdown?
Jargon--specialized vocabulary used within a particular profession, industry, or group--is a double-edged communication tool. Within a group that shares the same specialized knowledge, jargon increases communication efficiency by packaging complex concepts into single terms. Among software engineers, "refactoring the API to reduce latency" communicates a specific technical action precisely and efficiently.
Between groups that do not share the same specialized knowledge, jargon creates communication barriers:
Creates a false sense of understanding. When a technical team tells a business team that the project needs "more capacity in the middleware layer," the business team may nod in understanding without actually comprehending what the statement means, what it would cost, or why it matters. The jargon creates the appearance of communication--words were exchanged, heads were nodded--without the reality of shared understanding.
Hides lack of clarity behind complexity. Jargon can be used (consciously or unconsciously) to obscure rather than clarify. A consultant who says "We need to leverage our core competencies to drive synergistic value creation across the enterprise" has communicated nothing specific. The jargon functions as a barrier that prevents the audience from recognizing that nothing meaningful has been said.
Excludes non-experts from important conversations. When decision-making conversations are conducted in jargon, people without specialized knowledge are effectively excluded from participation, even when their perspectives (customer insight, ethical considerations, practical constraints) are essential to good decisions.
What Role Does Hierarchy Play in Communication Failure?
Hierarchy is one of the most consistent sources of communication failure across organizations and cultures. The dynamics are predictable:
Information does not flow upward. In hierarchical organizations, subordinates have strong incentives to communicate good news upward and filter bad news. The manager wants to report success to their manager, who wants to report success to their manager, creating a chain of optimistic filtering that systematically distorts the information reaching senior leadership.
Dissent is suppressed. When disagreeing with a superior carries professional risk, employees learn to keep their concerns to themselves. This creates a paradox: the people with the most detailed, ground-level knowledge of problems are the people least likely to communicate that knowledge to the people with the authority to address the problems.
Power dynamics distort communication. When a powerful person speaks, their words carry weight beyond their informational content. A CEO who casually mentions a preference is interpreted as a directive. A manager who says "I wonder if we should consider..." is heard as "do this." This amplification effect means that powerful people often do not realize how strongly their casual communications influence behavior.
How Can Organizations Prevent Communication Breakdown?
Flatten Hierarchies for Critical Communication
The most dangerous communication failures occur when hierarchical norms prevent critical information from reaching decision-makers. Solutions include:
Crew Resource Management (CRM). Originally developed for aviation, CRM training teaches team members to communicate safety-critical information clearly and directly, regardless of rank. CRM has been adopted in healthcare (as TeamSTEPPS), nuclear power, and other safety-critical industries with documented improvements in communication and error reduction.
Psychological safety. Amy Edmondson's research on psychological safety--the belief that one can speak up without risk of punishment or humiliation--has demonstrated that teams with high psychological safety have lower error rates, better learning outcomes, and more effective communication than teams where speaking up is risky.
Verify Understanding
The Mars Climate Orbiter failure and countless healthcare errors demonstrate the danger of assuming shared understanding. Solutions include:
Read-back and closed-loop communication. In aviation and healthcare, critical communications are confirmed through read-back: the receiver repeats the message back to the sender to verify understanding. "Administer 10 milligrams of morphine intravenously" is followed by "Confirming: 10 milligrams of morphine, intravenous?" This simple protocol catches misunderstandings before they produce errors.
Explicit interface specifications. When teams exchange data, the format, units, conventions, and assumptions should be explicitly specified in interface control documents, not left to assumption.
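One lightweight way to make such a specification executable is to carry units alongside the data and validate them at the boundary. The sketch below illustrates the general idea only; the record format, field names, and unit labels are invented for the example and are not NASA's interface control document format.

```python
# Sketch of a data interface that refuses to accept unlabeled or unexpected units.
# The record format, field names, and unit labels are invented for illustration.
from dataclasses import dataclass

CONVERSIONS_TO_NEWTON_SECONDS = {
    "N*s": 1.0,         # newton-seconds: the unit this interface uses internally
    "lbf*s": 4.44822,   # pound-force seconds: accepted, but converted explicitly
}

@dataclass
class ImpulseRecord:
    value: float
    unit: str  # every record must name its unit; a bare number is rejected

def to_newton_seconds(record: ImpulseRecord) -> float:
    """Convert a record to newton-seconds, failing loudly on unknown units."""
    if record.unit not in CONVERSIONS_TO_NEWTON_SECONDS:
        raise ValueError(f"unit {record.unit!r} is not in the interface specification")
    return record.value * CONVERSIONS_TO_NEWTON_SECONDS[record.unit]

# The mismatch that doomed the Mars Climate Orbiter becomes an explicit, checked conversion:
print(to_newton_seconds(ImpulseRecord(value=100.0, unit="lbf*s")))  # 444.822
print(to_newton_seconds(ImpulseRecord(value=100.0, unit="N*s")))    # 100.0
```

The design choice is simply to make the unit part of the message itself rather than part of each team's unstated assumptions.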
Use Plain Language
When communication must reach people with diverse backgrounds and expertise levels, plain language is more reliable than jargon:
Say what you mean. Instead of "We need to optimize our talent acquisition pipeline," say "We need to hire people faster." Instead of "There's a resource contention issue in the production environment," say "Two systems are competing for the same memory and one of them is crashing."
Test comprehension. Ask the receiver to explain their understanding of the message in their own words. If their explanation does not match your intent, the communication has failed regardless of how clear you thought you were.
Build Redundancy in Critical Communications
For communications where the cost of failure is high, redundancy provides protection:
Multiple channels. Communicate critical information through multiple channels (written and verbal, electronic and in-person) to reduce the probability that a single channel failure will produce a communication breakdown.
Multiple recipients. Ensure that critical information reaches multiple people, so that the failure of any single individual to receive or act on the information does not produce a system failure.
Confirmation requirements. Require explicit acknowledgment of critical communications, so that the sender knows the information has been received and understood.
Communication breakdown is not primarily a problem of technology or vocabulary. It is a problem of organizational design, cultural norms, power dynamics, and human psychology. The technology for transmitting information--email, messaging, video conferencing, documentation systems--is more capable than at any point in human history. The human capacity for miscommunication, assumption, and hierarchical filtering remains exactly what it has always been. Closing the gap between communication capability and communication effectiveness is a design challenge that requires deliberate attention to the organizational and cultural conditions in which communication occurs.
References and Further Reading
Vaughan, D. (1996). The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. University of Chicago Press. https://press.uchicago.edu/ucp/books/book/chicago/C/bo22781921.html
Rogers Commission. (1986). Report of the Presidential Commission on the Space Shuttle Challenger Accident. https://history.nasa.gov/rogersrep/genindex.htm
Gladwell, M. (2008). Outliers: The Story of Success. Little, Brown and Company. https://en.wikipedia.org/wiki/Outliers_(book)
Edmondson, A.C. (2018). The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth. Wiley. https://fearlessorganization.com/
The Joint Commission. (2017). "Sentinel Event Data: Root Causes by Event Type." https://www.jointcommission.org/
Arora, V., Johnson, J., Lovinger, D., Humphrey, H.J. & Meltzer, D.O. (2005). "Communication Failures in Patient Sign-Out and Suggestions for Improvement." Quality and Safety in Health Care, 14(6), 401-407. https://doi.org/10.1136/qshc.2005.015107
Salas, E., Wilson, K.A., Burke, C.S. & Wightman, D.C. (2006). "Does Crew Resource Management Training Work?" Human Factors, 48(2), 392-412. https://doi.org/10.1518/001872006777724444
Stephenson, A.G., et al. (1999). Mars Climate Orbiter Mishap Investigation Board Phase I Report. NASA. https://science.nasa.gov/mission/mars-climate-orbiter/
Leonard v. PepsiCo, Inc., 88 F. Supp. 2d 116 (S.D.N.Y. 1999). https://en.wikipedia.org/wiki/Leonard_v._Pepsico,_Inc.
Helmreich, R.L. & Merritt, A.C. (1998). Culture at Work in Aviation and Medicine. Ashgate. https://doi.org/10.4324/9781315258690
Grice, H.P. (1975). "Logic and Conversation." In P. Cole & J.L. Morgan (Eds.), Syntax and Semantics, Volume 3: Speech Acts. Academic Press. https://en.wikipedia.org/wiki/Cooperative_principle
Reason, J. (1990). Human Error. Cambridge University Press. https://doi.org/10.1017/CBO9781139062367