How Ethical Failures Actually Happen

The Myth of Bad Apples

Enron collapses. Narrative: Bad people (Ken Lay, Jeffrey Skilling) committed fraud.
Reality: Thousands of employees knew something was wrong. Most were normal people. The system encouraged, rewarded, and protected fraud.

Wells Fargo scandal. Narrative: Bad employees opened fake accounts.
Reality: Unrealistic sales targets, punishment for dissent, reward for results regardless of methods. Good people did bad things under pressure.

Boeing 737 MAX crashes. Narrative: Bad engineers cut corners.
Reality: Culture prioritizing cost/schedule over safety, management pressuring engineers, board failing to oversee. Systemic failure.

Common belief: Ethical failures happen because bad people do bad things.

Reality: Most ethical failures involve normal people in bad systems.

Understanding how ethical failures happen helps you:

  • Recognize early warning signs
  • Design systems that prevent failure
  • Avoid becoming part of ethical collapse
  • Hold the right people and systems accountable

This isn't about excusing misconduct—it's about preventing it by understanding actual mechanisms, not comforting myths about "bad apples."

Core Mechanisms of Ethical Failure

Incremental Compromise (Slippery Slope)

Definition: Gradual erosion of ethical standards through small compromises, each individually justified.

Mechanism:

  1. Minor ethical compromise (seems trivial)
  2. No immediate negative consequences
  3. Small compromise becomes new baseline
  4. Next compromise slightly larger (but relative to new baseline, still "small")
  5. Repeat → Eventually major violations feel normal

Why it works:

  • Each step seems justifiable in context
  • Changes happen slowly (boiling frog)
  • No dramatic line-crossing moment
  • Cognitive dissonance resolved through rationalization

Example - Theranos (Elizabeth Holmes):

  • Start: Overpromise technology capabilities (common in startups)
  • Next: Use fake prototypes in demos (just for investors, will catch up)
  • Next: Use competitors' machines, claim they're proprietary (temporary workaround)
  • Next: Report false test results to patients (numbers will improve, avoiding panic)
  • End: Massive fraud endangering lives

Each step: Rationalized as necessary, temporary, no big deal.
Collective result: Criminal fraud.

Prevention:

  • Clear red lines (non-negotiable boundaries)
  • Regular external audits (fresh eyes catch drift)
  • Periodic ethics reset (explicitly recommit to standards)
  • "Frog out of water" (imagine explaining each step to outsider)

Normalized Deviance

Definition (Diane Vaughan, 1996): Rule violations become routine because nothing bad happened initially, creating false sense of safety.

Mechanism:

  1. Rule exists for safety/ethics
  2. Rule violated (accident, convenience, pressure)
  3. No negative consequence (luck, circumstances)
  4. Violation repeats
  5. Becomes standard practice ("how we do things")
  6. Risk accumulates until catastrophe

Classic example - Space Shuttle Challenger (1986):

  • Design: O-rings seal joints; shouldn't erode
  • Reality: O-rings eroding on flights (rule violation)
  • Response: Each time no explosion → "Acceptable risk"
  • Normalization: Erosion became expected, not alarm
  • Result: Cold temperature (29°F) → O-ring failed → Explosion → 7 astronauts killed

Vaughan's analysis: Engineers knew risk. But incremental normalization made danger feel routine. "We've launched with erosion before; it was fine."

Other examples:

Financial fraud:

  • Minor accounting irregularity
  • Auditors don't catch (or ignore)
  • Becomes practice
  • Irregularities grow
  • Eventually massive fraud (Enron, WorldCom)

Safety violations:

  • Skip safety check (saves time)
  • Nothing happens
  • Becomes routine
  • Eventually: disaster

Prevention:

  • Zero tolerance for critical rules (no normalization allowed)
  • Near-miss reporting (treat close calls as warnings, not luck)
  • External review (internal culture can't see own normalization)
  • Mindset: "We were lucky" not "It's fine"

Moral Disengagement

Definition (Albert Bandura, 1986): Psychological processes that allow people to act unethically without feeling that they are doing wrong.

Mechanisms:

1. Moral justification: Reframe harmful act as serving higher purpose.

  • Example: "Lying to investors is necessary to save the company (and jobs)."

2. Euphemistic labeling: Use sanitized language to obscure harm.

  • Example: "Enhanced interrogation" (torture), "let go" (fired), "revenue optimization" (price gouging)

3. Advantageous comparison: Compare to worse alternatives to seem less bad.

  • Example: "Everyone else commits tax fraud; we're less egregious."

4. Displacement of responsibility: Blame authority or orders.

  • Example: "I was told to do it; not my decision."

5. Diffusion of responsibility: Many people involved, no one feels responsible.

  • Example: "I just provided data; someone else made the decision."

6. Disregard/distort consequences: Minimize, ignore, or deny harm.

  • Example: "The fake accounts didn't really hurt customers."

7. Dehumanization: View victims as objects, not people.

  • Example: "Customers are just numbers; they don't care."

8. Attribution of blame: Victims deserved it or provoked it.

  • Example: "If they were stupid enough to fall for it, not our fault."

Example - Milgram experiment (1961):

  • Participants believed they were administering electric shocks to another person (an actor; no real shocks were delivered)
  • 65% went to maximum voltage (450V, labeled "XXX")
  • Most were uncomfortable but continued when told "you must continue"
  • Moral disengagement: Authority (experimenter) took responsibility; subjects felt "just following orders"

Prevention:

  • Personal accountability (can't hide in collective)
  • Direct exposure to consequences (see harm caused)
  • Language discipline (avoid euphemisms)
  • Encourage dissent (question orders/norms)

Incentive Corruption

Definition: Reward structure encourages unethical behavior.

Mechanism: When doing the wrong thing is rewarded and doing the right thing is punished, people comply despite reservations.

Forms:

Rewards for bad behavior:

  • Bonuses for sales (regardless of how achieved)
  • Promotions for results (don't ask about methods)
  • Status for rule-breaking ("get things done")

Punishments for good behavior:

  • Whistleblowers fired or marginalized
  • Those who raise concerns labeled "not team players"
  • Ethical employees miss promotions (slower results)

Example - Wells Fargo:

  • Incentive: Bonuses for accounts opened, risk of being fired for low numbers
  • Pressure: Impossible targets (8 products per customer)
  • Result: About 5,300 employees were fired for opening roughly 3.5 million unauthorized accounts
  • Why so many complied: Survival (keep job) vs. ethics (don't commit fraud)

Example - Sears auto repair scandal (1992):

  • Incentive: Mechanics paid commission on repairs sold
  • Result: Recommended unnecessary repairs (to earn commission)
  • Why: Mechanic income dependent on upselling

Prevention:

  • Align incentives with values (reward ethical behavior)
  • Remove perverse incentives (don't reward outcomes achieved through wrong means)
  • Balance metrics (not just sales—also ethics, customer satisfaction, long-term health)
  • Protect whistleblowers (reward, don't punish, raising concerns)

Authority and Obedience

Definition: People obey authority figures even when orders conflict with ethics.

Mechanism (Stanley Milgram):

  • Authority perceived as legitimate
  • Responsibility transferred to authority ("they told me to")
  • Social pressure to comply (don't challenge, don't question)
  • Incremental escalation (small steps, each justified)

Factors increasing obedience:

  • Legitimate authority (title, credentials, institution)
  • Physical proximity (authority present)
  • Distance from victim (can't see harm)
  • Gradual escalation (no clear line)
  • Institutional context (official setting)

Example - Abu Ghraib prison abuse (Iraq, 2004):

  • Low-ranking soldiers tortured prisoners
  • Defense: "Following orders" (from intelligence officers, contractors)
  • Reality: Orders may have been implicit or inferred, but authority context enabled abuse
  • Contributing factors: Dehumanization (enemy prisoners), distance from accountability, group dynamics

Example - Corporate whistleblower suppression:

  • Employee raises ethical concern
  • Manager says "Don't worry about it; I'll handle it" (authority)
  • Employee backs down (obedience to authority)
  • Problem festers

Prevention:

  • Question authority (culture encourages dissent)
  • Personal accountability (can't pass all responsibility to authority)
  • Proximity to consequences (see results of actions)
  • Multiple reporting channels (alternative authorities)

Groupthink and Conformity

Definition: Group pressure leads to consensus without critical evaluation, suppressing dissent.

Mechanisms:

Groupthink (Irving Janis, 1972):

  • Illusion of invulnerability (overconfidence)
  • Rationalization (dismiss warnings)
  • Belief in group's morality (we're the good guys)
  • Stereotyping out-groups (critics are fools/enemies)
  • Self-censorship (don't voice doubts)
  • Illusion of unanimity (silence interpreted as agreement)
  • Pressure on dissenters (conform or be excluded)
  • Mindguards (protect group from dissenting information)

Conformity (Solomon Asch, 1951):

  • People conform to group even when group is obviously wrong
  • Social pressure stronger than individual judgment
  • 75% of participants conformed at least once, even when the group's answer was clearly wrong
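
To make the self-censorship and illusion-of-unanimity dynamics concrete, here is a toy model in Python (the decision rule and threshold are assumptions for this sketch, not parameters from Asch's or Janis's studies): members speak in turn, and a privately doubtful member raises the concern only while the visible consensus is still small.

    def meeting(private_doubts, conformity_threshold=2):
        """Toy model of self-censorship. Members speak in turn; a member with
        private doubts voices them only while fewer than `conformity_threshold`
        people have publicly agreed. `private_doubts` is a list of booleans
        (True = the member privately doubts the plan)."""
        public_agree = 0
        voiced_doubts = 0
        for i, doubts in enumerate(private_doubts, start=1):
            if doubts and public_agree < conformity_threshold:
                voiced_doubts += 1
                print(f"Member {i}: raises a concern")
            else:
                public_agree += 1
                label = "agrees (despite private doubts)" if doubts else "agrees"
                print(f"Member {i}: {label}")
        print(f"Public record: {public_agree} in favor, {voiced_doubts} objections")

    # Six of nine members privately doubt the plan, but confident voices speak first:
    meeting([False, False, True, True, False, True, True, True, True])
    # Same mix of private opinions, but doubters speak first:
    meeting([True, True, False, True, False, True, True, True, False])

Same private opinions, very different public records, depending only on who speaks first. This is part of why the prevention tactics below (a formal devil's advocate, anonymous feedback gathered before discussion) aim to surface doubt before a visible consensus forms.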

Example - NASA Challenger decision (1986):

  • Engineers warned: too cold to launch safely
  • Management pressure to launch (political, schedule)
  • Groupthink: "We've flown with erosion before; it was fine" (normalized deviance reinforced by groupthink)
  • Dissent suppressed
  • Result: Disaster

Example - Financial crisis (2008):

  • Banks all pursuing same risky strategies (subprime mortgages)
  • Groupthink: "Everyone's doing it; must be safe"
  • Dissenters dismissed as not understanding new paradigm
  • Result: Global financial collapse

Prevention:

  • Encourage dissent (reward devil's advocate role)
  • Anonymous feedback (reduce social pressure)
  • External review (break echo chamber)
  • Leadership models openness (admits doubts, welcomes challenges)

Information Asymmetry and Opacity

Definition: Those making decisions lack information about consequences; those seeing consequences lack power to stop them.

Mechanism:

  • Decision-makers insulated from harm
  • Those affected have no voice
  • Complexity obscures causal links
  • Opacity prevents accountability

Example - Opioid crisis:

  • Pharmaceutical executives: Decided to aggressively market opioids, downplay addiction risk
  • Consequences: Addiction, overdoses, deaths (in different communities)
  • Information gap: Executives didn't see (or chose not to see) devastation
  • Result: Epidemic (500,000+ deaths)

Example - Subprime mortgage crisis:

  • Bankers: Created complex financial products (CDOs, synthetic CDOs)
  • Complexity: Even creators didn't fully understand risk
  • Distance: Bankers far from homeowners losing homes
  • Result: Didn't grasp (or care about) harm until system collapsed

Prevention:

  • Transparency (make information visible)
  • Proximity (decision-makers see consequences)
  • Simplicity (reduce obscuring complexity)
  • Feedback loops (connect actions to outcomes)

Situational Factors

Pressure and Time Constraints

Mechanism: Urgency overwhelms ethical deliberation.

Forms:

  • Financial pressure (must hit targets)
  • Competitive pressure (rivals moving faster)
  • Crisis (no time to think)
  • Career pressure (fear of failure/firing)

Effect:

  • Shortcuts taken
  • Ethical concerns dismissed as luxuries
  • Long-term consequences ignored

Example - Volkswagen emissions scandal:

  • Pressure to sell "clean diesel" in the U.S. while competing with hybrids on cost and performance
  • Couldn't meet NOx emissions standards without expensive engineering
  • Time and budget pressure
  • Result: Installed "defeat device" software that detected emissions tests and cheated on them

Prevention:

  • Build slack (don't operate at edge of capacity constantly)
  • Protect long-term thinking (don't sacrifice ethics for speed)
  • Crisis protocols (ethics don't disappear in emergencies)

Competition and Survival

Mechanism: "If we don't, competitor will" or "If I don't, I'll be fired."

Logic:

  • Ethical choice = competitive disadvantage
  • Unethical choice = survival
  • Therefore: unethical choice justified

Problem: Race to the bottom (everyone violates ethics to compete).

Example - Tech company data practices:

  • "If we don't collect user data, competitor will"
  • "Users will use competitor; we'll die"
  • Result: All companies violate privacy (collective action problem)

Better framing:

  • Some competition worth losing (if winning requires ethical compromise)
  • Ethical differentiator can be competitive advantage (trust)
  • Collective action (industry standards, regulation) prevents a race to the bottom

Prevention:

  • Define non-negotiables (won't compromise regardless of competition)
  • Coordinate (industry standards)
  • Regulatory floor (prevents a race to the bottom)

Isolation and Echo Chambers

Mechanism: Insulated from outside perspective, internal norms drift.

Contributing factors:

  • Geographic isolation (head office far from operations)
  • Social isolation (executives socialize only with each other)
  • Intellectual isolation (dismiss external critics)
  • Cultural isolation (unique organizational culture disconnected from broader norms)

Effect: Internal norms seem reasonable; external perspective would reveal problems.

Example - Enron:

  • Insular culture ("smartest guys in room")
  • Dismissed critics as not understanding
  • Internal norms drifted far from legal/ethical standards
  • By the time it collapsed, massive fraud seemed normal internally

Prevention:

  • External board members (outside perspective)
  • Regular rotation (fresh eyes)
  • Engage critics (don't dismiss)
  • Broad reference group (compare to multiple benchmarks, not just peers)

Warning Signs of Ethical Drift

Early indicators (intervene before collapse):

Increasing Rationalization

Sign: More frequent justifications for questionable behavior.

Examples:

  • "Everyone does it"
  • "It's not technically illegal"
  • "Just this once"
  • "Greater good justifies it"
  • "No choice"

What it means: Cognitive dissonance is increasing (people know it's wrong but do it anyway).

Suppressed Dissent

Sign: Those raising concerns ignored, marginalized, or punished.

Examples:

  • Whistleblowers fired
  • Dissenters labeled "not team players"
  • Concerns dismissed without investigation
  • Retaliation against those who speak up

What it means: System protecting itself from accountability.

Metrics Obsession

Sign: Hitting numbers at all costs; means don't matter.

Examples:

  • Celebrate results, don't ask how achieved
  • Fire those who miss targets, ignore ethical violations
  • Manipulation of metrics common and tolerated

What it means: Goodhart's Law in action (when a measure becomes a target, it ceases to be a good measure); perverse incentives dominate. The toy simulation below illustrates the dynamic.
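
This is a minimal sketch in Python, assuming purely hypothetical payoffs (nothing here is calibrated to Wells Fargo or any real company): honest work produces real value and a modest metric; gaming produces no real value but a big metric, discounted by whatever audit risk exists. Employees gradually copy whichever strategy pays better.

    def simulate(quarters=8, employees=100, audit_rate=0.0, penalty=5.0):
        """Toy model with illustrative numbers. Each quarter a share of the
        workforce copies whichever strategy pays better. 'Honest' work produces
        1 unit of real value and 1 unit of the metric; 'gaming' produces 0 real
        value but 3 units of the metric, minus an expected audit penalty."""
        gamers = 0
        for q in range(quarters):
            honest_payoff = 1.0
            gaming_payoff = 3.0 - audit_rate * penalty  # expected payoff of gaming
            step = employees // 4  # a quarter of staff switch strategies per quarter
            if gaming_payoff > honest_payoff:
                gamers = min(employees, gamers + step)
            else:
                gamers = max(0, gamers - step)
            real_value = employees - gamers      # true value delivered
            metric = real_value + 3 * gamers     # what the dashboard reports
            print(f"Q{q + 1}: gamers={gamers:3d}  metric={metric:4d}  real value={real_value:4d}")

    print("No audits (metric-only rewards):")
    simulate(audit_rate=0.0)
    print("\nCredible audits and penalties:")
    simulate(audit_rate=0.5)

With no audits, the reported metric climbs while real value collapses; with credible audits and penalties, honest work stays the better-paying strategy. That is the logic behind the balanced metrics, punishment of violations, and independent oversight tactics in the prevention section below.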

Blame Culture

Sign: Errors punished, not learned from; cover-ups encouraged.

Examples:

  • Shoot messenger
  • Scapegoat individuals, ignore systemic causes
  • Failure not tolerated (so hidden)

What it means: Information won't surface; problems will fester.

Leadership Hypocrisy

Sign: Leaders violate values they espouse; "do as I say, not as I do."

Examples:

  • CEO preaches integrity, commits fraud
  • Executives exempt from rules others must follow
  • Values poster vs. actual practices

What it means: Values are PR, not real; behavior follows leadership example, not stated values.

Minor Violations Ignored

Sign: Small rule-breaking tolerated (normalized deviance beginning).

Examples:

  • Safety shortcuts
  • Accounting irregularities
  • Policy violations

What it means: Standards eroding; major violations likely to follow.

Prevention Strategies

Design Systems for Ethics

Principle: Make the right thing easy to do and the wrong thing hard.

Tactics:

  • Default to ethical (require extra steps to violate)
  • Remove temptation (don't put people in situations requiring heroism)
  • Circuit breakers (automatic stops when thresholds are crossed; see the code sketch after the autopilot example)
  • Transparency (hard to hide misconduct)

Example - Autopilot safety features:

  • System won't allow certain dangerous maneuvers
  • Doesn't rely on pilot choosing not to
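
A minimal software sketch of the same idea, assuming a hypothetical refund workflow (the function, names, and threshold are inventions for illustration, not from any cited system): the routine path needs no extra effort, while crossing the threshold halts the action and demands an independent, logged approval.

    import logging
    from typing import Optional

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("ethics-circuit-breaker")

    # Hypothetical threshold: refunds above this amount require a second approver.
    REFUND_LIMIT = 1_000.00

    class CircuitBreakerTripped(Exception):
        """Raised when an action crosses a guarded threshold without approval."""

    def issue_refund(amount: float, requested_by: str,
                     approved_by: Optional[str] = None) -> None:
        """Default-deny design: the routine path needs no extra steps; the risky
        path requires an explicit, logged, independent approval."""
        if amount <= REFUND_LIMIT:
            log.info("Refund of %.2f issued by %s", amount, requested_by)
            return
        if approved_by is None or approved_by == requested_by:
            # Automatic stop: no silent override, and the attempt leaves a record.
            log.warning("Blocked refund of %.2f requested by %s", amount, requested_by)
            raise CircuitBreakerTripped(
                f"Refund of {amount:.2f} exceeds {REFUND_LIMIT:.2f}; "
                "independent approval required")
        log.info("Refund of %.2f issued by %s, approved by %s",
                 amount, requested_by, approved_by)

    issue_refund(250.00, requested_by="alice")                       # routine path: easy
    issue_refund(5_000.00, requested_by="alice", approved_by="bob")  # logged override
    # issue_refund(5_000.00, requested_by="alice")   # raises CircuitBreakerTripped

The design choice is that overriding the control requires a deliberate, visible, attributable act, which is what makes misconduct hard to normalize or hide.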

Align Incentives

Principle: Reward ethical behavior, not just results.

Tactics:

  • Balance metrics (ethics, long-term, stakeholder welfare—not just profit)
  • Long-term incentives (vest over years, clawback provisions)
  • Punish violations (consequences for unethical behavior, even if profitable)
  • Protect whistleblowers (reward, don't punish)

Maintain Oversight

Principle: Independent eyes watching.

Tactics:

  • Independent board (not captured by management)
  • External audits (not just rubber stamps)
  • Multiple reporting channels (bypasses corrupted hierarchy)
  • Regular ethics audits

Encourage Dissent

Principle: Make raising concerns safe and expected.

Tactics:

  • Psychological safety (no retaliation)
  • Devil's advocate role (formalize dissent)
  • Anonymous channels (reduce social pressure)
  • Leadership models openness (welcomes challenges)

Model from Top

Principle: Leaders set culture through behavior, not words.

Tactics:

  • Walk the talk (actions match stated values)
  • Admit mistakes (accountability)
  • Visible consequences (leaders not exempt)
  • Consistent values (no exceptions for high performers)

Conclusion

Ethical failures don't require bad people—they require bad systems operating on normal people.

Core mechanisms:

  • Incremental compromise (slippery slope)
  • Normalized deviance (violations become routine)
  • Moral disengagement (psychological distancing from harm)
  • Incentive corruption (rewarded for wrong behavior)
  • Authority and obedience (follow orders despite doubts)
  • Groupthink (conformity suppresses dissent)
  • Information asymmetry (don't see consequences)

Situational factors:

  • Pressure (time, financial, competitive)
  • Isolation (echo chambers)
  • Opacity (complexity hides harm)

Prevention:

  • Design systems that make ethics easy
  • Align incentives with values
  • Maintain independent oversight
  • Encourage and protect dissent
  • Model integrity from leadership

Most important: Recognize you're vulnerable. Believing "I'm ethical; I wouldn't fall for this" makes you more vulnerable. Ethical people do unethical things in bad situations.

The question isn't "Am I good?"—it's "Is this system designed to prevent ethical failure?"

Fix the systems. Don't rely on virtue alone.


Essential Readings

Ethical Failure Patterns:

  • Vaughan, D. (1996). The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: University of Chicago Press. [Normalized deviance]
  • Tenbrunsel, A. E., & Messick, D. M. (2004). "Ethical Fading: The Role of Self-Deception in Unethical Behavior." Social Justice Research, 17(2), 223-236.
  • Bazerman, M. H., & Tenbrunsel, A. E. (2011). Blind Spots: Why We Fail to Do What's Right and What to Do about It. Princeton: Princeton University Press.

Moral Disengagement:

  • Bandura, A. (1986). Social Foundations of Thought and Action. Englewood Cliffs, NJ: Prentice-Hall.
  • Bandura, A. (1999). "Moral Disengagement in the Perpetration of Inhumanities." Personality and Social Psychology Review, 3(3), 193-209.
  • Bandura, A., Barbaranelli, C., Caprara, G. V., & Pastorelli, C. (1996). "Mechanisms of Moral Disengagement in the Exercise of Moral Agency." Journal of Personality and Social Psychology, 71(2), 364-374.

Authority and Obedience:

  • Milgram, S. (1974). Obedience to Authority. New York: Harper & Row. [Classic experiment]
  • Zimbardo, P. (2007). The Lucifer Effect. New York: Random House. [Stanford Prison Experiment; Abu Ghraib analysis]
  • Haney, C., Banks, C., & Zimbardo, P. (1973). "Interpersonal Dynamics in a Simulated Prison." International Journal of Criminology and Penology, 1, 69-97.

Groupthink and Conformity:

  • Janis, I. L. (1982). Groupthink (2nd ed.). Boston: Houghton Mifflin. [Classic analysis]
  • Asch, S. E. (1956). "Studies of Independence and Conformity: A Minority of One Against a Unanimous Majority." Psychological Monographs, 70(9), 1-70.
  • Sunstein, C. R., & Hastie, R. (2015). Wiser: Getting Beyond Groupthink to Make Groups Smarter. Boston: Harvard Business Review Press.

Organizational Ethics:

  • Ashforth, B. E., & Anand, V. (2003). "The Normalization of Corruption in Organizations." Research in Organizational Behavior, 25, 1-52.
  • Palmer, D. (2012). Normal Organizational Wrongdoing. Oxford: Oxford University Press.
  • Darley, J. M. (2005). "The Cognitive and Social Psychology of Contagious Organizational Corruption." Brooklyn Law Review, 70(4), 1177-1194.

Case Studies:

  • McLean, B., & Elkind, P. (2003). The Smartest Guys in the Room: The Amazing Rise and Scandalous Fall of Enron. New York: Portfolio. [Enron]
  • Carreyrou, J. (2018). Bad Blood: Secrets and Lies in a Silicon Valley Startup. New York: Knopf. [Theranos]
  • Robison, P. (2021). Flying Blind: The 737 MAX Tragedy and the Fall of Boeing. New York: Doubleday. [Boeing]
  • Lewis, M. (2010). The Big Short. New York: W. W. Norton. [Financial crisis]

Prevention and Culture:

  • Paine, L. S. (1994). "Managing for Organizational Integrity." Harvard Business Review, 72(2), 106-117.
  • Treviño, L. K., Weaver, G. R., & Reynolds, S. J. (2006). "Behavioral Ethics in Organizations: A Review." Journal of Management, 32(6), 951-990.
  • Gentile, M. C. (2010). Giving Voice to Values: How to Speak Your Mind When You Know What's Right. New Haven: Yale University Press.

Whistleblowing:

  • Near, J. P., & Miceli, M. P. (1985). "Organizational Dissidence: The Case of Whistle-Blowing." Journal of Business Ethics, 4(1), 1-16.
  • Rothschild, J., & Miethe, T. D. (1999). "Whistle-Blower Disclosures and Management Retaliation." Work and Occupations, 26(1), 107-128.

Psychology of Good People Doing Bad Things:

  • Ariely, D. (2012). The (Honest) Truth About Dishonesty. New York: Harper. [Small dishonesty]
  • Gino, F. (2013). Sidetracked: Why Our Decisions Get Derailed, and How We Can Stick to the Plan. Boston: Harvard Business Review Press.