In the spring of 2003, a photographer named Kenneth Adelman was doing unglamorous but important work: flying along the California coastline photographing its cliffs, beaches, and properties for the California Coastal Records Project, an environmental archive documenting erosion. He uploaded the resulting 12,000 images to a public website. The archive was used by scientists, researchers, and the occasional curious member of the public.
One photograph — number 3850 in the archive — showed a large cliffside mansion in Malibu. Before 2003 ended, that photograph would be seen by approximately 420,000 people.
Not because it was interesting. Because Barbra Streisand sued to have it removed.
The Incident That Named a Phenomenon
Streisand's legal team filed a $50 million lawsuit against Adelman and the Coastal Records Project, demanding removal of the image on privacy grounds. The suit was reported by journalists and quickly circulated online. People who had never heard of the Coastal Records Project — and who had no particular interest in aerial photographs of California — suddenly wanted to see exactly what Streisand did not want them to see.
Before the lawsuit, the photograph had been downloaded exactly six times. Two of those downloads were by Streisand's own lawyers.
The lawsuit was dismissed, and Streisand was ordered to pay Adelman's legal fees. The photograph remained online. And in trying to make it invisible, Streisand had made it one of the most viewed images on the internet.
Two years later, technology journalist and Techdirt founder Mike Masnick wrote about a similar incident (a resort that had sent legal threats over photographs of its property posted online) and reached for the most apt recent example of the dynamic. He called it the Streisand Effect, and the name stuck.
Defining the Streisand Effect
The Streisand Effect is the phenomenon in which attempts to suppress, hide, censor, or remove information cause that information to spread far more widely than it would have if left alone. The suppression attempt itself becomes the story, drawing attention to the very content the suppressor wanted buried.
The effect has several consistent features:
The suppressor has resources: The Streisand Effect typically involves a powerful actor — a corporation, a wealthy individual, a government, a public figure — using legal, technical, or institutional means to remove content. The asymmetry of power makes the suppression attempt itself newsworthy.
The information was previously obscure: If the information was already widely known, suppression cannot amplify it much. The classic Streisand Effect amplifies information that would have remained obscure without the intervention.
The internet provides amplification: Pre-internet, a lawsuit to suppress a newspaper story might succeed in limiting distribution. In the networked environment, a news story about a suppression attempt is itself a story, and it travels to audiences the suppressor cannot reach.
The content is truthful or at least publicly verifiable: Streisand Effects are most dramatic when suppression targets factual information that cannot be retroactively changed, unlike defamatory content that might be corrected.
The Psychology: Why Suppression Creates Desire
The Streisand Effect is not simply irony. It has a robust psychological mechanism: psychological reactance.
In 1966, psychologist Jack Brehm developed reactance theory to explain a consistent observation: when people believe their freedom to access information, make choices, or engage in behavior is being threatened or eliminated, they experience a motivational state characterized by an increased desire for the threatened freedom.
The classic demonstration involved children who were shown two identical toys and told one was unavailable. The children consistently rated the unavailable toy as more desirable than before the restriction was imposed — not because the toy had changed, but because access to it had been restricted.
This mechanism generalizes far beyond toys. When a government bans a book, sales increase. When a health authority recommends against a supplement, demand often spikes. When a company tries to remove an article, the article gets shared more widely. The prohibition signal is, counterintuitively, an advertisement.
Three Compounding Factors
Reactance is the foundation, but three additional factors amplify it in the internet context:
Curiosity about forbidden information: Restriction signals that something important is being hidden. In a world of freely available information, the thing that someone is willing to sue to suppress must be significant. This inference — which is not always correct but is not unreasonable — increases the perceived value of the suppressed content.
The news value of the suppression attempt itself: Journalists are professionally interested in powerful actors attempting to prevent the public from accessing information. The lawsuit or takedown notice is itself a story, independent of the underlying content. This meta-story often reaches far larger audiences than the original content would have.
Network amplification: Each person who shares the story of the suppression attempt reaches their network, who may share further. This cascade effect is asymmetric: once information begins to spread, suppression efforts typically cannot keep pace.
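The asymmetry between sharing and suppression can be made concrete with a toy branching-process model: reposts grow in proportion to the number of live copies, while takedowns remove a fixed number per step. This is a minimal sketch, and every parameter and function name here is an illustrative assumption, not an empirical estimate:

```python
import random

def simulate_cascade(shares_per_copy=3, share_prob=0.4,
                     takedowns_per_step=5, steps=10,
                     initial_copies=10, seed=0):
    """Toy model of a suppression race (illustrative, not empirical).

    Each step, every live copy is shown to `shares_per_copy` contacts,
    each of whom reposts it with probability `share_prob`. Meanwhile a
    takedown process removes a fixed number of copies per step. Sharing
    scales with the number of live copies; takedowns do not.
    """
    rng = random.Random(seed)
    live = initial_copies
    history = [live]
    for _ in range(steps):
        reposts = sum(1 for _ in range(live * shares_per_copy)
                      if rng.random() < share_prob)
        # Geometric growth vs. linear removal: once `reposts` regularly
        # exceeds `takedowns_per_step`, takedowns cannot keep pace.
        live = max(0, live + reposts - takedowns_per_step)
        history.append(live)
    return history

# Deterministic corner case: every contact reposts and nothing is
# taken down, so the live count triples each step.
print(simulate_cascade(shares_per_copy=2, share_prob=1.0,
                       takedowns_per_step=0, steps=3,
                       initial_copies=1))  # [1, 3, 9, 27]
```

The qualitative point survives any reasonable choice of parameters: whenever the expected number of new reposts per step exceeds the takedown rate, suppression loses the race, which is the structural asymmetry described above.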
Notable Examples
Carter-Ruck and the Trafigura Affair (2009)
In October 2009, the British law firm Carter-Ruck obtained a super-injunction (a court order that forbids not only reporting on a story but also any mention that the injunction itself exists) on behalf of the commodities trading company Trafigura. The injunction related to reporting about toxic waste dumping in West Africa.
The Guardian newspaper published a cryptic item noting that a parliamentary question had been suppressed by a legal order but could not, by order of the court, reveal the name of the company involved.
Within hours, the name "Carter-Ruck" and then "Trafigura" were trending on Twitter. Internet users collectively identified the suppressed party and distributed the information the injunction was designed to prevent. The Guardian was released from the injunction the following day. The story received global coverage.
The Trafigura case illustrated something specific about the digital age: a legal system designed to suppress information in a pre-internet era, when the number of distribution points was limited, cannot suppress information in a networked world where millions of individuals are potential publishers.
Sony vs. George Hotz (2011)
In 2010, electronics hacker George Hotz published a jailbreak for the PlayStation 3, enabling users to run unauthorized software. Sony filed a lawsuit against him, including a demand for the IP addresses of everyone who had visited his website — a demand that generated significant public outrage about overreach.
The lawsuit prompted the hacker collective Anonymous to launch a series of cyberattacks against Sony's online services. In April 2011, the PlayStation Network was breached by attackers who were never conclusively identified and taken offline for 23 days, affecting approximately 77 million user accounts and costing Sony an estimated $171 million. The Hotz exploit had received modest attention in hobbyist communities. Sony's lawsuit transformed it into a global story and preceded damage far greater than anything the original hack had caused.
Routine Platform Takedowns and the DMCA
Less dramatic but instructive: content takedowns from platforms like YouTube and Twitter by major media companies regularly produce spikes in search volume for the removed content, with mirrors and reuploads appearing across multiple platforms. The DMCA takedown process has become so familiar that many internet users treat a takedown notice as a recommendation.
The Right to Be Forgotten and Search Results
The European Union's right to be forgotten principle, established in a 2014 EU Court of Justice ruling and codified in GDPR, gives individuals the right to request removal of certain links from search results. When these requests are reported — as they sometimes are in aggregate or in specific high-profile cases — they frequently draw more attention to the suppressed information than the original search result would have. The meta-story about what is being suppressed can reach larger audiences than the suppressed content itself.
| Year | Actor | Action | Result |
|---|---|---|---|
| 2003 | Barbra Streisand | Sued to remove coastal photograph | 420,000 views vs. 6 before lawsuit |
| 2009 | Trafigura/Carter-Ruck | Super-injunction on toxic waste story | Trended on Twitter; global coverage |
| 2011 | Sony | Sued PS3 hacker | 23-day PSN outage; $171M in damages |
| 2015 | Multiple | Right to be forgotten requests | Reported removals drove searches |
| 2016 | UK government | Attempted suppression of Brexit legal advice | Documents leaked and widely shared |
When Suppression Does Work
The Streisand Effect is not universal. Suppression efforts succeed regularly — but their success tends to correlate with specific conditions:
The target audience is small and the suppressor can reach all distribution points: A cease-and-desist to a small local newspaper that has not yet published online may prevent the story from reaching a wider audience. This becomes harder as more distribution points exist.
The content is genuinely defamatory or illegal: Content that platforms will remove on legal grounds (defamation, CSAM, copyright infringement) can be effectively suppressed because every major distribution platform cooperates with valid legal demands. The Streisand Effect applies most strongly to content that is true, embarrassing, but legal.
The suppression is not itself newsworthy: If a removal request is routine, handled quietly, and not worth reporting, it may succeed without amplification. The Streisand Effect requires that the suppression attempt itself is either dramatic enough to be reported or visible enough to create curiosity.
The content's primary distribution channel can be effectively cut: If information exists primarily in one place that can be compelled to remove it, suppression can succeed. In a networked environment with multiple redundant copies, this is increasingly rare.
How to Manage the Risk
For organizations and individuals managing sensitive information, the practical implications of the Streisand Effect suggest several principles:
Assess before acting: Before any suppression attempt, ask: is this information currently obscure? If the answer is yes, the suppression attempt itself may become the primary cause of its spread. If it is already widely known, removal attempts have less amplification potential.
Reserve legal action for genuinely harmful content: The cost-benefit of legal action is most favorable when the content is false, defamatory, or privacy-violating in a legally cognizable way — and when the expected audience without intervention is significant. A $50 million lawsuit over a photograph viewed six times is the textbook example of misjudging this calculus.
Get ahead of the story: When damaging information is likely to emerge, proactive disclosure with context is almost always preferable to reactive suppression. Releasing information yourself, with your framing, typically produces far better outcomes than having it revealed by someone else after a failed suppression attempt.
Use informal channels first: A polite, private request to remove content often succeeds where a legal threat fails — and without the Streisand Effect. Many website operators and journalists will accommodate reasonable requests that do not involve legal escalation.
Seek quiet settlements when appropriate: In defamation and privacy cases, confidential settlements that do not generate public legal filings avoid the amplification that public litigation produces.
Accept that some unflattering information will exist: For individuals and organizations, the internet has permanently changed the ecology of reputation. Attempting to suppress all negative content about yourself is both impossible and counterproductive. The alternative is building enough positive presence that the negative content is contextually less significant.
The Streisand Effect as a Feature of Networked Information
The Streisand Effect reveals something fundamental about the architecture of the internet. In a world where information can be copied perfectly, distributed instantly, and stored permanently at near-zero cost, the tools that powerful actors traditionally used to control information — legal systems, physical control of printing presses, economic influence over media — operate very differently.
A legal system that suppresses information operates at the speed of courts. Information travels at the speed of social networks. The asymmetry is structural, not incidental, and it is not going away.
"The internet treats censorship as damage and routes around it." — John Gilmore, internet activist, 1993 (widely paraphrased)
This does not mean suppression never works or that all information should be freely accessible. It means that the calculus for suppression decisions has fundamentally changed. The question is no longer "can we prevent people from seeing this?" but "will attempting to prevent them from seeing it cause more people to seek it out?" In most cases involving true, embarrassing, widely duplicable information, the answer is yes.
Understanding the Streisand Effect is, at its core, understanding how attention works in networked environments — and why the act of hiding something often becomes the most effective advertisement for it.
Frequently Asked Questions
What is the Streisand Effect?
The Streisand Effect is the phenomenon in which attempts to suppress, remove, or censor information cause that information to spread far more widely than it would have if left alone. The act of suppression draws attention to the very content the suppressor wants hidden. The term was coined by blogger Mike Masnick in 2005, named after a 2003 incident involving Barbra Streisand's attempt to remove an aerial photograph of her home from a coastal survey.
What happened in the original Streisand incident?
In 2003, photographer Kenneth Adelman photographed the California coastline as part of an environmental survey documenting coastal erosion. The archive of 12,000 photographs included an aerial image of a cliffside mansion in Malibu. Barbra Streisand filed a $50 million lawsuit demanding its removal. Before the lawsuit, the image had been downloaded exactly six times, two of which were by Streisand's own lawyers. After the lawsuit was reported, the image was viewed by approximately 420,000 people. The lawsuit was dismissed.
What is the psychology behind the Streisand Effect?
The core mechanism is psychological reactance, a theory developed by psychologist Jack Brehm in 1966. Reactance describes the motivational state that arises when a person perceives their freedom to access information or make choices is being threatened. When told they cannot or should not see something, people often develop an increased desire to see it. This combines with curiosity, the signaling function of prohibition, and the internet's amplification of controversy to make suppression systematically counterproductive.
Can the Streisand Effect be avoided?
The most reliable way to avoid the Streisand Effect is to not attempt suppression of widely accessible public information. For organizations or individuals managing sensitive information, the practical alternatives include getting ahead of a story by releasing information proactively with context, seeking quiet informal resolution before legal threats, or simply accepting that some unflattering information will exist online without amplifying it through removal attempts. Legal action should be reserved for genuinely private or defamatory content, not merely embarrassing truth.
What are some notable corporate Streisand Effect examples?
In 2009, the British law firm Carter-Ruck, acting for the commodities trader Trafigura, obtained a super-injunction preventing any reporting of a parliamentary question about toxic waste dumping in West Africa; within hours, 'Carter-Ruck' and then 'Trafigura' were trending on Twitter. In 2011, Sony sued hacker George Hotz over a PlayStation 3 exploit, prompting the hacker collective Anonymous to attack Sony's network repeatedly. The lawsuit drew far more attention to the exploit than it would otherwise have received. Both cases illustrate how legal suppression can transform minor stories into major ones.