In 1971, Herbert Simon — the economist who would win the Nobel Prize in Economics in 1978 — wrote a paragraph that has become one of the most cited in the history of information science: "A wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it."
Simon was writing about organizational management. He was thinking about the problems that arise when executives are flooded with more reports, data, and analysis than they can possibly process. He was not imagining the smartphone. He was not imagining the algorithmic feed, the infinite scroll, or the notification badge engineered to trigger dopamine responses. But he had articulated the structural principle that would come to organize the most powerful companies in the world.
When information becomes abundant, the scarce resource is not information but the attention needed to process it. Whoever controls attention controls outcomes — in commerce, in politics, in culture. This insight is the foundation of the attention economy.
Herbert Simon's Original Concept
Simon's 1971 observation was a structural argument, not a moral one. He was pointing out a systems property: in any system where one resource is abundant and another is constrained, value migrates toward whoever controls the constrained resource. In the pre-digital economy, information was the constrained resource; publishers, broadcasters, and news organizations held power because they controlled distribution of scarce information. In the information-abundant economy, attention becomes the constraint.
The term "attention economy" was explicitly developed in the 1990s and 2000s by writers including Michael Goldhaber, who used it to describe the emerging economics of the internet. Goldhaber's 1997 essay "Attention Shoppers!" argued that the internet economy would be organized around capturing and holding attention, that "star" creators who successfully captured attention would become disproportionately powerful, and that attention was itself becoming a form of currency.
What neither Simon nor Goldhaber anticipated fully was the scale of the engineering investment that would eventually be directed at capturing attention, or the specific behavioral mechanisms that would be deployed.
"The technologies we use to try to make our lives easier are using us and making our lives emptier." -- Douglas Rushkoff, Program or Be Programmed
How Platforms Monetize Attention
The business model of the dominant digital platforms is advertising-based. Google, Meta, TikTok, Twitter/X, and YouTube earn revenue primarily by selling access to their users' attention and behavioral data. The more time users spend on the platform, and the more behavioral signals they generate (clicks, dwell time, shares, reactions), the more the platform can charge advertisers.
This creates a structural incentive that is simple to state and profound in its consequences: every platform design choice that increases engagement increases revenue. Engagement is the core metric around which products are optimized.
The advertising model was not always dominant. The early commercial internet included subscription models, pay-per-use models, and micropayment proposals. The advertising model won because it lowered the friction of access to zero — content is free to users because users are paying with attention rather than money — and because the scale achievable with zero-friction distribution vastly exceeded what any subscription model could reach.
The consequence is that the largest media companies in human history are economically structured to sell human attention, not to serve human interests. These are not the same thing.
The Revenue Scale
To understand the stakes: Google's advertising revenue was approximately $237 billion in 2023. Meta's advertising revenue was approximately $131 billion. Together, two companies earned roughly $368 billion by selling attention — almost entirely the attention of people who were never directly charged and who mostly did not explicitly agree to be the product.
This scale of economic incentive drives an enormous investment in understanding and manipulating human attention. The attention economy is not incidentally extractive; extraction is its primary economic function.
Variable Reward Loops and Behavioral Engineering
The specific mechanisms through which platforms hold attention are drawn from behavioral psychology, particularly the work of B.F. Skinner on operant conditioning.
Skinner's research established that behaviors reinforced by rewards are more persistent when the rewards are variable — delivered unpredictably — than when they are fixed. A rat pressing a lever that sometimes delivers food and sometimes does not presses the lever more often and more persistently than a rat that receives food every time. This is variable ratio reinforcement, and it produces the strongest and most extinction-resistant behavior of any reinforcement schedule.
Slot machines are the most famous application of this principle in human contexts. The pull-lever-wait-for-result cycle is precisely a variable ratio reinforcement schedule, which is why slot machines are among the most addictive devices humans have ever created.
Social media platforms implement the same structure:
- The pull-to-refresh gesture is structurally identical to pulling a slot machine lever
- The like/heart/reaction is a variable reward — sometimes you post something and receive many reactions, sometimes few, and the unpredictability drives repeated checking
- The notification is a discrete reward delivery that interrupts the current behavioral sequence and redirects attention to the platform
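The persistence effect behind this design can be illustrated with a toy simulation — an illustration of the principle, not a validated behavioral model. The agent below gives up pressing once its current run of unrewarded presses far exceeds anything it experienced during training. Under continuous reinforcement (reward every press), the first dry spell is immediately conclusive; under a variable schedule, long dry spells were normal, so the behavior persists far longer after rewards stop. All names and the giving-up rule here are invented for illustration.

```python
import random

def train_longest_dry_streak(reward_prob, presses, rng):
    """Longest run of unrewarded presses seen during training."""
    longest = streak = 0
    for _ in range(presses):
        if rng.random() < reward_prob:
            streak = 0          # reward delivered: streak resets
        else:
            streak += 1         # no reward: dry streak grows
            longest = max(longest, streak)
    return longest

def extinction_presses(longest_dry_streak, patience=2):
    """Toy giving-up rule: quit once the current dry streak exceeds
    `patience` times the longest streak experienced in training
    (plus one press, so even continuous reinforcement gets one try)."""
    return patience * longest_dry_streak + 1

rng = random.Random(42)
fixed = train_longest_dry_streak(1.0, 500, rng)      # reward every press
variable = train_longest_dry_streak(0.25, 500, rng)  # reward ~1 press in 4

print("fixed schedule:    quits after", extinction_presses(fixed), "press(es)")
print("variable schedule: quits after", extinction_presses(variable), "press(es)")
```

The fixed-schedule agent quits almost immediately, while the variable-schedule agent keeps pressing long after rewards have stopped — the extinction resistance that Skinner documented and that pull-to-refresh exploits.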
Nir Eyal's book "Hooked" (2014) documented this design pattern as a feature for product designers to emulate, laying out a "Hook Model" of trigger, action, variable reward, and investment. Eyal has subsequently written about the ethical responsibilities of designers who use these techniques.
Tristan Harris, a former design ethicist at Google, became a prominent critic of these practices, arguing in his 2016 essay "How Technology is Hijacking Your Mind" that the design choices made by major platforms are specifically optimized to override voluntary choice — that users do not freely choose to spend hours on social media but are steered into doing so through behavioral engineering they are not aware of.
Content Amplification and the Outrage Premium
Engagement is not neutral. Not all content holds attention equally, and the algorithmic systems that determine which content is amplified are optimized for engagement metrics, not for the wellbeing of users or the accuracy of information.
Research on content virality consistently finds that emotionally arousing content spreads faster and wider than emotionally neutral content — and that negative emotions, particularly outrage, produce the strongest and most durable engagement responses.
A 2018 MIT study by Soroush Vosoughi, Deb Roy, and Sinan Aral, published in Science, analyzed roughly 126,000 news stories shared on Twitter over eleven years and found that false news stories reached 1,500 people about six times faster than true ones. The mechanism was not bots; it was human sharing. False information was more novel (it tended to describe surprising, unlikely events) and more emotionally arousing than accurate information. The algorithmic feed, optimized for engagement, amplified the emotionally arousing content — which happened to be disproportionately false.
The political consequences are significant. Content that generates outrage about out-groups — political opponents, cultural adversaries, foreign actors — generates high engagement and is therefore amplified by engagement-optimizing algorithms regardless of its accuracy or its effects on social cohesion.
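The structural point — that accuracy simply does not enter the ranking objective — can be made concrete with a deliberately reductive sketch. Real ranking systems use learned models over many signals; the posts, scores, and the `predicted_engagement` function below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    arousal: float   # 0..1, emotional intensity of the content
    accuracy: float  # 0..1, hypothetical fact-check score

def predicted_engagement(post: Post) -> float:
    # An engagement objective rewards emotional arousal;
    # accuracy does not appear in the objective at all.
    return post.arousal

posts = [
    Post("Calm, accurate explainer",  arousal=0.2, accuracy=0.95),
    Post("Outraged, misleading claim", arousal=0.9, accuracy=0.20),
    Post("Neutral news item",          arousal=0.4, accuracy=0.90),
]

# Rank the feed by predicted engagement, highest first.
feed = sorted(posts, key=predicted_engagement, reverse=True)
for post in feed:
    print(post.title)
```

The misleading, high-arousal post lands at the top and the accurate, calm one at the bottom — not because anyone chose to amplify falsehood, but because the objective never asked about it.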
Facebook's internal research, partially disclosed in the 2021 documents released by whistleblower Frances Haugen, found that the company's algorithmic changes in 2018 designed to increase "meaningful social interactions" had inadvertently increased the spread of misinformation, hate speech, and divisive content — because these categories generated high engagement responses. The company had known this and proceeded anyway.
The Costs: Mental Health, Cognition, and Democracy
Adolescent Mental Health
The most extensively documented cost of the attention economy is its effect on adolescent mental health, particularly for girls. Jean Twenge, a psychologist at San Diego State University, and Jonathan Haidt at New York University have documented a significant increase in adolescent depression, anxiety, loneliness, and self-harm beginning around 2012 — coinciding with the widespread adoption of smartphones and social media.
Haidt's 2024 book "The Anxious Generation" synthesized the evidence across multiple data streams:
- US, UK, Canadian, and Australian mental health data showing sharp deterioration beginning around 2012-2013
- Sleep data showing reduced sleep hours and quality in the same period
- Social comparison research showing Instagram's particular harms for appearance-focused social comparison in adolescent girls
- Smartphone adoption data showing an almost precisely coinciding inflection point
The causal relationship remains contested among researchers. Some argue that social media is a symptom of broader social changes rather than a cause. The debate is ongoing, but the correlation is striking, the proposed mechanisms are biologically plausible, and the convergence of evidence across multiple countries and data types is suggestive.
Attention Fragmentation
Research on attention and media multitasking suggests that heavy smartphone and social media use may reduce sustained attention capacity — the ability to focus on a single cognitive task for extended periods. Gloria Mark at the University of California Irvine has studied workplace attention and found that returning to an interrupted task takes an average of roughly 23 minutes, and that people who are frequently interrupted begin to interrupt themselves — pre-empting external disruption by checking their devices before notifications arrive.
| Attention Cost | Mechanism | Evidence |
|---|---|---|
| Reduced sustained focus | Habitual context-switching from notifications | Mark et al., multiple workplace studies |
| Impaired episodic memory | Reduced presence/attention during experiences | Henkel (2014) — photo-taking effect |
| Increased anxiety | Comparison, FOMO, uncertain reward cycles | Multiple correlational studies |
| Sleep disruption | Blue light, psychological arousal, late-night use | Several experimental studies |
| Reduced reading depth | Skimming habits transferred from web reading | Mangen, Walgermo & Bronnick (2013) |
Political Fragmentation
The attention economy's effects on political discourse operate through several mechanisms. Outrage-optimized amplification surfaces divisive content. Algorithmic personalization creates filter bubbles in which users primarily encounter content aligned with their existing views. The emotional tenor of political content shifts toward anger and contempt.
Whether social media causes political polarization or merely reflects it is a question researchers continue to debate. What is clearer is that the business model of attention platforms is structurally aligned with the production of engagement — and that political outrage is among the highest-engagement content categories — creating ongoing pressure toward more divisive discourse regardless of any platform's stated values.
What Surveillance Capitalism Is
The attention economy is related to but distinct from what Shoshana Zuboff, in her 2019 book "The Age of Surveillance Capitalism," calls surveillance capitalism — an economic logic in which behavioral data extracted from users is used not only to target advertising but to predict and modify behavior.
Zuboff's argument is that the most advanced platforms do not merely sell access to attention but have developed the capacity to shape behavior through what she calls "instrumentarian power" — the ability to know what you will do before you do it and to adjust the informational environment to steer you toward outcomes that benefit the platform's clients.
This goes beyond advertising into territory that has few legal or ethical frameworks to govern it: the systematic use of behavioral prediction to modify choices at scale, invisibly, without the subjects' knowledge or consent.
What Individuals Can Do
The attention economy operates at a structural level that individuals cannot fully escape, but there are practical steps that shift the balance from reactive capture to intentional use.
Reduce the signal surface:
- Turn off all non-essential notifications — alerts that do not require immediate action
- Move social media apps off the home screen (friction reduces impulsive opening)
- Use app timers or scheduled access rather than unlimited availability
Replace algorithmic curation with deliberate curation:
- Subscribe to newsletters, RSS feeds, and podcasts rather than relying on social feeds
- Follow specific people whose content you actively want rather than browsing algorithmic feeds
- Use chronological feeds where available rather than engagement-ranked feeds
Manage attention actively:
- Designate phone-free times (meals, first hour of day, final hour before sleep)
- Keep devices out of the bedroom
- Practice single-tasking: finish one thing before switching
Make usage visible:
- Use screen time tracking to see actual usage versus perceived usage (most people significantly underestimate their phone use)
- Audit which apps consume the most time and whether that time reflects your stated priorities
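The audit step above can be made concrete with a short script. The session log below is invented sample data; in practice the numbers would come from your phone's screen-time export, whose format varies by platform.

```python
from collections import Counter

# Hypothetical one-week usage log as (category, minutes) sessions.
sessions = [
    ("social", 34), ("messaging", 12), ("social", 51),
    ("reading", 25), ("social", 47), ("video", 38),
    ("messaging", 9), ("video", 62), ("reading", 18),
]

# Aggregate minutes per category.
totals = Counter()
for category, minutes in sessions:
    totals[category] += minutes

# Print each category's total and share of overall screen time,
# largest first, so the biggest attention sinks are visible at a glance.
grand_total = sum(totals.values())
for category, minutes in totals.most_common():
    share = 100 * minutes / grand_total
    print(f"{category:<10} {minutes:>4} min  {share:5.1f}%")
```

Comparing the top of this list against your stated priorities is the whole point of the exercise: the gap between where attention actually goes and where you believe it goes is usually the most persuasive data available.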
The goal is not technology abstinence, which is neither realistic nor necessary. The goal is to shift from being a passive subject of algorithmic curation — having your attention captured and directed by systems optimized for engagement metrics — to being a deliberate consumer who makes active choices about what to pay attention to and why.
This shift is harder than it sounds, because the systems arrayed on the other side of it are backed by hundreds of billions of dollars and decades of behavioral research. But the fundamental reorientation — from reactive to deliberate, from algorithmically served to actively chosen — is both possible and consequential.
Frequently Asked Questions
What is the attention economy?
The attention economy is the conceptual framework describing the competition for human attention as the primary form of economic competition in media and digital environments. The concept traces to Herbert Simon's 1971 observation that "a wealth of information creates a poverty of attention": when information is abundant, attention becomes scarce — and whoever controls attention controls economic and political power.
How do platforms monetize attention?
Digital platforms earn revenue primarily through advertising, and advertising revenue is proportional to the time users spend on the platform and the behavioral data generated during that time. Every minute of attention is packaged and auctioned to advertisers. This creates a structural incentive to maximize engagement through any means available — which includes content that triggers emotional arousal, outrage, or fear, because such content generates more engagement than content that is calm or neutral.
What is a variable reward loop?
A variable reward loop is a behavioral mechanism drawn from B.F. Skinner's operant conditioning research, in which rewards are delivered unpredictably rather than on a fixed schedule. Variable reinforcement produces stronger and more persistent behavior than predictable reinforcement — a well-documented finding that slot machine designers and social media engineers have both exploited. The unpredictable delivery of likes, notifications, and interesting content creates a pull-to-refresh compulsion that is difficult to resist.
What are the costs of the attention economy?
Documented costs include impaired sustained attention capacity, increased anxiety and depression especially among adolescents, degraded episodic memory for daily experience, and political fragmentation driven by outrage-optimized content amplification. Research by Jean Twenge and Jonathan Haidt links the rise of smartphone-based social media use, beginning around 2012, with measurable increases in adolescent mental health problems across multiple Western countries.
What can individuals do about the attention economy?
Practical responses include disabling non-essential notifications, using chronological rather than algorithmic feeds where available, designating phone-free periods and spaces, replacing passive scrolling with active content choices (subscriptions, RSS, deliberate searches), and using screen time tracking to make actual usage visible. The goal is not technology abstinence but intentional use: shifting from reactive consumption driven by algorithmic curation to deliberate choices aligned with personal priorities.