In 1971, Herbert Simon - the economist who would win the Nobel Prize in Economics in 1978 - wrote a paragraph that has become one of the most cited in the history of information science: "A wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it."
Simon was writing about organizational management. He was thinking about the problems that arise when executives are flooded with more reports, data, and analysis than they can possibly process. He was not imagining the smartphone. He was not imagining the algorithmic feed, the infinite scroll, or the notification badge engineered to trigger dopamine responses. But he had articulated the structural principle that would come to organize the most powerful companies in the world.
When information becomes abundant, the scarce resource is not information but the attention needed to process it. Whoever controls attention controls outcomes - in commerce, in politics, in culture. This insight is the foundation of the attention economy.
Herbert Simon's Original Concept
Simon's 1971 observation was a structural argument, not a moral one. He was pointing out a systems property: in any system where one resource is abundant and another is constrained, value migrates toward whoever controls the constrained resource. In the pre-digital economy, information was the constrained resource; publishers, broadcasters, and news organizations held power because they controlled distribution of scarce information. In the information-abundant economy, attention becomes the constraint.
The term "attention economy" was explicitly developed in the 1990s and 2000s by writers including Michael Goldhaber, who used it to describe the emerging economics of the internet. Goldhaber's 1997 essay "Attention Shoppers!" in Wired argued that the internet economy would be organized around capturing and holding attention, that "star" creators who successfully captured attention would become disproportionately powerful, and that attention was itself becoming a form of currency that would rival money in its organizing power.
Thomas Davenport and John Beck brought these ideas into business strategy with their 2001 book The Attention Economy: Understanding the New Currency of Business, which provided the first systematic framework for understanding attention as a manageable organizational resource. They argued that attention was becoming as important to manage as financial capital - and far harder to measure.
What neither Simon nor Goldhaber anticipated fully was the scale of the engineering investment that would eventually be directed at capturing attention, or the specific behavioral mechanisms that would be deployed. In 1997, Google did not exist. Facebook was seven years away. TikTok was twenty years away. The full industrial machinery for harvesting human attention was still being built.
"The technologies we use to try to make our lives easier are using us and making our lives emptier." - Douglas Rushkoff, Program or Be Programmed (2010)
How Platforms Monetize Attention
The business model of the dominant digital platforms is advertising-based. Google, Meta, TikTok, Twitter/X, and YouTube earn revenue primarily by selling access to their users' attention and behavioral data. The more time users spend on the platform, and the more behavioral signals they generate (clicks, dwell time, shares, reactions), the more the platform can charge advertisers.
This creates a structural incentive that is simple to state and profound in its consequences: every platform design choice that increases engagement increases revenue. Engagement is the core metric around which products are optimized.
The advertising model was not always dominant. The early commercial internet included subscription models, pay-per-use models, and micropayment proposals. The advertising model won because it lowered the friction of access to zero - content is free to users because users are paying with attention rather than money - and because the scale achievable with zero-friction distribution vastly exceeded what any subscription model could reach.
The consequence is that the largest media companies in human history are economically structured to sell human attention, not to serve human interests. These are not the same thing.
The Revenue Scale
To understand the stakes: Google's advertising revenue was approximately $237 billion in 2023. Meta's advertising revenue was approximately $131 billion. Together, two companies earned roughly $368 billion by selling attention - almost entirely the attention of people who were never directly charged and who mostly did not explicitly agree to be the product.
TikTok, despite its relative youth, generated an estimated $16.1 billion in advertising revenue globally in 2023, with projections showing rapid growth toward parity with older platforms. The speed of TikTok's growth is itself a function of its attention capture efficiency: the platform's algorithm is widely acknowledged by industry observers and former employees as the most effective attention-retention system ever built at consumer scale.
This scale of economic incentive drives an enormous investment in understanding and manipulating human attention. The attention economy is not incidentally extractive; extraction is its primary economic function.
The Advertising Auction System
The specific mechanism by which platforms convert attention to revenue is the real-time advertising auction. When a user loads a page or opens an app, a programmatic auction runs in milliseconds, with advertisers bidding for the opportunity to show that specific user an advertisement at that specific moment. The auction accounts for user demographics, behavioral data, the predicted likelihood that the user will respond to the ad, and the current competitive landscape.
This system means that platforms are not simply selling general audiences - they are selling access to specific behavioral moments. A user who has been researching car purchases, whose behavioral data indicates high intent, at a moment of high engagement on the platform, commands a premium price in the auction. The platform's incentive is to know as much as possible about each user's current state of mind and to keep users on the platform at high-attention moments.
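The auction mechanics can be sketched in a few lines. The advertiser names, bids, and response rates below are hypothetical, and real systems weigh far more signals, but the core logic - bids weighted by a predicted response probability, settled here with second-price logic (one common auction design) - looks roughly like this:

```python
# Minimal sketch of a programmatic ad auction (illustrative only, not
# any platform's actual system). Each advertiser submits a bid per
# impression; the platform weights it by the predicted probability that
# this specific user responds, producing an "effective bid".

def run_auction(bids):
    """bids: list of (advertiser, bid_usd, predicted_response_rate).
    Returns (winner, price) using second-price logic on effective bids."""
    effective = sorted(
        ((bid * p_response, name) for name, bid, p_response in bids),
        reverse=True,
    )
    if not effective:
        return None, 0.0
    winner = effective[0][1]
    # Second-price: the winner pays just enough to beat the runner-up.
    runner_up = effective[1][0] if len(effective) > 1 else 0.0
    return winner, runner_up

# A high-intent user (e.g. someone researching car purchases) raises
# predicted response rates, and with them the clearing price.
bids = [
    ("car_dealer", 4.00, 0.08),   # effective bid 0.32
    ("insurer",    5.00, 0.04),   # effective bid 0.20
    ("retailer",   2.00, 0.05),   # effective bid 0.10
]
winner, price = run_auction(bids)
```

Note that the highest raw bid (the insurer's $5.00) does not win: the predicted-response weighting is what makes behavioral data valuable, because it converts knowledge about the user directly into auction revenue.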
Variable Reward Loops and Behavioral Engineering
The specific mechanisms through which platforms hold attention are drawn from behavioral psychology, particularly the work of B.F. Skinner on operant conditioning.
Skinner's research established that behaviors reinforced by rewards are more persistent when the rewards are variable - delivered unpredictably - than when they are fixed. A rat pressing a lever that sometimes delivers food and sometimes does not presses the lever more often and more persistently than a rat that receives food every time. This is variable ratio reinforcement, and it produces the strongest and most extinction-resistant behavior of any reinforcement schedule. Skinner documented this in his landmark 1938 work The Behavior of Organisms and refined the finding across decades of subsequent research.
Slot machines are the most famous application of this principle in human contexts. The pull-lever-wait-for-result cycle is precisely a variable ratio reinforcement schedule, which is why slot machines are among the most addictive devices humans have ever created. The gaming industry has known this for decades, and regulators have developed frameworks for managing the associated harms.
Social media platforms implement the same structure with no equivalent regulatory framework:
- The pull-to-refresh gesture is structurally identical to pulling a slot machine lever
- The like/heart/reaction is a variable reward - sometimes you post something and receive many reactions, sometimes few, and the unpredictability drives repeated checking
- The notification is a discrete reward delivery that interrupts the current behavioral sequence and redirects attention to the platform
- The algorithmic feed varies in content quality unpredictably, ensuring that users scroll further to find the next engaging item rather than stopping at a predictable point
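The extinction resistance Skinner documented can be illustrated with a toy simulation. The giving-up rule and all numbers below are illustrative assumptions, not a model of any real platform: an agent keeps "pressing" (refreshing) until a dry spell clearly exceeds the worst it saw while rewards were flowing, which makes unpredictable schedules much harder to walk away from.

```python
import random

random.seed(0)  # reproducible illustration

def longest_gap(rewards):
    """Longest run of unrewarded presses observed during training."""
    gap = longest = 0
    for rewarded in rewards:
        gap = 0 if rewarded else gap + 1
        longest = max(longest, gap)
    return longest

def extinction_presses(training_rewards):
    """Toy giving-up rule: once rewards stop entirely, the agent keeps
    pressing until the dry spell clearly exceeds the worst it saw in
    training (here, twice the longest observed gap, plus one)."""
    return 2 * longest_gap(training_rewards) + 1

n = 200
fixed = [(i % 5 == 4) for i in range(n)]                 # every 5th press pays
variable = [(random.random() < 0.2) for _ in range(n)]   # 1-in-5 on average

fixed_persistence = extinction_presses(fixed)     # small: gaps are always 4
variable_persistence = extinction_presses(variable)  # larger: gaps vary widely
```

Both schedules pay out at the same average rate, but the variable schedule produces far more persistence after rewards stop - the structural reason a feed that is only occasionally rewarding is checked more compulsively than one that is reliably rewarding.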
Nir Eyal's book Hooked (2014) documented this design pattern as a feature for product designers to emulate, laying out a "Hook Model" of trigger, action, variable reward, and investment. Eyal has subsequently written about the ethical responsibilities of designers who use these techniques, publishing Indistractable (2019) as a partial corrective.
Tristan Harris, a former design ethicist at Google, became a prominent critic of these practices, arguing in his 2016 essay "How Technology is Hijacking Your Mind" that the design choices made by major platforms are specifically optimized to override voluntary choice mechanisms - that users do not freely choose to spend hours on social media but are manipulated into doing so through behavioral engineering they are not aware of. Harris later co-founded the Center for Humane Technology and testified before the US Senate about these practices in 2019.
The Infinite Scroll
Aza Raskin, a product designer who worked at Mozilla and Jawbone, invented the infinite scroll - the mechanism by which content feeds load new content automatically as the user reaches the bottom, eliminating natural stopping points. Raskin has subsequently expressed regret about the invention, estimating in interviews that infinite scroll causes approximately 200,000 additional hours of social media use per day globally.
The infinite scroll is a direct application of attention engineering: natural stopping points (page bottoms, "load more" buttons) create decision moments at which users might choose to stop. Eliminating those decision moments eliminates the opportunity to stop, keeping users in the engagement loop indefinitely.
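A back-of-envelope model makes the effect of removing stopping points concrete. The probabilities and item counts below are illustrative assumptions, not measurements:

```python
# Toy model of stopping points. At each decision moment a user stops
# with probability p_stop; the number of decision moments before
# stopping is geometric with mean 1 / p_stop. A paginated feed creates
# a decision moment every page; infinite scroll only creates one when
# attention lapses on its own, which is much rarer.

def expected_items(items_per_decision, p_stop):
    """Expected items consumed when a stop decision occurs once every
    `items_per_decision` items."""
    expected_decisions = 1 / p_stop
    return items_per_decision * expected_decisions

paginated = expected_items(items_per_decision=10, p_stop=0.3)  # ~33 items
infinite = expected_items(items_per_decision=60, p_stop=0.3)   # 200 items
```

Even with an identical per-decision willingness to stop, stretching the distance between decision moments multiplies total consumption - which is the whole design point of removing the page bottom.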
Content Amplification and the Outrage Premium
Engagement is not neutral. Not all content holds attention equally, and the algorithmic systems that determine which content is amplified are optimized for engagement metrics, not for the wellbeing of users or the accuracy of information.
Research on content virality consistently finds that emotionally arousing content spreads faster and wider than emotionally neutral content - and that negative emotions, particularly outrage, produce the strongest and most durable engagement responses.
A landmark 2018 study by Soroush Vosoughi, Deb Roy, and Sinan Aral, published in Science, analyzed 126,000 news stories on Twitter over eleven years and found that false information spread six times faster than accurate information. The mechanism was not bots; it was human sharing. False information was more novel (it tended to describe surprising, unlikely events) and more emotionally arousing than accurate information. The algorithmic feed, optimized for engagement, amplified the emotionally arousing content - which happened to be disproportionately false.
The political consequences are significant. Content that generates outrage about out-groups - political opponents, cultural adversaries, foreign actors - generates high engagement and is therefore amplified by engagement-optimizing algorithms regardless of its accuracy or its effects on social cohesion.
Facebook's internal research, partially disclosed in the 2021 documents released by whistleblower Frances Haugen, found that the company's algorithmic changes in 2018 designed to increase "meaningful social interactions" had inadvertently increased the spread of misinformation, hate speech, and divisive content - because these categories generated high engagement responses. The company had known this and proceeded anyway, a decision that became central to the congressional hearings Haugen's testimony triggered.
A related study by William Brady and colleagues at NYU, published in 2017 in Proceedings of the National Academy of Sciences, found that each moral-emotional word added to a tweet increased its retweet rate by approximately 20 percent. The study identified a direct, measurable link between moralized emotional framing and algorithmic amplification - meaning that platforms optimized for engagement are structurally incentivized to amplify moral outrage specifically, not merely emotional content generally.
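The compounding implied by that roughly 20 percent per-word figure is easy to make concrete. Treating the per-word effect as multiplicative is a simplification of the study's statistical model, but it shows why even a modest per-word boost adds up:

```python
# Back-of-envelope from the ~20%-per-word finding: if each
# moral-emotional word multiplies the expected retweet rate by 1.2,
# the effect compounds across words (a simplification, for intuition).

def relative_retweet_rate(moral_emotional_words, boost=0.20):
    """Retweet rate relative to a tweet with no moral-emotional words."""
    return (1 + boost) ** moral_emotional_words

rates = {n: relative_retweet_rate(n) for n in range(5)}
# 0 words -> 1.00x, 2 words -> 1.44x, 4 words -> ~2.07x
```

Under this sketch, a tweet with four moral-emotional words would be expected to spread at roughly double the rate of a neutral one - a strong selection pressure on how people learn to phrase political content.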
The Costs: Mental Health, Cognition, and Democracy
Adolescent Mental Health
The most extensively documented cost of the attention economy is its effect on adolescent mental health, particularly for girls. Jean Twenge, a psychologist at San Diego State University, and Jonathan Haidt at New York University have documented a significant increase in adolescent depression, anxiety, loneliness, and self-harm beginning around 2012 - coinciding with the widespread adoption of smartphone social media.
Haidt's 2024 book The Anxious Generation synthesized the evidence across multiple data streams:
- US, UK, Canadian, and Australian mental health data showing sharp deterioration beginning around 2012-2013
- Emergency department data showing self-harm hospitalizations among girls aged 10-14 more than doubled between 2009 and 2019
- Sleep data showing reduced sleep hours and quality in the same period
- Social comparison research showing Instagram's particular harms for appearance-focused social comparison in adolescent girls
- Smartphone adoption data showing an almost precisely coinciding inflection point
The causal relationship remains contested among researchers. Some argue that social media is a symptom of broader social changes rather than a cause. Candice Odgers at UC Irvine, among the most prominent critics of the causal narrative, has argued in Nature that the methodological quality of most studies in this space is too low to support confident causal conclusions. The debate is ongoing, but the correlation is striking, the proposed mechanisms are biologically plausible, and the convergence of evidence across multiple countries and data types is suggestive.
Attention Fragmentation
Research on attention and media multitasking suggests that heavy smartphone and social media use may reduce sustained attention capacity - the ability to focus on a single cognitive task for extended periods.
Gloria Mark at the University of California Irvine has studied workplace attention across two decades of field research. Her book Attention Span (2023) synthesized findings showing that the average length of time people focused on a single screen fell from approximately 2.5 minutes in 2004 to 47 seconds by 2020. This is not simply a technology effect - Mark documents multiple causes - but digital interruptions are a significant contributor. Her research found that fully recovering focus after a digital interruption can take up to 23 minutes, and that people who are frequently interrupted begin to interrupt themselves - pre-empting external disruption by checking their devices before notifications arrive.
A complementary finding by Betsy Sparrow of Columbia University, with Jenny Liu and Daniel Wegner, published in Science in 2011, found that when people expect to have future access to information online, they form weaker memories of the information itself but stronger memories of where to find it. This "Google Effect" suggests that constant access to search may be reshaping how human memory operates - with external information stores substituting for internal memory in ways that reduce the depth of cognitive processing.
| Attention Cost | Mechanism | Evidence |
|---|---|---|
| Reduced sustained focus | Habitual context-switching from notifications | Mark (2023) - average focus fell from 2.5 min to 47 sec |
| Impaired episodic memory | Reduced presence/attention during experiences | Henkel (2014) - photo-taking effect |
| Increased anxiety | Comparison, FOMO, uncertain reward cycles | Multiple correlational studies |
| Sleep disruption | Blue light, psychological arousal, late-night use | Several experimental studies |
| Reduced reading depth | Skimming habits transferred from web reading | Mangen, Walgermo & Bronnick (2013) |
| Outsourced memory | Over-reliance on search reduces encoding | Sparrow, Liu & Wegner (2011) |
Political Fragmentation
The attention economy's effects on political discourse operate through several mechanisms. Outrage-optimized amplification surfaces divisive content. Algorithmic personalization creates filter bubbles - a term coined by activist Eli Pariser in his 2011 book of the same name - in which users primarily encounter content aligned with their existing views. The emotional tenor of political content shifts toward anger and contempt.
Whether social media causes political polarization or merely reflects it is a question researchers continue to debate. A 2017 study by Levi Boxell, Matthew Gentzkow, and Jesse Shapiro, published in Proceedings of the National Academy of Sciences, found that polarization in the United States has increased most among demographic groups with the lowest social media use - older Americans - which complicates simple causal narratives. What is clearer is that the business model of attention platforms is structurally aligned with the production of engagement - and that political outrage is among the highest-engagement content categories - creating ongoing pressure toward more divisive discourse regardless of any platform's stated values.
A 2018 study by Chris Bail and colleagues at Duke University, published in Proceedings of the National Academy of Sciences, used a field experiment to test whether social media filter bubbles increase polarization. Participants were randomly assigned to follow a bot that exposed them to opposing political content for one month. Counterintuitively, Republicans who were exposed to liberal content became significantly more conservative. The study suggested that cross-cutting exposure, without the right contextual framing, can backfire - that the attention economy's confrontational content framing makes encountering opposing views feel like an attack rather than an opportunity for understanding.
What Surveillance Capitalism Is
The attention economy is related to but distinct from what Shoshana Zuboff, Harvard Business School professor emerita, calls surveillance capitalism in her foundational 2019 book The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power.
Zuboff's argument is that the most advanced platforms do not merely sell access to attention but have developed the capacity to shape behavior through what she calls "instrumentarian power" - the ability to know what you will do before you do it and to adjust the informational environment to steer you toward outcomes that benefit the platform's clients.
The key distinction from conventional advertising is behavioral prediction and modification at scale. Traditional advertising influences behavior by providing information or emotional association. Surveillance capitalism, Zuboff argues, influences behavior by modeling individual psychology with sufficient precision to predict and preempt choices - and then intervening in the digital and physical environment in ways that steer those choices.
This goes beyond advertising into territory that has few legal or ethical frameworks to govern it. The behavioral data extracted from attention platforms is used not just to show relevant ads but to build behavioral prediction models that are sold to third parties - insurance companies, financial institutions, employers, political campaigns - who use them for decisions with significant consequences for the people being modeled.
"Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to product or service improvement, the rest are declared as a proprietary behavioral surplus, fed into advanced manufacturing processes known as 'machine intelligence,' and fabricated into prediction products that anticipate what you will do now, soon, and later." - Shoshana Zuboff, The Age of Surveillance Capitalism (2019)
The Regulation Landscape
The attention economy has prompted regulatory responses in several jurisdictions, though comprehensive frameworks remain elusive.
The European Union's Digital Services Act (DSA), which came into full effect in 2024, requires very large online platforms to assess and mitigate systemic risks from their algorithms, including risks to mental health, civic discourse, and electoral integrity. It mandates external audits of algorithmic systems and requires platforms to offer users at least one recommender option not based on profiling - in practice, a chronological feed. The DSA represents the most ambitious regulatory attempt to date to address the structural incentive problems of the attention economy.
The UK's Online Safety Act (2023) requires platforms to conduct child safety risk assessments and implement age-appropriate protections, with specific requirements around features designed to maximize engagement among younger users. The Act has been controversial for its scope and potential effects on encryption and privacy.
In the United States, legislative progress has been slower. The Kids Online Safety Act (KOSA) has been debated in Congress since 2022 without passing in its original form, reflecting tensions between child safety objectives, free expression concerns, and tech industry lobbying. Several states have passed their own regulations, creating a patchwork that the industry has challenged in court.
The fundamental regulatory challenge is that engagement optimization is not a specific feature that can be banned - it is the entire business logic of advertising-supported platforms. Requiring platforms not to maximize engagement would be equivalent to requiring them to abandon their revenue model.
What Individuals Can Do
The attention economy operates at a structural level that individuals cannot fully escape, but there are practical steps that shift the balance from reactive capture to intentional use.
Reduce the signal surface:
- Turn off all non-essential notifications - alerts that do not require immediate action
- Move social media apps off the home screen (friction reduces impulsive opening)
- Use app timers or scheduled access rather than unlimited availability
Replace algorithmic curation with deliberate curation:
- Subscribe to newsletters, RSS feeds, and podcasts rather than relying on social feeds
- Follow specific people whose content you actively want rather than browsing algorithmic feeds
- Use chronological feeds where available rather than engagement-ranked feeds
Manage attention actively:
- Designate phone-free times (meals, first hour of day, final hour before sleep)
- Keep devices out of the bedroom
- Practice single-tasking: finish one thing before switching
Make usage visible:
- Use screen time tracking to see actual usage versus perceived usage (most people significantly underestimate their phone use)
- Audit which apps consume the most time and whether that time reflects your stated priorities
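As a sketch of this audit step, the snippet below takes per-app minutes (the numbers here are hypothetical - real figures come from your OS screen-time tools) and reports each app's share of total usage, flagging the heaviest consumers:

```python
# Minimal sketch of the "make usage visible" step: given per-app
# screen-time minutes, compute each app's share of the total and flag
# anything above a chosen threshold. Input numbers are hypothetical.

def usage_report(minutes_by_app, flag_above_pct=25.0):
    """Returns (app, minutes, percent_of_total, flagged) tuples,
    sorted by minutes descending."""
    total = sum(minutes_by_app.values())
    report = []
    for app, minutes in sorted(minutes_by_app.items(),
                               key=lambda kv: kv[1], reverse=True):
        pct = 100.0 * minutes / total
        report.append((app, minutes, round(pct, 1), pct >= flag_above_pct))
    return report

week = {"social": 540, "messaging": 180, "news": 120, "maps": 60}
report = usage_report(week)
# social: 540 min, 60.0% of total -> flagged
```

Seeing that one category absorbs more than half of total screen time, in plain numbers, is usually more persuasive than any felt sense of usage - which is the point of making the audit explicit.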
Reclaim cognitive deep work:
- Schedule blocks of protected, single-task time for cognitively demanding work
- Treat distraction-free focus as a productive practice that requires deliberate cultivation
- Cal Newport's concept of "deep work", developed in his 2016 book of the same name, provides a practical framework: the ability to focus without distraction on cognitively demanding tasks is becoming simultaneously rarer and more valuable as the attention economy makes it harder to sustain
The goal is not technology abstinence, which is neither realistic nor necessary. The goal is to shift from being a passive subject of algorithmic curation - having your attention captured and directed by systems optimized for engagement metrics - to being a deliberate consumer who makes active choices about what to pay attention to and why.
This shift is harder than it sounds, because the systems arrayed on the other side of it are backed by hundreds of billions of dollars and decades of behavioral research. But the fundamental reorientation - from reactive to deliberate, from algorithmically served to actively chosen - is both possible and consequential.
The attention economy is, ultimately, a market for a resource that was never previously commodified at scale: the finite moments of human cognitive engagement. Understanding that your attention is the product - not you, not your data in the abstract, but the specific, irreplaceable hours of your conscious life that you spend looking at a screen - is the first step toward engaging with these systems on something closer to equal terms.
Frequently Asked Questions
What is the attention economy?
The attention economy is the conceptual framework describing the competition for human attention as the primary form of economic competition in media and digital environments. Coined from Herbert Simon's 1971 observation that 'a wealth of information creates a poverty of attention,' the concept recognizes that when information is abundant, attention becomes scarce — and whoever controls attention controls economic and political power.
How do platforms monetize attention?
Digital platforms earn revenue primarily through advertising, and advertising revenue is proportional to the time users spend on the platform and the behavioral data generated during that time. Every minute of attention is packaged and auctioned to advertisers. This creates a structural incentive to maximize engagement through any means available — which includes content that triggers emotional arousal, outrage, or fear, because such content generates more engagement than content that is calm or neutral.
What is a variable reward loop?
A variable reward loop is a behavioral mechanism drawn from B.F. Skinner's operant conditioning research, in which rewards are delivered unpredictably rather than on a fixed schedule. Variable reinforcement produces stronger and more persistent behavior than predictable reinforcement — a well-documented finding that slot machine designers and social media engineers have both exploited. The unpredictable delivery of likes, notifications, and interesting content creates a pull-to-refresh compulsion that is difficult to resist.
What are the costs of the attention economy?
Documented costs include impaired sustained attention capacity, increased anxiety and depression especially among adolescents, degraded episodic memory for daily experience, and political fragmentation driven by outrage-optimized content amplification. Research by Jean Twenge and Jonathan Haidt links the rise of smartphone social media use, beginning around 2012, with measurable increases in adolescent mental health problems across multiple Western countries.
What can individuals do about the attention economy?
Practical responses include disabling non-essential notifications, using chronological rather than algorithmic feeds where available, designating phone-free periods and spaces, replacing passive scrolling with active content choices (subscriptions, RSS, deliberate searches), and using screen time tracking to make actual usage visible. The goal is not technology abstinence but intentional use: shifting from reactive consumption driven by algorithmic curation to deliberate choices aligned with personal priorities.
