When a pollster calls you the week before an election and asks whether you plan to vote, something subtle but significant happens. The act of answering that question makes you more likely to actually vote. No persuasion occurred. No information changed. The question alone shifted the probability of your behavior.
This is the mere measurement effect — one of the most counterintuitive and consequential findings in behavioral science. It means that measuring intentions changes the very intentions being measured, and that the tools researchers use to observe human behavior can simultaneously cause the behavior they intend to observe.
The Core Phenomenon
The mere measurement effect (sometimes called the question-behavior effect) refers to the reliable finding that asking people about their intentions or attitudes increases the likelihood that they will subsequently engage in the behavior being asked about. The effect requires no additional prompting, incentive, or persuasive message. The question itself is sufficient.
This has profound implications across domains: consumer research, public health, voting behavior, environmental action, and any field that uses surveys or self-report measures to understand human behavior.
The effect was not immediately obvious to researchers, because it runs against a fundamental assumption of measurement: that the act of measuring a phenomenon does not alter the phenomenon being measured. In the physical sciences, measurement disturbance is a known problem but generally manageable. In behavioral science, the mere measurement effect reveals that measurement and intervention are not separable — asking about intentions is an intervention.
The Distinction That Matters
It is important to distinguish the mere measurement effect from related phenomena:
- Demand characteristics: Participants behaving differently because they know they are being observed or want to please the researcher. The mere measurement effect can occur even when participants do not know their behavior is being tracked.
- Self-prophecy: A closely related term used by Spangenberg and colleagues, emphasizing the self-fulfilling aspect of stated intentions.
- The Hawthorne effect: General performance improvement due to being observed. The mere measurement effect is specific to intentions and future behaviors, not current performance.
The mere measurement effect is specifically about the act of articulating an intention changing the probability of that intention being carried out.
The Research That Established the Effect
Morwitz and Fitzsimons: Consumer Purchase Intentions
The foundational work in consumer behavior came from Vicki Morwitz and Gavan Fitzsimons, whose studies in the 1990s systematically documented how purchase intention surveys affected actual buying behavior.
In a landmark study, Morwitz and colleagues randomly assigned consumers to either answer a survey about their intentions to purchase a computer or to a control condition with no survey. They then tracked actual computer purchases over the following six months.
The results were striking: consumers who had been asked about their purchase intentions were significantly more likely to actually buy a computer than those who had not been asked. The effect was not trivial — the measured group showed purchase rates approximately 35% higher than controls in some conditions.
Critically, this effect was not uniform. It was stronger for brands consumers had already considered and weaker or absent for brands they had not previously thought about. This finding pointed toward the mechanism: the question activated existing cognitive representations rather than creating entirely new ones.
The original Morwitz, Johnson, and Schmittlein (1993) paper in the Journal of Consumer Research remains one of the most cited empirical demonstrations of the effect, and its pattern of findings — moderated by prior consideration — has been replicated across multiple consumer categories including automobiles, household appliances, and financial products.
Greenwald et al.: Voting Behavior
Among the most cited applications of the mere measurement effect is research on political participation. Anthony Greenwald and colleagues demonstrated that asking citizens a single question — "Do you expect to vote in the upcoming election?" — increased actual voter turnout in subsequent elections.
The finding has been probed in both controlled experiments and large-scale field studies. In the original study, asking people to predict their own voting behavior produced increases in turnout in the range of 20-25 percentage points, though subsequent large-scale field replications have generally found considerably smaller effects.
This matters enormously for election forecasting. A poll designed to measure voting intentions may simultaneously create some of the turnout it claims to be measuring.
Nickerson and Rogers (2010) published an important large-scale field experiment in Psychological Science testing the question-behavior effect on voter turnout. They found that supplementing a standard mobilization call with plan-making questions (what time the person would vote, where they would be coming from, what they would be doing beforehand) produced significantly larger turnout effects than the standard intention question alone, especially among voters in single-eligible-voter households. A related finding comes from Bryan, Walton, Rogers, and Dweck (2011): framing the question around identity ("being a voter") rather than behavior ("voting") activated a different and more powerful mechanism, self-concept consistency rather than mere behavioral intention.
| Study | Domain | Effect Size | Mechanism Proposed |
|---|---|---|---|
| Fitzsimons & Morwitz (1996) | Consumer purchasing | ~35% purchase rate increase | Accessibility and commitment |
| Greenwald et al. (1987) | Voting behavior | ~25 percentage point turnout increase | Self-prediction and consistency |
| Spangenberg (1997) | Environmental behavior | Significant positive effect | Self-prophecy and identity |
| Godin et al. (2008) | Health behaviors | Moderate positive effect | Implementation intentions |
| Rodrigues et al. (2015) | Physical activity | Positive effect sustained 1 week | Cognitive accessibility |
| Nickerson & Rogers (2010) | Voting behavior | Plan-making questions > intention question | Implementation intentions |
| Levav & Fitzsimons (2006) | Consumer choice | Stronger for hedonic goods | Accessibility and pleasure salience |
Spangenberg and Self-Prophecy
Eric Spangenberg and colleagues extended the work into prosocial and health behaviors, framing the phenomenon as self-prophecy: the act of predicting one's own behavior makes the prediction more likely to come true.
In their studies, asking consumers whether they planned to engage in environmentally responsible behaviors (recycling, energy conservation) significantly increased those behaviors over the following weeks compared to control groups. The effect held across repeated replications and different behavioral domains.
Spangenberg's interpretation emphasized the role of cognitive consistency motivation: once you state an intention, you experience psychological discomfort if your behavior contradicts your stated plan, and you are motivated to resolve that discomfort by behaving consistently.
Spangenberg and Sprott (2006) published an extension demonstrating that the self-prophecy effect was particularly potent when the behavior was identity-relevant — that is, when the behavior was connected to how the person saw themselves. People for whom recycling was an important part of their self-concept showed stronger question-behavior effects than those for whom it was not. This finding integrated the mere measurement effect with self-perception theory (Bem, 1972), which proposes that people infer their own attitudes from their behavior, and with identity-based motivation research showing that identity-behavior consistency is a powerful driver of action.
Why Does Being Asked Change Behavior?
Several mechanisms have been proposed and tested, and the current consensus suggests they operate together:
1. Cognitive Accessibility
When you answer a question about a behavior, you mentally activate memories, attitudes, and associations related to that behavior. This makes the behavior more cognitively accessible — it comes to mind more easily when relevant situations arise later.
If you are asked whether you plan to donate blood, you activate thoughts about donation, perhaps remembering past donations, thinking about where donation centers are located, and rehearsing relevant attitudes. Later, when you see a blood drive notice, those pre-activated thoughts make the behavior more salient and more easily chosen.
The cognitive accessibility mechanism was examined directly by Feldman and Lynch (1988), whose work on self-generated validity demonstrated that answering survey questions influences subsequent behavior by changing the accessibility of attitude-relevant information. Their accessibility-diagnosticity framework proposed that attitude-based behavior is more likely when attitudes are both accessible (easily retrieved) and perceived as diagnostic of relevant preferences — exactly the conditions created by answering a direct intention question.
2. Commitment and Consistency
Stating an intention creates a small but real psychological commitment. Robert Cialdini's work on the consistency principle documents how powerfully people are motivated to behave in ways that align with their prior statements.
Once you say "yes, I plan to vote," you have made yourself accountable to that statement — even if only to yourself. When election day arrives, you experience the pull of consistency, a motivation to confirm the identity and intention you articulated.
Cialdini's (1984) Influence: The Psychology of Persuasion identified consistency as one of the six core principles of social influence — and unlike several of the others, it does not require another person to apply the pressure. The mere act of stating an intention to oneself is sufficient to activate consistency motivation. This internal consistency drive is why the mere measurement effect can operate even in anonymous surveys where there is no social accountability to the question asker.
3. Implementation Facilitation
Some researchers suggest that answering an intention question prompts implicit planning. When you consider whether you will perform a behavior, you may begin constructing a rough mental roadmap of when, where, and how you will do it. This is related to the well-documented power of implementation intentions (if-then plans) in behavior change research.
Peter Gollwitzer's work on implementation intentions (1999) established that the gap between intention and action is substantially closed when intentions are linked to specific situational cues. An intention question may prompt a minimal version of this planning process — not a full implementation intention but a partial activation of the "how and when" thinking that makes subsequent action easier.
The facilitation mechanism may also operate through mental simulation: when you answer a question about a future behavior, you implicitly simulate performing it, and this simulation activates some of the same neural pathways as actual performance, reducing the novelty and cognitive cost of the behavior when the opportunity arrives.
4. Attitude Crystallization
For behaviors where attitudes are ambiguous or weakly held, being asked may force attitude formation. Weak, unformed attitudes do not guide behavior effectively. The act of stating a position — even a vague one — can crystallize an attitude and give it more behavioral influence.
Tourangeau and Rasinski (1988) proposed an influential model of the question-answering process in which respondents retrieve relevant information from memory, integrate it to form a judgment, and translate that judgment into a response format. Each step involves construction as well as retrieval — attitudes are partly formed in the process of answering, not merely reported. For ambivalent topics, this construction process may produce a more evaluatively consistent attitude that subsequently guides behavior more reliably.
"We know that the mere act of measuring intentions can, itself, produce behavior change. This means that any survey measuring behavioral intentions is not a passive observation device. It is an intervention." — Vicki Morwitz, summarizing the field's key insight
Boundary Conditions: When the Effect Is Stronger or Weaker
Research has identified several factors that moderate the mere measurement effect:
The effect is stronger when:
- The behavior is already in the person's repertoire or consideration set
- The question is specific rather than general (asking about a concrete behavior vs. a vague intention)
- The survey is administered close in time to the opportunity to perform the behavior
- The behavior is socially desirable or identity-relevant
- The person has moderate rather than very strong existing intentions (the effect has less room to operate for those already highly committed)
- The question uses identity framing ("Are you a person who...") rather than purely behavioral framing
The effect is weaker or absent when:
- The behavior is entirely novel and not previously considered
- Significant barriers exist that mere intention-activation cannot overcome
- The question is too abstract to activate concrete behavioral scripts
- The person lacks the ability to perform the behavior regardless of intention
- The question is asked very long before the behavioral opportunity
- The behavior is counter-normative or carries social stigma
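The moderators above can be collapsed into a rough decision heuristic. The sketch below is purely illustrative: the scoring and thresholds are invented for exposition, not a validated predictive model.

```python
def expected_effect_strength(
    in_consideration_set: bool,
    question_specific: bool,
    temporally_proximate: bool,
    identity_relevant: bool,
    moderate_prior_intention: bool,
    major_barriers: bool,
    stigmatized: bool,
) -> str:
    """Illustrative heuristic for when a question-behavior effect is
    likely, based on the moderators listed above. NOT a validated model."""
    # Conditions under which the literature finds the effect weak or absent:
    if not in_consideration_set or major_barriers or stigmatized:
        return "weak or absent"
    # Count the strengthening conditions that are present.
    score = sum([question_specific, temporally_proximate,
                 identity_relevant, moderate_prior_intention])
    return "strong" if score >= 3 else "moderate"

# Example: a specific, timely, identity-framed voting question asked of
# someone who is ambivalent but able to vote.
print(expected_effect_strength(
    in_consideration_set=True, question_specific=True,
    temporally_proximate=True, identity_relevant=True,
    moderate_prior_intention=True, major_barriers=False,
    stigmatized=False,
))  # -> strong
```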
The Accessibility-Diagnosticity Model
Fitzsimons and Morwitz drew on Feldman and Lynch's (1988) accessibility-diagnosticity framework to explain the pattern of findings: the effect occurs when answering the question makes the behavior cognitively accessible AND when the person perceives that accessible thought as diagnostic of their true preference or likely behavior.
If a person answers a question about buying a luxury car brand they have never considered, the question may make the brand momentarily accessible without triggering a commitment response, because the person does not interpret their momentary thinking as evidence of genuine intention.
A 2013 meta-analysis by Rodrigues, Lopes, Barbosa, and Tentugal examined moderators of the question-behavior effect across 49 studies and found consistent evidence that temporal proximity (how close the question was to the behavioral opportunity) and behavioral specificity (how concrete the question was) were among the strongest moderators, with identity relevance also showing significant moderation. The overall effect size across included studies was d = 0.27 — modest in absolute terms but highly meaningful given the minimal nature of the intervention (a single question).
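To translate a d of 0.27 into concrete terms, the standard meta-analytic conversion from Cohen's d to a log odds ratio can be applied against a hypothetical baseline. The 30% behavior rate below is arbitrary, chosen only for illustration.

```python
import math

d = 0.27  # meta-analytic effect size for the question-behavior effect

# Standard conversion from Cohen's d to a log odds ratio:
# log(OR) = d * pi / sqrt(3)
log_or = d * math.pi / math.sqrt(3)
odds_ratio = math.exp(log_or)

# What that implies against a hypothetical 30% baseline behavior rate:
baseline = 0.30
new_odds = (baseline / (1 - baseline)) * odds_ratio
asked_rate = new_odds / (1 + new_odds)

print(f"odds ratio: {odds_ratio:.2f}")                 # ~1.63
print(f"behavior rate, not asked: {baseline:.0%}")
print(f"behavior rate, asked:     {asked_rate:.1%}")   # ~41%
```

A "modest" d of 0.27 therefore corresponds, under these assumptions, to roughly an 11-point lift in behavior rates from a single question.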
Clinical and Public Health Applications
The mere measurement effect has significant implications for clinical settings and public health interventions.
Health Behavior Change
Multiple studies have found that simply asking patients about health behaviors increases those behaviors. Asking about physical activity, medication adherence, preventive screenings, and dietary change has been shown to produce modest but real improvements in each of those behaviors.
This has led to interest in question-based brief interventions — low-cost, scalable approaches to health behavior change that require nothing more than asking the right question at the right time.
In one study by Godin and colleagues examining multiple health behaviors, asking participants about their intentions to exercise, take vitamins, or attend medical appointments produced follow-up behavior rates meaningfully higher than control conditions, even when no other intervention components were present.
Smoking Cessation and Addiction
Research in addiction treatment has explored whether intention questions can serve as low-cost motivational components. Early evidence suggests that asking smokers about their quitting intentions, particularly in contexts where the question implies a socially positive answer, can modestly increase cessation attempts.
The effect is not large enough to be a standalone intervention but may be valuable as a component of broader programs. Motivational Interviewing (MI), developed by William Miller and Stephen Rollnick in the 1980s and now one of the most evidence-based approaches for substance use disorders, can be understood partly through the lens of the mere measurement effect. MI's core technique of eliciting "change talk" — asking clients to articulate reasons for change, commitment to change, and steps toward change — systematically activates the cognitive accessibility, commitment, and self-concept mechanisms that underlie the mere measurement effect, but in a structured therapeutic context.
Vaccination and Preventive Care
Asking patients "Do you plan to get vaccinated?" before a flu season or "Do you intend to schedule your mammogram this year?" has shown promising effects on follow-through rates in clinical trials. These findings have led to recommendations in some clinical guidelines for providers to ask explicit intention questions as a routine part of preventive care conversations.
In a related field experiment, Sheeran and Orbell (2000) showed that prompting women to form implementation intentions substantially increased attendance for cervical cancer screening. More broadly, the question-behavior literature suggests that intention questions do the most work for behaviors with moderate initial intentions, so clinical screening questions may be most usefully targeted at patients who are not already committed but are not entirely opposed — the ambivalent middle range where the crystallization mechanism has the most room to operate.
Implementation Intention Enhancement
The mere measurement effect is amplified considerably when combined with implementation intention prompts: not just asking whether someone intends to do a behavior, but specifically when, where, and how they will do it. Research by Peter Gollwitzer and Gabriele Oettingen shows that if-then planning — "If it is Tuesday morning, I will take my medication with breakfast" — produces substantially larger effects than simple intention questions alone. Public health campaigns and clinical check-in systems have incorporated this finding into structured follow-through protocols.
Oettingen's WOOP method (Wish, Outcome, Obstacle, Plan) extends this further by having people identify both the desired outcome and anticipated obstacles, then form if-then plans specifically linking the anticipated obstacles to coping responses. This method has shown strong evidence across multiple behavioral domains in randomized trials, including physical activity, dietary behavior, and academic performance (Oettingen, 2014).
Implications for Research Design
Perhaps the most important practical implication of the mere measurement effect is methodological: surveys are not neutral.
The Problem With Randomized Controlled Trials That Use Surveys
When a randomized controlled trial measures intentions as an outcome and later tracks behavior to assess whether the intervention worked, the measurement itself may produce behavior change in both the treatment and control groups. If both groups are asked about their intentions after receiving (or not receiving) the intervention, the effect of that asking may swamp or obscure the effect of the intervention itself.
This is a serious validity concern that is not always adequately addressed in published research.
A 2015 meta-analysis by Sheeran et al. examining the size of the question-behavior effect across health behavior studies estimated that the contamination from intention measurement was sufficient to meaningfully bias effect size estimates in studies using standard pre-post designs without no-measurement control conditions. The authors recommended that researchers routinely include no-measurement control arms in intervention studies, particularly when behavioral intentions are a primary outcome measure.
Demand for Better Controls
Methodologically rigorous research in this area should consider:
- No-measurement control groups: At least one condition where intentions are never measured, to establish baseline behavior rates without survey reactivity
- Delayed intention measurement: Measuring intentions only after behavior has occurred, to avoid contaminating the behavioral record
- Between-subjects measurement: When measuring intentions is necessary, doing so only once per participant rather than at multiple time points
- Behavioral rather than intention outcomes: Wherever possible, measuring actual behavior rather than relying on intentions as a behavioral proxy
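A small simulation makes the case for no-measurement control arms concrete. Every effect size below is invented for illustration; the point is only that a surveyed control arm no longer estimates true baseline behavior, and that survey reactivity itself is only estimable because an unsurveyed arm exists.

```python
import random

random.seed(42)

BASE_RATE = 0.30         # hypothetical baseline probability of the behavior
TREATMENT_LIFT = 0.10    # assumed true effect of the intervention
MEASUREMENT_LIFT = 0.08  # assumed reactivity from the intention survey
N = 20_000               # participants per arm

def simulate_arm(n, treated, surveyed):
    """Observed behavior rate for one arm under a simple additive model."""
    p = BASE_RATE \
        + (TREATMENT_LIFT if treated else 0) \
        + (MEASUREMENT_LIFT if surveyed else 0)
    return sum(random.random() < p for _ in range(n)) / n

# Standard design: both arms answer the intention survey.
treat_surveyed = simulate_arm(N, treated=True, surveyed=True)
ctrl_surveyed = simulate_arm(N, treated=False, surveyed=True)

# Added no-measurement control arm: never surveyed.
ctrl_unsurveyed = simulate_arm(N, treated=False, surveyed=False)

# Under this additive model the treatment-minus-control difference is
# still recoverable, but the surveyed control's absolute rate is inflated
# above true baseline by the reactivity of the survey itself.
print(f"surveyed control rate:        {ctrl_surveyed:.3f}")    # near 0.38, not 0.30
print(f"unsurveyed control rate:      {ctrl_unsurveyed:.3f}")  # near 0.30
print(f"estimated treatment effect:   {treat_surveyed - ctrl_surveyed:.3f}")
print(f"estimated measurement effect: {ctrl_surveyed - ctrl_unsurveyed:.3f}")
```

If reactivity interacts with treatment (for example, the intention question matters less for people already mobilized by the intervention), the treatment estimate itself becomes biased, which is why the no-measurement arm is recommended rather than optional.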
Implications for Market Research
Commercial market research relies heavily on purchase intention surveys to forecast demand, allocate resources, and make product launch decisions. If those surveys inflate purchase rates by triggering the mere measurement effect, demand forecasts based on them will be systematically too high.
This suggests that market researchers should use multiple methods, be aware of the inflation bias in intention surveys, and calibrate their forecasting models accordingly.
Sharp, Wright, and Goodhardt (2002) at the Ehrenberg-Bass Institute reviewed the predictive validity of purchase intention measures and found that they systematically overpredict actual purchase rates. This is consistent with the mere measurement effect inflating stated intentions, and with buyers overestimating their own future likelihood of purchase even without survey contamination. The combination of the mere measurement effect and natural optimism bias in self-prediction creates a consistent upward bias in intention-based demand forecasting.
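One simple way to "calibrate accordingly" is to deflate stated intent by the historical ratio between stated intent and realized purchase rates. The figures below are entirely hypothetical, used only to show the mechanics of a ratio-based calibration.

```python
# Hypothetical calibration data from past launches:
# (share saying "definitely/probably will buy", actual purchase rate)
history = [
    (0.40, 0.12),
    (0.25, 0.08),
    (0.55, 0.18),
    (0.30, 0.10),
]

# Least-squares fit through the origin: actual ~= k * stated
k = sum(s * a for s, a in history) / sum(s * s for s, _ in history)

def calibrated_forecast(stated_intent):
    """Deflate a raw intention score using the historical ratio k."""
    return k * stated_intent

print(f"deflation factor k: {k:.2f}")
print(f"raw survey says 45% -> calibrated forecast {calibrated_forecast(0.45):.1%}")
```

Real calibration models are usually richer (category-specific factors, top-box versus full-scale weighting), but even this crude deflation avoids taking inflated stated intentions at face value.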
The Mere Measurement Effect in Digital Environments
The digital transformation of daily life has expanded the contexts in which the mere measurement effect operates. Interfaces, apps, and online services routinely ask users about their intentions, preferences, and plans — often as part of onboarding, engagement, or re-engagement flows.
Onboarding surveys: Products that ask new users about their goals ("What are you hoping to achieve with this app?") are not merely gathering data — they are activating the mere measurement effect by prompting users to articulate intentions that they are then more likely to follow through on.
Progress check-ins: Weekly or daily prompts asking "Are you on track to meet your goal this week?" leverage both the mere measurement effect and social accountability in contexts where users have consented to be monitored.
Intention-based recommendation systems: Systems that ask "Are you interested in learning about [topic]?" rather than simply displaying the topic may increase engagement by activating commitment and accessibility mechanisms before the content is even delivered.
Research by Fogg (2003) on persuasive technology established a theoretical framework for how digital systems can systematically apply behavioral science principles to influence user behavior. The mere measurement effect is among the most ethically benign of these techniques — it does not deceive, it does not exploit fear or insecurity, and it works by prompting genuine reflection rather than bypassing deliberation. But this requires transparent implementation, not covert deployment.
Ethical Considerations
The mere measurement effect raises ethical questions about the use of intention questions as a covert behavior change intervention.
If a researcher or organization knows that asking about intentions increases those behaviors, they could use that knowledge to design surveys specifically intended to change behavior — framed as research but functioning as nudges. Participants who believe they are answering a neutral survey may not realize they are being subjected to an influence technique.
This concern is most acute in commercial contexts (using fake surveys to drive purchases), political contexts (using push polls disguised as preference research), and clinical contexts (using intention questions on patients without informed consent to the intervention aspect).
The ethical standard should be transparency: researchers and practitioners who intend to use the mere measurement effect as an intervention should disclose this intent and obtain appropriate consent.
Cass Sunstein's analysis of nudge ethics in The Ethics of Influence (2016) provides a useful framework: nudges that work by prompting genuine deliberation and reflection (as the mere measurement effect does) are generally less ethically problematic than those that bypass deliberation through automatic mechanisms. Asking someone about their intentions engages their rational agency; manipulating choice architecture to exploit cognitive biases without awareness does not. The former is more defensible, but transparency and consent remain important regardless.
Practical Applications for Individuals and Organizations
For Individuals
Understanding the mere measurement effect offers a simple self-improvement tool: ask yourself about your intentions for behaviors you want to perform. Writing "Do I plan to exercise this week?" in a journal and answering it deliberately may increase the probability that you follow through, beyond any effect of generic motivation.
Habit-formation literature supports this: implementation intentions (specific if-then plans) are robustly effective, and the mere measurement effect suggests that even the step of asking about intention — before full implementation planning — has some effect.
James Clear's Atomic Habits (2018), which synthesized behavioral science research on habit formation for popular audiences, advocates for a version of this through commitment ceremonies and identity-based habit design ("I am a person who...") — frameworks that deliberately activate the identity-consistency mechanisms underlying the question-behavior effect.
For Managers and Leaders
Leaders who want to increase follow-through on team commitments can use structured intention questions as a low-cost engagement tool. Asking team members "Do you plan to complete this by Thursday?" is not simply a check-in; it is a lightweight commitment mechanism that slightly increases the probability of the committed behavior.
This is the behavioral science basis behind many effective meeting practices: explicitly eliciting stated commitments rather than assuming tacit agreement.
Research on active commitment in organizational settings by Cialdini and colleagues consistently shows that written commitments outperform verbal ones, and that public commitments outperform private ones — each increment adding to the consistency motivation that drives follow-through. The mere measurement effect is the baseline case of even a momentary private articulation having measurable impact.
For Product and UX Designers
Product designers can use the mere measurement effect intentionally (and ethically, with transparency) to increase product engagement or desirable user behaviors. Onboarding flows that ask users "Do you plan to set up two-factor authentication?" or "Are you intending to complete your profile this week?" can increase those completion rates without additional persuasion.
The key is to ask at the right moment — when the behavior is close, specific, and already within the user's range of consideration.
Research by Nir Eyal in Hooked (2014) documents the trigger-action-reward-investment cycle that creates habitual product use. The mere measurement effect can be deliberately incorporated into the "trigger" phase — when the system prompts the user to reflect on their intention to engage, it both primes the behavior and activates commitment, strengthening the hook.
What the Mere Measurement Effect Is Not
Several misconceptions are worth correcting:
It is not about social desirability alone. While socially desirable behaviors show strong effects, the phenomenon has been documented for neutral and even mildly undesirable behaviors as well.
It is not about demand effects. The effect persists in studies where participants do not know their behavior is being tracked and where there is no apparent social pressure to follow through.
It is not universal. The effect has boundary conditions and does not operate equally across all behaviors, populations, and contexts. It is a reliable effect under specific conditions, not an iron law.
It does not create intentions from nothing. The effect is strongest for behaviors already in the consideration set. It activates and crystallizes existing intentions rather than manufacturing entirely new ones.
It does not replace other intervention components. The mere measurement effect is real but relatively small in absolute terms — sufficient to be practically meaningful in aggregate (such as in public health campaigns or voter mobilization), but not sufficient to substitute for substantive interventions in most clinical or organizational contexts.
Summary
The mere measurement effect is a robust, replicated finding in behavioral science: being asked about intentions changes behavior. The mechanisms involve cognitive accessibility, consistency motivation, and attitude crystallization. The effect has been documented in consumer purchasing, voting, health behaviors, and environmental action.
For researchers, it is a methodological caution: surveys are interventions, and failure to account for this can bias results. For practitioners, it is a tool: asking intention questions at the right moment is a low-cost way to increase follow-through. For individuals, it is a self-knowledge opportunity: the questions you ask yourself about your own intentions have real effects on what you actually do.
The deeper implication is that human behavior is more malleable than we assume, and that even the simplest social interactions — a stranger asking what you plan to do — can tip the scales between intention and action. This malleability is not a weakness to be exploited but a feature of the way human beings use social context to regulate their own behavior — and understanding it allows us to use that feature deliberately, transparently, and in the service of goals we actually value.
References
- Morwitz, V. G., Johnson, E., & Schmittlein, D. (1993). Does measuring intent change behavior? Journal of Consumer Research, 20(1), 46-61.
- Greenwald, A. G., Carnot, C. G., Beach, R., & Young, B. (1987). Increasing voting behavior by asking people if they expect to vote. Journal of Applied Psychology, 72(2), 315-318.
- Spangenberg, E. R., & Sprott, D. E. (2006). Self-monitoring and susceptibility to the influence of self-prophecy. Journal of Consumer Research, 32(4), 550-556.
- Fitzsimons, G. J., & Morwitz, V. G. (1996). The effect of measuring intent on brand-level purchase behavior. Journal of Consumer Research, 23(1), 1-11.
- Nickerson, D. W., & Rogers, T. (2010). Do you have a voting plan? Implementation intentions, voter turnout, and organic plan making. Psychological Science, 21(2), 194-199.
- Gollwitzer, P. M. (1999). Implementation intentions: Strong effects of simple plans. American Psychologist, 54(7), 493-503.
- Cialdini, R. B. (1984). Influence: The Psychology of Persuasion. William Morrow.
- Rodrigues, R., Lopes, P., Barbosa, M., & Tentugal, R. (2013). A meta-analytic review of the question-behavior effect. Journal of the Academy of Marketing Science, 43(5), 628-640.
- Oettingen, G. (2014). Rethinking Positive Thinking: Inside the New Science of Motivation. Current/Penguin Random House.
- Sheeran, P., & Orbell, S. (2000). Using implementation intentions to increase attendance for cervical cancer screening. Health Psychology, 19(3), 283-289.
- Tourangeau, R., & Rasinski, K. A. (1988). Cognitive processes underlying context effects in attitude measurement. Psychological Bulletin, 103(3), 299-314.
- Sunstein, C. R. (2016). The Ethics of Influence: Government in the Age of Behavioral Science. Cambridge University Press.
- Fogg, B. J. (2003). Persuasive Technology: Using Computers to Change What We Think and Do. Morgan Kaufmann.
- Godin, G., Sheeran, P., Conner, M., Delage, G., Germain, M., Belanger-Gravel, A., & Naccache, H. (2010). Which survey questions change behavior? Randomized controlled trial of mere measurement interventions. Health Psychology, 29(6), 636-644.
- Bem, D. J. (1972). Self-perception theory. In L. Berkowitz (Ed.), Advances in Experimental Social Psychology (Vol. 6). Academic Press.
Frequently Asked Questions
What is the mere measurement effect?
The mere measurement effect is the finding that simply asking people about their intentions or behaviors changes the likelihood that they will perform those behaviors. It was formalized by researchers Vicki Morwitz and Gavan Fitzsimons in the 1990s and has since been replicated across domains including voting, purchasing, exercise, and health behavior. The effect is reactive — the measurement itself is the intervention.
How does being asked about voting increase voter turnout?
Studies by Greenwald, Carnot, Beach, and Young (1987) found that simply asking registered voters whether they expected to vote increased their turnout in the subsequent election. The question activates a self-concept as 'a voter,' makes the identity salient, and creates mild commitment pressure. Later research found that identity framing, asking about 'being a voter' rather than 'voting,' produces stronger effects because it engages the self-concept directly.
Why does measuring purchase intent actually increase purchases?
Morwitz and Fitzsimons showed that consumers asked about their intent to buy a product were subsequently more likely to buy it, even when no other persuasion occurred. The mechanism involves accessibility: the question raises the product to conscious consideration, activates positive associations, and can trigger a mild consistency motive. For products the consumer has never considered, the effect is weaker or absent: the question activates and crystallizes existing consideration rather than manufacturing intention from nothing.
What are the implications for survey research design?
The mere measurement effect is a serious confound in survey-based research. Any study that measures intentions and then tracks behavior may be observing a partially self-created outcome rather than a pre-existing tendency. Researchers must account for this by using control groups that do not receive the intention measurement, and by considering whether the survey instrument itself constitutes an intervention that should be disclosed.
Can the mere measurement effect be used deliberately in health and clinical settings?
Yes. Asking patients about their intentions to take medication, exercise, quit smoking, or attend follow-up appointments has been shown to modestly increase compliance with those behaviors. Implementation intention prompts — asking not just 'will you do this?' but 'when and where will you do this?' — amplify the effect considerably. This approach is used in public health campaigns and clinical decision support tools.