There is a peculiar irony at the heart of expertise. The more deeply you understand something, the harder it often becomes to explain it. Brilliant scientists give confusing lectures. Experienced engineers write documentation that only other engineers can follow. Veteran teachers watch students glaze over during explanations that seem crystal clear from the front of the room.
This is not laziness or arrogance. It is a predictable cognitive bias called the curse of knowledge -- and understanding it is one of the most practically useful things you can learn about how the mind works.
What Is the Curse of Knowledge?
The curse of knowledge is a cognitive bias in which a person who has mastered a subject finds it genuinely difficult to imagine what it is like not to know that subject. Once your brain has fully integrated a concept, it loses easy access to the earlier state of not knowing. The knowledge becomes invisible to you -- as automatic and unconscious as knowing that red means stop.
The term was coined in a 1989 paper by economists Colin Camerer, George Loewenstein, and Martin Weber, published in the Journal of Political Economy. Their original context was financial markets: traders who knew the final price of a commodity could not easily reconstruct how uncertain that price had seemed before the information was revealed, and as a result they systematically overestimated how much other traders knew, producing predictable market inefficiencies.
The paper's original framing was economic, but the concept quickly spread into education, communication, and organizational behavior, where its effects are even more visible. In 2007, Chip and Dan Heath popularized it for a general audience in Made to Stick, and it became one of the most widely cited concepts in science communication, user experience design, and pedagogical research.
"The problem is not that experts don't know enough. The problem is that they know too much -- and they can't remember what it felt like to know nothing."
This is not a mild inconvenience. It fundamentally shapes how knowledge gets transmitted between people, how products get designed, how policies get written, and how organizations communicate with customers.
The Tapping Experiment: A Perfect Demonstration
No illustration captures the curse of knowledge more vividly than the tapping experiment, made famous by Chip and Dan Heath in their 2007 book Made to Stick.
The experiment was originally conducted by Stanford graduate student Elizabeth Newton for her doctoral dissertation in psychology. She divided participants into two groups: tappers and listeners. Tappers were given a list of 25 well-known songs -- "Happy Birthday," "The Star-Spangled Banner," "Jingle Bells" -- and asked to tap out the rhythm on a table. Listeners had to identify the song from the taps alone.
Before each session, Newton asked the tappers to predict how many songs the listener would correctly identify.
The results were striking.
| Metric | Result |
|---|---|
| Tappers' prediction of listener accuracy | ~50% of songs identified |
| Actual listener accuracy | 2.5% of songs identified |
| Songs tapped total (over the study) | 120 |
| Songs correctly identified by listeners | 3 |
Tappers were consistently baffled by the listeners' failures. From the tapper's perspective, the song was obvious -- they could hear every note, every word, every chorus in their heads while they tapped. The listener had no access to any of that internal music. All they heard was a series of irregular clicks.
That gap -- between the rich internal experience of the knower and the bare signal available to the learner -- is the curse of knowledge in action.
What makes this experiment particularly valuable as a demonstration is the emotional quality of the tapper's experience. Newton reported that tappers often expressed genuine frustration and disbelief at listeners' failures. They were not merely predicting incorrectly; they could not comprehend how the listener could fail to hear what was so obvious to them. This emotional dimension -- the impatience, the disbelief, the sense that the gap is the listener's failure rather than the tapper's -- is precisely what makes the curse of knowledge so damaging in real teaching, management, and communication contexts.
Why Experts Lose the Beginner's Perspective
Understanding why this happens requires a brief look at how the brain processes and stores learned information.
Automaticity and Chunking
When you first learn to drive, you have to consciously manage the clutch, the gear shift, the mirrors, the steering wheel, and the road simultaneously. It is exhausting. After years of practice, all of those actions collapse into a single fluid behavior. Cognitive scientists call this chunking -- the brain groups related pieces of information into a single unit that can be retrieved and executed as a whole.
Chunking is enormously efficient. It frees working memory for higher-order tasks. But it comes at a cost: once knowledge becomes chunked, the individual steps become inaccessible to conscious inspection. You can no longer easily explain why you check the mirror at precisely that moment or how you know the clutch is at the biting point -- you just do it.
The same process happens with conceptual knowledge. An experienced software engineer no longer consciously thinks through why a recursive function needs a base case. A trained economist does not think step by step through opportunity cost. The knowledge is there, but it has been compressed into an intuition that bypasses the explicit reasoning that a beginner needs to see.
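The recursion example is a good place to see what chunking hides. A minimal sketch (Python chosen purely for illustration) makes the expert's invisible step explicit:

```python
def factorial(n: int) -> int:
    """Compute n! recursively."""
    # The step an expert no longer consciously reasons through:
    # without this base case, the recursion never terminates and
    # Python raises RecursionError.
    if n <= 1:
        return 1
    # Recursive step: shrink the problem toward the base case.
    return n * factorial(n - 1)

print(factorial(5))  # 120
```

To a beginner, the base case is the whole puzzle. To the expert, it has long since been chunked into "how recursion works" and vanished from conscious view.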
Anders Ericsson's research on expert performance, which formed the empirical basis for discussions of deliberate practice, found that experts' knowledge is organized in large, hierarchically structured chunks that allow rapid pattern recognition. A chess grandmaster perceives meaningful board configurations where a novice sees only individual pieces. This perceptual and cognitive reorganization is exactly what makes expertise powerful -- and exactly what makes expertise hard to communicate. The grandmaster cannot easily reconstruct the novice's board, because they no longer see individual pieces.
The Fluency Illusion
A related phenomenon is the fluency illusion: when you can recognize something quickly -- a formula, a word, a piece of code -- your brain interprets that ease of recognition as genuine understanding. This is why students who reread their notes feel prepared for exams but often perform poorly. Familiarity masquerades as comprehension.
For experts, the fluency illusion cuts even deeper. They encounter terms and concepts that they have read, used, and discussed thousands of times. That extreme fluency makes the concepts feel so obvious that the possibility of someone else finding them opaque becomes genuinely hard to entertain.
Bjork and colleagues' research on "desirable difficulties" in learning -- the finding that learning is actually more durable when it is made harder -- is related: the conditions that make knowledge feel solidly held (fluency, familiarity, ease of recall) are often poor indicators of how deeply it is understood or how accessible it is to genuine explanation. Experts mistake their own fluency for transparency.
Simulation Failure
A third mechanism is what might be called simulation failure -- the difficulty of mentally simulating a knowledge state you no longer occupy. When you know something, your mental model of the world incorporates it. You cannot easily run a simulation of the world without that knowledge, because every aspect of your thinking now depends on it. It is like trying to look at a word in your native language and see only meaningless shapes: once you can read, you cannot unread.
Research by Birch and Bloom (2007), published in Psychological Science, demonstrated this mechanism in children using false-belief tasks. Children who had been shown the true contents of a box consistently overestimated how many other children would guess those contents correctly. Even young children show the curse of knowledge -- and the mechanism appears to be exactly this failure to simulate a more ignorant knowledge state.
The Expert Blind Spot in Teaching
Educational researchers use the term expert blind spot to describe specifically how domain expertise interferes with effective teaching. Nathan and Petrosino (2003), in a widely cited paper in the American Educational Research Journal, found that preservice teachers with more subject-matter expertise sometimes performed worse on measures of pedagogical reasoning than those with less expertise.
This seems counterintuitive. But the explanation is consistent with everything we know about the curse of knowledge. Expert teachers are so far inside their subject that they have lost track of which parts are hard for beginners, which analogies are accessible, and which steps in a procedure can safely be skipped.
Nathan and Petrosino's research identified specific patterns in how expert blind spot manifests in the classroom:
- Skipping prerequisite steps: Experts assume background knowledge that students do not have
- Jargon without grounding: Technical terms are used before they are properly defined
- Abstract before concrete: Experts present formal definitions first and examples second, when learners need it the other way around
- Impatience with naive questions: Questions that seem foolish to experts are precisely the questions beginners most need to ask
There is also a striking counterintuitive finding: in some experiments, novice tutors outperformed expert tutors with struggling students. The novices remembered the confusion. They knew which parts were hard because those parts were recently hard for them.
A closely related body of research examines what Lee Shulman called pedagogical content knowledge -- the specialized knowledge of how to teach a specific subject, distinct from content knowledge itself. Shulman's (1986) landmark paper in Educational Researcher argued that knowing a subject and knowing how to teach it are genuinely distinct competencies. Expert knowledge alone -- even deep expert knowledge -- does not automatically confer the ability to communicate that knowledge. The curse of knowledge is precisely why pedagogical content knowledge must be developed separately from content knowledge.
Research on engineering education has been particularly revealing. A study by Streveler and colleagues (2008) found that engineering students consistently struggled with a specific set of conceptual thresholds -- ideas that, once understood, reorganized the student's entire conceptual framework, but which were extremely difficult to pass through. Engineering professors, asked to predict which concepts students found hardest, showed poor calibration -- they had long since forgotten the confusion, and their difficulty rankings bore little relationship to actual student struggle data.
The Curse of Knowledge in Education: What the Numbers Show
The practical impact of the curse of knowledge in educational settings is substantial and measurable. Several lines of research converge on the same finding: instruction designed with expert knowledge as its baseline consistently underserves students whose baseline is radically different.
A meta-analysis by Hattie (2009) of over 800 meta-analyses examining influences on student achievement found that teacher clarity -- the degree to which explanations are comprehensible to students -- had an effect size of 0.75, placing it among the most influential instructional factors. Clarity is, in large part, an inverse measure of the curse of knowledge: clearer explanations are those where the teacher has more successfully modeled the student's knowledge state.
Research on the use of worked examples in mathematics education -- a teaching technique that makes expert reasoning explicitly visible rather than assuming students can infer it -- consistently shows strong effects on learning outcomes, particularly for novice learners. Sweller and Cooper's (1985) early research on worked examples demonstrated that students who studied worked-out problems outperformed students who practiced solving equivalent problems, because the worked examples reduced the cognitive load of figuring out the solution procedure while making that procedure explicit. The curse of knowledge pushes experts away from worked examples toward problem-solving -- exactly the instruction style that serves experts well but beginners poorly.
The implication is measurable: instruction designed to counteract the curse of knowledge produces meaningfully better outcomes. The question is not whether it matters but how to make it routine.
The Feynman Technique: Confronting the Curse
Physicist Richard Feynman won the Nobel Prize in Physics in 1965. He was also one of the most celebrated teachers in the history of science -- not despite his expertise, but because he developed deliberate strategies for breaking through his own curse of knowledge.
The method attributed to him, now commonly called the Feynman Technique, works in four steps:
1. Choose a concept and write its name at the top of a blank page
2. Explain it as if teaching a child -- using simple words, no jargon, concrete examples
3. Identify the gaps -- wherever your explanation becomes vague, circular, or jargon-filled, you have found a gap in your own understanding
4. Return to the source and study until you can fill the gap, then explain it simply again
What makes this method powerful is that it turns the curse of knowledge into a diagnostic tool. You are not asked to pretend you don't know things. You are asked to surface, through the act of attempted simple explanation, exactly which parts of your knowledge are genuine and which are the fluency illusion.
Feynman reportedly applied this to his own Nobel Prize-winning work on quantum electrodynamics. When he could not explain a concept to a first-year student, he took it as a signal that he did not fully understand it yet.
The Feynman Technique has also found empirical support in the broader literature on self-explanation. Research by Chi and colleagues (1994) showed that students who explained material to themselves during study -- making implicit reasoning explicit -- learned significantly more deeply and showed better transfer to new problems. The act of attempted explanation exposes gaps that passive reading or re-reading conceals. This is, at its core, the same mechanism as the Feynman Technique: using explanation as a diagnostic for understanding.
Curse of Knowledge in Writing
The curse of knowledge may be most damaging in written communication, because writing lacks the real-time feedback that makes conversation self-correcting. When you are talking to someone and they look confused, you can adjust. When you are writing, the reader's confusion is invisible to you.
Steven Pinker's book The Sense of Style (2014) dedicates significant attention to this problem. Pinker argues that most bad writing is not caused by poor grammar or weak vocabulary. It is caused by the writer's failure to simulate the reader's knowledge state. The writer knows what they mean. They assume the reader will too.
Pinker's prescription aligns with the Feynman approach: show your reasoning, don't assume it. Don't write "for obvious reasons" -- state the reasons. Don't use an acronym without defining it. Don't assume the reader shares your frame of reference.
Empirical research on document comprehension supports this. A meta-analysis by Schriver (1989) found that writers who tested their documents on representative readers -- and revised based on observed comprehension failures -- produced substantially more comprehensible documents than those who relied on expert peer review or stylistic revision. The curse of knowledge means that expert peer reviewers miss the same things the author misses; only readers who genuinely do not know the material reveal what is actually opaque.
Practical writing strategies for overcoming the curse of knowledge include:
- Use the "stranger test": Would a smart person with no background in this field understand this sentence?
- Read your work aloud: Awkward passages often signal places where you have assumed too much
- Get a naive reader: Have someone outside your field read a draft and mark everything they find confusing
- Concrete before abstract: Introduce concepts through examples before offering definitions
The Curse in Product Design
The curse of knowledge shapes how products get designed, and when designers are not aware of it, users suffer.
The classic example is software user interfaces. Engineers build tools for themselves -- menus organized around the internal architecture of the system rather than the mental models of users, error messages written in system language rather than plain terms, features buried in locations that make perfect sense to someone who knows the codebase.
Don Norman's concept of the gulf of evaluation in The Design of Everyday Things captures a related problem: when the feedback a system provides doesn't match the user's mental model, users get lost. But designers who built the system have already internalized the correct mental model -- they can't feel the gulf.
The consequences are measurable. Nielsen Norman Group usability studies routinely find that products designed without systematic user research fail on tasks that designers assume are trivial. In a classic example, a 2007 study of government website usability found that most tested websites failed on tasks that the agencies responsible for them rated as "easy" or "obvious." The agencies' experts could not perceive what their users could not find, because they could not simulate the user's unfamiliarity with the information architecture.
User research and usability testing exist in large part to break through this curse. When you watch a real user struggle with something you designed, it is difficult to maintain the illusion that the design is obvious. The discipline of participatory design -- involving intended users in the design process from early stages -- emerged partly as a structural response to the curse of knowledge: if expert designers systematically cannot simulate novice users, the solution is to include novice users directly.
Concrete Analogies: The Most Powerful Antidote
If there is one communication tool that most reliably breaks the curse of knowledge, it is the concrete analogy. Analogies work because they anchor unfamiliar concepts to structures the listener already has in their mind.
Consider how different these two explanations feel:
Abstract: "A neural network processes inputs through successive layers of weighted transformations, applying non-linear activation functions to produce distributed representations."
Concrete analogy: "Think of a neural network like a bureaucracy. Each department takes in a report from the department before it, puts its own stamp on it, and passes it on. By the time the report reaches the top, it has been filtered and summarized so many times that the final decision reflects accumulated judgment from the whole organization."
The analogy is imperfect -- all analogies are. But it gives the listener a foothold. And from that foothold, the more technical explanation becomes graspable.
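For readers who want to see exactly what the bureaucracy analogy maps onto, here is a minimal sketch of a two-layer forward pass in plain Python. The weights are invented for illustration, and no real network is this small:

```python
def department(report, weights, bias):
    """One 'department': weight the incoming report, add its own
    stamp (the bias), and pass on a filtered summary (ReLU)."""
    total = sum(w * x for w, x in zip(weights, report)) + bias
    return max(0.0, total)  # negative signals are filtered out

inputs = [1.0, 2.0]                              # the raw report
hidden = [department(inputs, [0.5, -0.2], 0.1),  # department A
          department(inputs, [0.3, 0.8], -0.5)]  # department B
decision = department(hidden, [1.0, 1.0], 0.0)   # the top office
print(round(decision, 2))  # 1.6
```

Each "department" is a neuron: a weighted sum, a bias, and a non-linearity. The analogy's "filtering and summarizing" is the ReLU discarding negative signals at every layer.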
The Heath brothers call this concrete language and identify it as one of the six principles of sticky ideas (alongside simplicity, unexpectedness, credibility, emotional resonance, and stories). Abstract ideas don't stick. Concrete images do.
Research by Donnelly and McDaniel (1993) provided experimental evidence for the advantage of analogical encoding: participants who received analogical explanations of complex scientific concepts showed significantly better transfer to novel problems than participants who received literal explanations, even when the literal explanations were more technically precise. The analogy sacrifices precision but gains comprehension and transferability -- exactly the trade that a curse of knowledge sufferer needs to make.
The art of good analogy-making is knowing where to map the familiar onto the unfamiliar -- which structural features are load-bearing for understanding and which are misleading distractions. This is a skill that can be developed: it requires both deep knowledge of the domain (to know what the structural features are) and sustained contact with novice misunderstandings (to know which features learners will map incorrectly).
The Curse of Knowledge in Leadership
Leaders are among the most chronically afflicted by this bias. A CEO who has been in the industry for twenty years sees the competitive landscape with a clarity that is genuinely invisible to new employees. A manager who has solved the same class of problem fifty times wonders why her team is struggling with what seems to her like an obvious issue.
This creates several failure modes:
Under-communication: Leaders assume that stating a direction once is enough. Their teams, lacking the context the leader carries unconsciously, need far more repetition and elaboration. Research by Kouzes and Posner found that the most effective leaders repeat key messages seven or more times before the message is reliably internalized by their teams. Leaders typically underestimate this need by a factor of 3 to 5.
Skipping the 'why': Because the strategic logic is obvious to the leader, they often communicate decisions without the reasoning behind them. This leaves employees feeling that change is arbitrary, which breeds resistance. Research on organizational change consistently finds that people commit far more readily to initiatives whose reasoning they understand, even when the initiatives themselves are similar. Withholding the why -- because it seems obvious -- predictably reduces commitment.
Impatience with questions: Questions that seem basic to a leader are not basic to the questioner. Treating them as such signals that questions are unwelcome, which drives confusion underground. In a survey of 1,000 employees by the Harvard Business Review (2014), "unclear communication from leadership" was the most commonly cited factor contributing to workplace inefficiency -- outranking resource constraints, unclear processes, and poor technology.
Effective leadership communication requires consciously rebuilding the case from first principles every time a strategic message needs to be shared -- not because the audience is slow, but because the audience does not carry the same accumulated context.
Research on organizational knowledge management has found that tacit knowledge -- the accumulated expertise that is difficult to articulate because it has become automatic -- is the most valuable and least transferable form of organizational knowledge. Organizations lose disproportionate capability when experts depart, not because their documented procedures leave, but because their unchunked, unreconstructed understanding of why those procedures work goes with them. The curse of knowledge, at organizational scale, is a genuine risk management problem.
How to Overcome the Curse of Knowledge
No one permanently escapes the curse of knowledge, but it can be managed with deliberate practice.
Build and Maintain a Beginner's Journal
Keep a record of the moment you first encountered ideas that now feel obvious to you. What confused you? What analogy finally made it click? What question did you wish someone had answered? This journal becomes a reference point for designing explanations.
Test on Real Beginners
Nothing replaces sitting with someone who genuinely does not know what you know and watching them try to understand your explanation. Do not help them. Watch where they slow down. Watch what they misinterpret. Their confusion is not a failure of attention -- it is a signal about where your explanation assumes too much.
Research by Dow and colleagues (2010), studying design processes, found that teams who engaged in more user testing cycles produced designs that better matched user needs, even when the total design time was held constant. The finding was not that more time helps -- it was that feedback from real users helps, in ways that internal expert review cannot replicate.
Develop the Habit of Stepping Back
Before presenting, writing, or explaining, ask: what does this person already know? What do they not know? What will they find surprising? What analogy from their world could I use? This brief cognitive step is often all that separates a clear explanation from a confusing one.
Embrace the Power of Worked Examples
Research in cognitive science consistently shows that learners benefit enormously from worked examples -- demonstrations of a problem being solved step by step, with the reasoning made explicit. Experts tend to skip to solutions. Beginners need the journey.
Atkinson and colleagues (2000), reviewing the worked-examples research in mathematics and science, found consistent advantages for worked examples over problem-solving practice among novice learners. The advantage was strongest for beginners and decreased with increasing expertise -- consistent with the "expertise reversal effect," which finds that instructional methods optimal for novices become suboptimal or even counterproductive for experts.
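The contrast between expert presentation and a worked example can be made concrete in code. A hypothetical sketch in Python, solving the same small equation twice:

```python
# Expert presentation: just the answer, reasoning compressed away.
def solve_expert(a, b, c):
    """Solve a*x + b = c for x."""
    return (c - b) / a

# Worked example: the same solution with every step made explicit,
# the way a novice needs to see it.
def solve_worked(a, b, c):
    """Solve a*x + b = c for x, showing each step."""
    # Step 1: isolate the x term by subtracting b from both sides:
    #   a*x = c - b
    rhs = c - b
    # Step 2: divide both sides by a to isolate x:
    #   x = (c - b) / a
    x = rhs / a
    return x

# Both produce the same answer; only the visibility of the
# reasoning differs.
print(solve_expert(2, 3, 11))  # 4.0
print(solve_worked(2, 3, 11))  # 4.0
```

The expert version is what the curse of knowledge produces by default; the worked version is what novices actually need.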
Learn Outside Your Field
One of the most effective ways to rebuild sensitivity to the experience of not-knowing is to regularly learn things you know nothing about. Study a foreign language. Learn to play a musical instrument. Take a class in a field completely outside your expertise. The discomfort of genuine ignorance recalibrates your ability to empathize with learners.
Research on cross-domain expertise transfer suggests an additional benefit: learning in a genuinely new domain activates metacognitive awareness -- awareness of your own learning processes -- that can transfer back to the original domain. Experts who have recently been genuine beginners at something are better at recognizing and modeling the beginner experience.
The Deeper Implication: Knowledge as Isolation
There is something philosophically poignant about the curse of knowledge. Every time you master something, you move further away from the people who have not yet learned it. Knowledge -- the thing that connects us through shared understanding -- also has the potential to isolate us, to make genuine communication harder.
Camerer, Loewenstein, and Weber's original framing treated knowledge as something that carries costs, not just benefits. The cost is borne not by the knower but by anyone who needs the knower to communicate clearly. The curse of knowledge is, in this sense, an externality: the expert enjoys the experience of fluency; others bear the consequences of failed communication.
The experts who communicate best are not those who know the least. They are those who have worked hardest to remember what it felt like to know nothing -- and who treat that imaginative act of remembering as a professional discipline, not an afterthought.
Richard Feynman used to say that if you could not explain something to a first-year student, you didn't really understand it. That standard is demanding. But it is also one of the most useful commitments any teacher, writer, designer, or leader can make.
The goal is not to dumb things down. It is to build a bridge -- and the curse of knowledge is what makes bridge-building hard. Naming it, understanding its mechanisms, and developing deliberate practices to counteract it is how the best communicators do their most important work.
References
- Camerer, C., Loewenstein, G., & Weber, M. (1989). The curse of knowledge in economic settings: An experimental analysis. Journal of Political Economy, 97(5), 1232-1254.
- Heath, C., & Heath, D. (2007). Made to Stick: Why Some Ideas Survive and Others Die. Random House.
- Newton, E. L. (1990). Overconfidence in the communication of intent: Heard and unheard melodies. Unpublished doctoral dissertation, Stanford University.
- Nathan, M. J., & Petrosino, A. (2003). Expert blind spot among preservice teachers. American Educational Research Journal, 40(4), 905-928.
- Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4-14.
- Pinker, S. (2014). The Sense of Style: The Thinking Person's Guide to Writing in the 21st Century. Viking.
- Ericsson, K. A., Krampe, R. T., & Tesch-Römer, C. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100(3), 363-406.
- Birch, S. A. J., & Bloom, P. (2007). The curse of knowledge in reasoning about false beliefs. Psychological Science, 18(5), 382-386.
- Hattie, J. (2009). Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. Routledge.
- Chi, M. T. H., de Leeuw, N., Chiu, M. H., & LaVancher, C. (1994). Eliciting self-explanations improves understanding. Cognitive Science, 18(3), 439-477.
- Sweller, J., & Cooper, G. A. (1985). The use of worked examples as a substitute for problem solving in learning algebra. Cognition and Instruction, 2(1), 59-89.
- Atkinson, R. K., Derry, S. J., Renkl, A., & Wortham, D. (2000). Learning from examples: Instructional principles from the worked examples research. Review of Educational Research, 70(2), 181-214.
- Donnelly, C. M., & McDaniel, M. A. (1993). Use of analogy in learning scientific concepts. Journal of Experimental Psychology: Learning, Memory, and Cognition, 19(4), 975-987.
Frequently Asked Questions
What is the curse of knowledge?
The curse of knowledge is a cognitive bias in which someone who knows a subject well finds it difficult to imagine not knowing it. Once you have mastered something, your brain loses easy access to the beginner's perspective, making it hard to explain the topic in simple terms to someone who is learning it for the first time.
What is the tapping experiment that illustrates the curse of knowledge?
The tapping experiment, described by Chip and Dan Heath in 'Made to Stick', asked participants to tap out well-known songs on a table while a listener tried to identify them. Tappers predicted listeners would identify about 50% of songs, but listeners only got 2.5% correct. Tappers could 'hear' the full melody in their heads and couldn't understand why the listener could not.
How does the curse of knowledge affect teaching?
Expert teachers often skip foundational steps, use jargon without defining it, and assume students already understand prerequisite concepts. This expert blind spot causes confusion and disengagement in learners. Research shows novice tutors often outperform experts in certain teaching scenarios because they remember what it felt like not to know.
What is the Feynman Technique and how does it help?
The Feynman Technique, developed by physicist Richard Feynman, involves explaining a concept as if teaching it to a child, then identifying gaps in your explanation and returning to the source material. It is one of the most effective methods for breaking the curse of knowledge because it forces you to confront what you actually understand versus what you merely recognise.
How can writers and communicators overcome the curse of knowledge?
Effective strategies include using concrete analogies drawn from everyday experience, testing explanations on real beginners and watching where they get confused, reading widely outside your field to remember what genuine unfamiliarity feels like, and keeping a 'beginner's journal' that records the confusion you felt when you first encountered the topic.