There is a peculiar irony at the heart of expertise. The more deeply you understand something, the harder it often becomes to explain it. Brilliant scientists give confusing lectures. Experienced engineers write documentation that only other engineers can follow. Veteran teachers watch students glaze over during explanations that seem crystal clear from the front of the room.
This is not laziness or arrogance. It is a predictable cognitive bias called the curse of knowledge — and understanding it is one of the most practically useful things you can learn about how the mind works.
What Is the Curse of Knowledge?
The curse of knowledge is a cognitive bias in which a person who has mastered a subject finds it genuinely difficult to imagine what it is like not to know that subject. Once your brain has fully integrated a concept, it loses easy access to the earlier state of not knowing. The knowledge becomes invisible to you — as automatic and unconscious as knowing that red means stop.
The term was coined in a 1989 paper by economists Colin Camerer, George Loewenstein, and Martin Weber, who used it to explain why informed traders sometimes make poor decisions in financial markets. But the concept quickly spread into education, communication, and organizational behavior, where its effects are even more visible.
"The problem is not that experts don't know enough. The problem is that they know too much — and they can't remember what it felt like to know nothing."
This is not a mild inconvenience. It fundamentally shapes how knowledge gets transmitted between people, how products get designed, how policies get written, and how organizations communicate with customers.
The Tapping Experiment: A Perfect Demonstration
No illustration captures the curse of knowledge more vividly than the tapping experiment, made famous by Chip and Dan Heath in their 2007 book Made to Stick.
The experiment was originally conducted by Stanford graduate student Elizabeth Newton for her doctoral dissertation in psychology. She divided participants into two groups: tappers and listeners. Tappers were given a list of 25 well-known songs — "Happy Birthday," "The Star-Spangled Banner," "Jingle Bells" — and asked to tap out the rhythm on a table. Listeners had to identify the song from the taps alone.
Before each session, Newton asked the tappers to predict how many songs the listener would correctly identify.
The results were striking.
| Metric | Result |
|---|---|
| Tappers' prediction of listener accuracy | ~50% of songs identified |
| Actual listener accuracy | 2.5% of songs identified |
| Songs tapped total (over the study) | 120 |
| Songs correctly identified by listeners | 3 |
Tappers were consistently baffled by the listeners' failures. From the tapper's perspective, the song was obvious — they could hear every note, every word, every chorus in their heads while they tapped. The listener had no access to any of that internal music. All they heard was a series of irregular clicks.
That gap — between the rich internal experience of the knower and the bare signal available to the learner — is the curse of knowledge in action.
Why Experts Lose the Beginner's Perspective
Understanding why this happens requires a brief look at how the brain processes and stores learned information.
Automaticity and Chunking
When you first learn to drive, you have to consciously manage the clutch, the gear shift, the mirrors, the steering wheel, and the road simultaneously. It is exhausting. After years of practice, all of those actions collapse into a single fluid behavior. Cognitive scientists call this chunking — the brain groups related pieces of information into a single unit that can be retrieved and executed as a whole.
Chunking is enormously efficient. It frees working memory for higher-order tasks. But it comes at a cost: once knowledge becomes chunked, the individual steps become inaccessible to conscious inspection. You can no longer easily explain why you check the mirror at precisely that moment or how you know the clutch is at the biting point — you just do it.
The same process happens with conceptual knowledge. An experienced software engineer no longer consciously thinks through why a recursive function needs a base case. A trained economist does not think step by step through opportunity cost. The knowledge is there, but it has been compressed into an intuition that bypasses the explicit reasoning that a beginner needs to see.
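The base-case intuition mentioned above is a good example of knowledge a beginner needs spelled out. A minimal Python sketch (the function here is just an illustration, not tied to any particular codebase) makes the compressed reasoning explicit again:

```python
# A recursive function calls itself on a smaller version of the problem.
# The base case is the condition that stops the self-reference; without
# it, factorial(3) would call factorial(2), factorial(1), factorial(0),
# factorial(-1), ... forever, until the call stack overflows.

def factorial(n: int) -> int:
    if n == 0:                       # base case: answerable directly
        return 1
    return n * factorial(n - 1)      # recursive case: shrink toward the base

print(factorial(5))  # 120
```

An expert "just knows" the `if n == 0` line is essential; a beginner needs the comment explaining why.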
The Fluency Illusion
A related phenomenon is the fluency illusion: when you can recognize something quickly — a formula, a word, a piece of code — your brain interprets that ease of recognition as genuine understanding. This is why students who reread their notes feel prepared for exams but often perform poorly. Familiarity masquerades as comprehension.
For experts, the fluency illusion cuts even deeper. They encounter terms and concepts that they have read, used, and discussed thousands of times. That extreme fluency makes the concepts feel so obvious that the possibility that someone else might find them opaque becomes genuinely hard to imagine.
The Expert Blind Spot in Teaching
Educational researcher Mitchell Nathan and his colleagues coined the term expert blind spot to describe specifically how domain expertise interferes with effective teaching. Their research, including a widely cited 2003 paper by Nathan and Petrosino, found that teachers with more subject-matter expertise sometimes performed worse on measures of pedagogical reasoning than those with less expertise.
This seems counterintuitive. But the explanation is consistent with everything we know about the curse of knowledge. Expert teachers are so far inside their subject that they have lost track of which parts are hard for beginners, which analogies are accessible, and which steps in a procedure can safely be skipped.
Nathan and Petrosino's research identified specific patterns in how expert blind spot manifests in the classroom:
- Skipping prerequisite steps: Experts assume background knowledge that students do not have
- Jargon without grounding: Technical terms are used before they are properly defined
- Abstract before concrete: Experts present formal definitions first and examples second, when learners need it the other way around
- Impatience with naive questions: Questions that seem foolish to experts are precisely the questions beginners most need to ask
There is also a striking counterintuitive finding: in some experiments, novice tutors outperformed expert tutors with struggling students. The novices remembered the confusion. They knew which parts were hard because those parts were recently hard for them.
The Feynman Technique: Confronting the Curse
Physicist Richard Feynman won the Nobel Prize in Physics in 1965. He was also one of the most celebrated teachers in the history of science — not despite his expertise, but because he developed deliberate strategies for breaking through his own curse of knowledge.
The method attributed to him, now commonly called the Feynman Technique, works in four steps:
- Choose a concept and write its name at the top of a blank page
- Explain it as if teaching a child — using simple words, no jargon, concrete examples
- Identify the gaps — wherever your explanation becomes vague, circular, or jargon-filled, you have found a gap in your own understanding
- Return to the source and study until you can fill the gap, then explain it simply again
What makes this method powerful is that it turns the curse of knowledge into a diagnostic tool. You are not asked to pretend you don't know things. You are asked to surface, through the act of attempted simple explanation, exactly which parts of your knowledge are genuine and which are the fluency illusion.
Feynman reportedly applied this to his own Nobel Prize-winning work on quantum electrodynamics. When he could not explain a concept to a first-year student, he took it as a signal that he did not fully understand it yet.
Curse of Knowledge in Writing
The curse of knowledge may be most damaging in written communication, because writing lacks the real-time feedback that makes conversation self-correcting. When you are talking to someone and they look confused, you can adjust. When you are writing, the reader's confusion is invisible to you.
Steven Pinker's book The Sense of Style (2014) dedicates significant attention to this problem. Pinker argues that most bad writing is not caused by poor grammar or weak vocabulary. It is caused by the writer's failure to simulate the reader's knowledge state. The writer knows what they mean. They assume the reader will too.
Pinker's prescription aligns with the Feynman approach: show your reasoning, don't assume it. Don't write "for obvious reasons" — state the reasons. Don't use an acronym without defining it. Don't assume the reader shares your frame of reference.
Practical writing strategies for overcoming the curse of knowledge include:
- Use the "stranger test": Would a smart person with no background in this field understand this sentence?
- Read your work aloud: Awkward passages often signal places where you have assumed too much
- Get a naive reader: Have someone outside your field read a draft and mark everything they find confusing
- Concrete before abstract: Introduce concepts through examples before offering definitions
The Curse in Product Design
The curse of knowledge shapes how products get designed, and when designers are not aware of it, users suffer.
The classic example is software user interfaces. Engineers build tools for themselves — menus organized around the internal architecture of the system rather than the mental models of users, error messages written in system language rather than plain terms, features buried in locations that make perfect sense to someone who knows the codebase.
Don Norman's concept of the gulf of evaluation in The Design of Everyday Things captures a related problem: when the feedback a system provides doesn't match the user's mental model, users get lost. But designers who built the system have already internalized the correct mental model — they can't feel the gulf.
User research and usability testing exist in large part to break through this curse. When you watch a real user struggle with something you designed, it is difficult to maintain the illusion that the design is obvious.
Concrete Analogies: The Most Powerful Antidote
If there is one communication tool that most reliably breaks the curse of knowledge, it is the concrete analogy. Analogies work because they anchor unfamiliar concepts to structures the listener already has in their mind.
Consider how different these two explanations feel:
Abstract: "A neural network processes inputs through successive layers of weighted transformations, applying non-linear activation functions to produce distributed representations."
Concrete analogy: "Think of a neural network like a bureaucracy. Each department takes in a report from the department before it, puts its own stamp on it, and passes it on. By the time the report reaches the top, it has been filtered and summarized so many times that the final decision reflects accumulated judgment from the whole organization."
The analogy is imperfect — all analogies are. But it gives the listener a foothold. And from that foothold, the more technical explanation becomes graspable.
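The bureaucracy analogy can even be carried into code. The sketch below is a deliberately toy forward pass in plain Python — the layer sizes, weights, and inputs are invented for illustration, and a real network would also learn its weights rather than hard-code them:

```python
import math

# Each "department" (layer) takes the report from the layer below,
# applies its own weights (its stamp), and passes a squashed summary
# upward via a non-linear activation function (tanh).

def layer(inputs, weights):
    # One department: a weighted sum of the incoming report per official,
    # squashed into the range (-1, 1) by tanh.
    return [math.tanh(sum(w * x for w, x in zip(row, inputs)))
            for row in weights]

report = [0.5, -1.0, 0.25]                  # the raw incoming report
dept_1 = layer(report, [[0.2, 0.4, -0.1],   # first department's weights
                        [0.7, -0.3, 0.5]])
dept_2 = layer(dept_1, [[1.0, -1.0]])       # top office: one final summary
print(dept_2)                               # a single filtered judgment
```

By the time the signal reaches `dept_2`, it has been weighted and summarized twice — the code version of the report reaching the top of the organization.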
The Heath brothers call this concrete language and identify it as one of the six principles of sticky ideas (alongside simplicity, unexpectedness, credibility, emotional resonance, and stories). Abstract ideas don't stick. Concrete images do.
The Curse of Knowledge in Leadership
Leaders are among the most chronically afflicted by this bias. A CEO who has been in the industry for twenty years sees the competitive landscape with a clarity that is genuinely invisible to new employees. A manager who has solved the same class of problem fifty times wonders why her team is struggling with what seems to her like an obvious issue.
This creates several failure modes:
Under-communication: Leaders assume that stating a direction once is enough. Their teams, lacking the context the leader carries unconsciously, need far more repetition and elaboration.
Skipping the 'why': Because the strategic logic is obvious to the leader, they often communicate decisions without the reasoning behind them. This leaves employees feeling that change is arbitrary, which breeds resistance.
Impatience with questions: Questions that seem basic to a leader are not basic to the questioner. Treating them as such signals that questions are unwelcome, which drives confusion underground.
Effective leadership communication requires consciously rebuilding the case from first principles every time a strategic message needs to be shared — not because the audience is slow, but because the audience does not carry the same accumulated context.
How to Overcome the Curse of Knowledge
No one permanently escapes the curse of knowledge, but it can be managed with deliberate practice.
Build and Maintain a Beginner's Journal
Keep a record of the moment you first encountered ideas that now feel obvious to you. What confused you? What analogy finally made it click? What question did you wish someone had answered? This journal becomes a reference point for designing explanations.
Test on Real Beginners
Nothing replaces sitting with someone who genuinely does not know what you know and watching them try to understand your explanation. Do not help them. Watch where they slow down. Watch what they misinterpret. Their confusion is not a failure of attention — it is a signal about where your explanation assumes too much.
Develop the Habit of Stepping Back
Before presenting, writing, or explaining, ask: what does this person already know? What do they not know? What will they find surprising? What analogy from their world could I use? This brief cognitive step is often all that separates a clear explanation from a confusing one.
Embrace the Power of Worked Examples
Research in cognitive science consistently shows that learners benefit enormously from worked examples — demonstrations of a problem being solved step by step, with the reasoning made explicit. Experts tend to skip to solutions. Beginners need the journey.
Learn Outside Your Field
One of the most effective ways to rebuild sensitivity to the experience of not-knowing is to regularly learn things you know nothing about. Study a foreign language. Learn to play a musical instrument. Take a class in a field completely outside your expertise. The discomfort of genuine ignorance recalibrates your ability to empathize with learners.
The Deeper Implication: Knowledge as Isolation
There is something philosophically poignant about the curse of knowledge. Every time you master something, you move further away from the people who have not yet learned it. Knowledge — the thing that connects us through shared understanding — also has the potential to isolate us, to make genuine communication harder.
The experts who communicate best are not those who know the least. They are those who have worked hardest to remember what it felt like to know nothing — and who treat that imaginative act of remembering as a professional discipline, not an afterthought.
Richard Feynman used to say that if you could not explain something to a first-year student, you didn't really understand it. That standard is demanding. But it is also one of the most useful commitments any teacher, writer, designer, or leader can make.
The goal is not to dumb things down. It is to build a bridge — and the curse of knowledge is what makes bridge-building hard. Naming it, understanding its mechanisms, and developing deliberate practices to counteract it is how the best communicators do their most important work.
Frequently Asked Questions
What is the curse of knowledge?
The curse of knowledge is a cognitive bias in which someone who knows a subject well finds it difficult to imagine not knowing it. Once you have mastered something, your brain loses easy access to the beginner's perspective, making it hard to explain the topic in simple terms to someone who is learning it for the first time.
What is the tapping experiment that illustrates the curse of knowledge?
The tapping experiment, described by Chip and Dan Heath in 'Made to Stick', asked participants to tap out well-known songs on a table while a listener tried to identify them. Tappers predicted listeners would identify about 50% of songs, but listeners only got 2.5% correct. Tappers could 'hear' the full melody in their heads and couldn't understand why the listener could not.
How does the curse of knowledge affect teaching?
Expert teachers often skip foundational steps, use jargon without defining it, and assume students already understand prerequisite concepts. This expert blind spot causes confusion and disengagement in learners. Research shows novice tutors often outperform experts in certain teaching scenarios because they remember what it felt like not to know.
What is the Feynman Technique and how does it help?
The Feynman Technique, popularly attributed to physicist Richard Feynman, involves explaining a concept as if teaching it to a child, then identifying gaps in your explanation and returning to the source material. It is one of the most effective methods for breaking the curse of knowledge because it forces you to confront what you actually understand versus what you merely recognize.
How can writers and communicators overcome the curse of knowledge?
Effective strategies include using concrete analogies drawn from everyday experience, testing explanations on real beginners and watching where they get confused, reading widely outside your field to remember what genuine unfamiliarity feels like, and keeping a 'beginner's journal' that records the confusion you felt when you first encountered the topic.