How to Overcome the Curse of Knowledge
The curse of knowledge is a cognitive bias where experts mistakenly assume others share their background knowledge. Once someone knows something deeply, they lose the ability to imagine what it feels like not to know it. This makes it surprisingly difficult to communicate with novices.
The term was coined in 1989 by economists Colin Camerer, George Loewenstein, and Martin Weber. Their research found that better-informed agents could not ignore their private information, even when financial incentives pushed them to do so. The bias persisted across every condition they tested.
For a long time I mistook this for a communication problem, something fixable by choosing better words or being more patient. It is not. It is a structural feature of how human cognition works.
The Tapping Study
The best-known demonstration comes from Elizabeth Newton's 1990 doctoral study at Stanford.
She split participants into two groups: tappers and listeners. Tappers picked a well-known song ("Happy Birthday," "The Star-Spangled Banner") and tapped out its rhythm on a table. Listeners tried to guess the song.
Tappers predicted listeners would guess correctly about 50% of the time.
The actual success rate was 2.5%. Three correct guesses out of 120.
Tappers heard the full melody in their heads as they tapped. They could not shut it off. They could not experience the tapping the way listeners did: as a series of disconnected knocks with no melody or meaning.
Why It Is So Hard to Overcome
Three mechanisms make this bias persistent.
Knowledge cannot be suppressed. When reasoning about someone else's perspective, existing knowledge leaks in automatically. It cannot be set aside. The brain evolved to acquire and use knowledge, not to simulate its own ignorance.
Familiarity feels like obviousness. When something comes to mind easily, the automatic assumption is that it comes easily for everyone. Psychologists call this fluency misattribution. A 2017 study with 359 undergraduates found that mere processing fluency was enough to trigger the bias. People did not even need to actually know something. If it felt familiar, they overestimated how widely known it was.
Self-anchoring. When estimating what someone else knows, the starting point is always one's own knowledge, and the adjustments made from that anchor are rarely large enough.
Camerer's original research tested whether awareness helps. Participants were told about the bias, warned to adjust, and given financial incentives to be accurate. None of it worked. People still overestimated what others knew.
Tullis and Feder's 2023 research added another layer. Before you learn something, you can use your own confusion as a signal. "I find this hard, so others probably do too." That works reasonably well. But once you learn the answer, that confusion disappears. You no longer feel uncertain about it. So you lose the one reliable clue you had for judging how difficult something is for someone else.
Where It Shows Up
Software Engineering
A senior engineer reviews a pull request and writes: "Why didn't you just use X?" The word "just" reveals the bias. What feels simple after years of experience is invisible to someone still building their mental models.
Documentation is a consistent failure point. Engineers write docs that make perfect sense to them but assume context the reader does not have. The docs skip foundational knowledge because that knowledge no longer registers as knowledge to the author.
Product Design
Builders know their product inside and out. Every button, every flow, every label makes sense to them.
A new user sees a screen full of options with no context. They do not know the mental model behind the layout. They do not know the terminology.
This is why companies build features nobody uses. The team assumed users would "get it" because the team got it. Xiong, Van Weelden, and Franconeri's 2019 research showed this extends to data visualization too. Chart creators assume viewers see the same patterns they see. But the same data tells different stories depending on what you know going in.
Teaching
A professor explains a concept they have known for twenty years. They skip the foundational steps because those steps feel so basic they barely register as knowledge anymore. The students are lost.
This problem is well documented enough that it spawned its own pedagogical framework. "Decoding the Disciplines" was developed specifically to force experts to articulate the implicit mental steps they take when solving problems. Steps that feel automatic to the expert but are invisible to the student.
Leadership
Leaders operate in a bubble of context. They attend every strategy meeting, every planning session, every executive review. They have the full picture. Their teams do not.
When a leader announces a decision without explaining the reasoning, they assume the logic is apparent. The team sees the what but not the why. Without the why, the decision feels arbitrary.
Research suggests hierarchy amplifies this. People with authority are less likely to realize they are being unclear. People without authority are less likely to signal confusion.
How to Overcome It
The bias cannot be eliminated entirely. But specific strategies have evidence behind them.
Generate Counterexplanations
This has the strongest experimental support. Before explaining something, actively reason about why someone might not understand it. Think about what assumptions you are making. Think about what context you have that they do not.
This works because it forces the generation of alternative perspectives rather than relying on passive adjustment. In experimental settings, counterexplanation has been shown to eliminate the bias.
Test With Real Novices
Not teammates. Not people who have been in the same meetings. Find someone with zero context and watch them interact with the product, read the document, or follow the explanation.
Their confusion overrides faulty intuitions about what is "obvious." This is why usability testing works. It provides direct evidence that bypasses the bias.
Use Fresh Eyes Quickly
New employees are temporarily immune. In their first two weeks, everything confusing to outsiders is still confusing to them. Ask them to flag every moment of confusion, every piece of jargon, every assumption that does not land.
This window closes fast. The bias sets in as soon as they start accumulating context.
Use Analogies
Connect unfamiliar domains to familiar ones. Do not explain load balancing with technical terms. Compare it to checkout lanes at a grocery store.
Analogies work because they build on knowledge the audience already has rather than requiring them to absorb new concepts from scratch.
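For a technical audience, the analogy can even be made concrete in code. Here is a minimal sketch of round-robin balancing, which is exactly the checkout-lane strategy: each arriving shopper (request) goes to the next lane (server) in turn, so no single lane backs up. The lane names and `assign` helper are illustrative, not any particular load balancer's API.

```python
from itertools import cycle

# Three "checkout lanes" standing in for backend servers.
servers = ["lane-1", "lane-2", "lane-3"]
next_server = cycle(servers)  # endlessly repeats the list in order

def assign(request: str) -> str:
    """Round-robin: hand each incoming request to the next server in rotation."""
    return next(next_server)

# Five shoppers arrive; each is sent to the next open lane.
assignments = [assign(f"shopper-{i}") for i in range(5)]
print(assignments)
# ['lane-1', 'lane-2', 'lane-3', 'lane-1', 'lane-2']
```

The familiar image (lanes, shoppers, taking turns) carries the whole mechanism; the code merely restates it.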
Ask Before Explaining
Before presenting something, ask what the audience already knows. The answer is almost always less than expected. This simple step recalibrates the starting point and prevents assuming too much shared context.
Default to Over-Explaining
Assume the audience does not share your knowledge unless there is strong evidence otherwise. Over-explaining is almost always less costly than under-explaining. The worst case of over-explaining is mild redundancy. The worst case of under-explaining is total confusion.
Bottom Line
The more someone learns, the worse they get at teaching what they know. The more expert they become, the harder it is to remember what it felt like to not understand.
It is easy to assume this is a maturity issue. Something people grow out of. Birch and Bloom's 2007 research showed otherwise. They tested adults on perspective taking tasks similar to those used with children and found that adults still get it wrong. Their own knowledge leaked into their judgments about what another person would believe. Getting older and smarter does not fix it. The bias stays for life.
The best communicators are not the most knowledgeable people in the room. They are the ones who have built habits to bridge the gap between what they know and what their audience knows.