You Can Only See What You Know

8 min read
#strategy #ai

There's a cognitive bias that doesn't get talked about enough. It's not confirmation bias, though it's related. It's not attention blindness, though it looks similar. It's more fundamental than both.

You can only perceive what your existing knowledge allows you to see.

This isn't about focus or attention. It's about perception itself. If you don't have a mental model for something, you literally cannot see it. It doesn't register. It's not that you're ignoring it. It's that it doesn't exist in your reality.

An engineer solves a scaling problem with caching at their last company. It works brilliantly. Now at a new company, they face a performance issue and immediately propose caching. But the real problem is an O(n²) algorithm buried in a loop. Caching won't help. They can't see the algorithmic issue because their mental toolkit doesn't include it. They see the problem through the lens of what worked before.
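To make the example concrete, here is a contrived sketch (hypothetical names, simplified logic) of why caching misses the point: the slow path is a quadratic membership test, not repeated expensive work, so the fix is a better data structure, not a cache.

```python
def find_duplicates_quadratic(items):
    """O(n^2): each membership test scans a growing list."""
    seen = []
    dupes = []
    for item in items:
        if item in seen:  # list lookup is O(n), done n times
            dupes.append(item)
        else:
            seen.append(item)
    return dupes


def find_duplicates_linear(items):
    """O(n): a set makes each membership test O(1) on average."""
    seen = set()
    dupes = []
    for item in items:
        if item in seen:  # set lookup is O(1) on average
            dupes.append(item)
        else:
            seen.add(item)
    return dupes
```

Both functions return the same result; only the second scales. No cache in front of the first version would change its shape, which is exactly what the caching lens can't see.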

An engineer builds web crawlers to extract data from external websites. They get good at it. Now the team needs data from an internal system. They propose crawling. But it's an internal system. They could just query the database directly or call an API. The engineer can't see those options as clearly because crawling is what they know. The hammer sees everything as a nail.

This has massive implications for decision making, leadership, and why we trust the wrong people.

The Experience Trap

We tend to believe that experience equals better perception. Someone with 20 years in an industry must see more than someone with 2 years. They've pattern-matched thousands of situations. They've developed intuition. They're the expert.

But here's what we miss: experience creates a lens, and lenses have blind spots.

Years of experience means years of reinforcing certain mental models. It means getting very good at seeing specific patterns. But it also means developing increasingly rigid frameworks for interpreting reality.

Ten years of experience can mean ten years of the same blind spots.

The expert sees patterns faster. They reach conclusions quicker. They feel more confident. But confidence comes from pattern recognition, and pattern recognition only works within the patterns you know. Step outside those patterns, and the expert is just as blind as anyone else. Sometimes more so, because they don't know they're blind.

This is why experienced people can make bad decisions. Not because they lack intelligence or information. But because they cannot perceive the parts of the problem that exist outside their mental models.

You Don't Know What You Don't Know

This is the part that makes everything worse.

There are things you know you know. There are things you know you don't know. But the dangerous category is the things you don't know you don't know. The unknown unknowns.

When you know you don't know something, you can ask questions. You can seek expertise. You can be cautious. The gap in knowledge is visible to you, so you can address it.

But when you don't know what you don't know, there's nothing to address. No question to ask. No gap to fill. The missing piece doesn't even register as missing.

An expert appears confident because their mental model feels whole, even when it’s missing pieces.

The blind spot isn't visible as a blind spot. It's just... nothing. An absence that doesn't announce itself. You can't be cautious about something you don't know exists.

The scariest part? The more expertise you develop, the more your picture feels complete. You've filled in so many gaps. You've answered so many questions. The remaining blind spots feel smaller and smaller, even as some grow larger.

Why We Follow Blindly

Here's where it gets dangerous.

We assume experienced people see more. We defer to their judgment. We trust their confidence. But that certainty comes from pattern recognition within a limited frame. The leader isn't lying or being arrogant. They genuinely believe they see the full picture.

Authority bias plus perception bias is a dangerous combination.

We're wired to defer to expertise and seniority. When someone with credentials and experience makes a call, we follow. We assume they're operating with more complete information. But the expert's lens might be exactly wrong for this particular problem, and no one questions it because confidence and experience tend to travel together.

Groupthink Accelerates the Problem

Teams build shared frameworks and mental models. This feels productive. Alignment is efficient. Communication is easier when everyone shares assumptions.

But shared knowledge means shared blind spots. If everyone on the team has the same background, the same training, the same mental models, they will all fail to perceive the same things.

The "outsider" who sees something different isn't smarter. They just have a different lens. What's invisible to the team is obvious to them, and vice versa. This is why cross-functional input catches what specialists miss. Not because generalists are better, but because their blind spots don't overlap with the team's.

Lessons for Leaders

If you're in a leadership position, this has direct implications for how you operate.

Your confidence is inversely correlated with your accuracy on novel problems. The more certain you feel, the more likely you're pattern-matching to something familiar. That's useful when the problem actually is familiar. It's dangerous when the problem only looks familiar.

Your seniority silences the room. When you speak first or speak confidently, you're not just sharing your view. You're anchoring everyone else's perception. The junior person who sees something different now has to argue against authority, not just offer an alternative. Most won't.

Hire for different lenses, not culture fit. Culture fit often means "thinks like us." That's comfortable. It's also how you build a team with perfectly overlapping blind spots. The person who annoys you by seeing things differently might be the only one who can see what you can't.

Ask questions before making statements. Once you've declared your view, you've collapsed the possibility space. People will either agree or disagree with your position. They won't generate genuinely different perspectives. If you want to see what others see, you have to ask before you anchor.

Treat disagreement as data, not resistance. When someone pushes back, the instinct is to explain why they're wrong. But if they're seeing something you can't see, no amount of explanation will help. Instead of defending your position, get curious about their lens. What knowledge do they have that makes this look different to them?

Build decision processes that don't rely on you being right. You will be wrong. Not occasionally. Regularly. On things you feel certain about. Design systems that surface dissent, require devil's advocates, and delay closure until multiple perspectives have been heard.

Before finalizing, ask: did I hear from everyone? Not just the people who spoke up. The quiet ones. The junior ones. The ones whose lens is most different from yours. Silence isn't agreement. It's often the sign that someone sees something but doesn't feel safe saying it.

The best leaders aren't the ones who see the most. They're the ones who know they can't see everything and build accordingly.

Using AI to Expand Perception

This is where AI becomes genuinely useful for decision making. Not as a faster calculator, but as a different lens.

Andrej Karpathy built something called LLM Council. The concept is simple: instead of asking one AI model for an answer, you ask multiple models. Then you have them review and critique each other's responses. Finally, a "chairman" model synthesizes everything into a final answer.

Why does this work? Each model has different training. Different data. Different weights. In essence, different knowledge, which means different perception, which means different blind spots.
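The three steps above can be sketched in a few lines. This is a minimal illustration, not Karpathy's actual implementation: the models here are placeholder functions, and in practice each would be a call to a different LLM API.

```python
def run_council(question, models, chairman):
    """Ask every model, let each critique the others, then synthesize.

    models:   dict mapping a name to a callable(prompt) -> answer
    chairman: callable(prompt) -> final synthesized answer
    """
    # Step 1: independent answers, one per lens.
    answers = {name: model(question) for name, model in models.items()}

    # Step 2: cross-review. Each model critiques the others' answers.
    reviews = {}
    for name, model in models.items():
        others = {n: a for n, a in answers.items() if n != name}
        reviews[name] = model(f"Critique these answers: {others}")

    # Step 3: a chairman model synthesizes answers and critiques.
    return chairman(f"Question: {question}\n"
                    f"Answers: {answers}\n"
                    f"Reviews: {reviews}")
```

The design choice worth noting: the critique step forces each lens to be examined by a different lens before anything is synthesized, which is precisely the blind-spot check a lone expert never gets.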

This mirrors what we need in human decision making. Not just multiple opinions, but multiple lenses. People who see the problem differently because they know different things.

Everyone needs an LLM Council in their life. Not necessarily the technical implementation, but the principle: multiple perspectives that don't share your blind spots. It's meta-cognition outsourced. A way to think about your thinking when you can't see the limits of your own thinking.

Takeaway

Experience is valuable. Expertise matters. Pattern recognition is powerful.

But none of these make you a complete perceiver. They make you a specialized perceiver. Very good at seeing certain things. Completely blind to others.

The more confident the expert, the harder the failure. They can't see what they're missing, and neither can anyone who learned from the same playbook.

Before trusting experience, ask what that experience might be blind to. Before following the expert, consider what their expertise prevents them from seeing. Before making the call yourself, question what your own knowledge makes invisible.

You can only see what you know. The question is whether you know what you cannot see.

Enjoyed this post?

If this brought you value, consider buying me a coffee. It helps me keep writing.