If You're Not Measuring, You're Guessing
There's a pattern I see all the time in product teams, leadership meetings, and even personal projects: decisions made on vibes.
"I think users want this feature." "This blog post feels like it did well." "Our customers probably care about X."
The problem with vibes? They're often wrong. And worse, you'll never know they were wrong because you're not measuring anything.
As Jeff Bezos put it: "Things don't improve unless they're measured."
The Cost of Guessing
Every decision you make without data is a bet. Sometimes you win. But most of the time, you're optimizing for the wrong things, doubling down on what doesn't work, and ignoring what does.
I've watched teams spend months building features nobody asked for. I've seen marketing campaigns that "felt successful" but drove zero conversions. I've been guilty of this myself, assuming I knew what mattered without ever checking.
The thing is, data doesn't have to be complicated. You don't need a data science team or a fancy BI tool. You just need to measure something and then actually look at it.
The Feature Factory Trap
Without measurement, teams often default to building more and more features, hoping something sticks. Marty Cagan, founder of Silicon Valley Product Group, has a name for this behavior: feature factories.
"Teams today are all too often feature factories, with little regard for whether or not the features actually solve the underlying business problems. Progress is measured by output and not outcome."
Output is easy to count. Features shipped. Lines of code. Releases per quarter. But output isn't impact. You can ship a hundred features and still not move the needle on what actually matters.
A Personal Example: This Blog
This doesn't just apply to product teams. Even something as simple as a blog can fall into the same trap.
I'll be honest: when I started this journal, I wrote for myself. I didn't really care if anyone read it. But as I kept writing, I got curious. Which posts resonate? What topics bring people back? Am I just shouting into the void?
So I added analytics. Specifically, I use Cloudflare Web Analytics on this site.
Why Cloudflare? A few reasons:
- Privacy-first: It doesn't use cookies or track individual users. I get aggregate insights without compromising reader privacy.
- Lightweight: No heavy JavaScript bundles slowing down the site. It's a tiny script that doesn't impact performance.
- Simple: I don't need granular user tracking. I want to know what content performs, not build advertising profiles.
Here's the thing: the data surprised me.
Posts I thought would be popular? Moderate traffic. Posts I almost didn't publish? Some of my most-read pieces.
I can see:
- Which posts get the most views over any time period
- Where traffic comes from: direct visits, search, social
- How engagement trends over time: is a post evergreen or a one-day spike?
This isn't just vanity metrics. It shapes what I write next. If a topic on first principles thinking gets sustained interest, maybe there's more to explore there. If a post I was excited about lands flat, I ask why. Was the title unclear? The topic too niche? Did I bury the lede?
Data turns assumptions into insights.
How Data Powers Better Decisions
The principle scales beyond blogs. In product teams, some activities literally cannot function without data.
A/B Testing
A/B testing without data is just... changing things randomly and hoping for the best.
The entire premise of A/B testing is comparison: show version A to one group, version B to another, and measure which performs better. Without measurement, there's no "better." There's no winner. You're just flipping a coin and calling it strategy.
I've seen teams "A/B test" by launching a new design and asking around if people like it. That's not a test. That's a survey of whoever happened to respond, filtered through their desire to be polite or contrarian. Real A/B testing requires:
- Statistical significance: Enough data points to know the difference isn't random noise
- Clear success metrics: What does "better" actually mean? Clicks? Conversions? Time on page?
- Controlled conditions: Ensuring both groups are comparable, not skewed by timing or audience differences
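To make "statistical significance" concrete, here's a minimal sketch of how you'd check whether the gap between two variants is real or noise. It's a standard two-proportion z-test using only Python's standard library; the conversion counts are made-up numbers for illustration, not from any real experiment.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is the difference between two conversion rates noise?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 200/4000 conversions on A vs 260/4000 on B
z, p = two_proportion_z(200, 4000, 260, 4000)
print(f"z = {z:.2f}, p = {p:.3f}")  # small p -> unlikely to be random noise
```

If p comes out above your threshold (0.05 is the common default), you haven't shown B is better; you've shown you need more data or a bigger effect. That's the difference between testing and theater.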
Without data infrastructure to capture and compare results, A/B testing is theater.
Requirements Gathering
"The customer said they want feature X."
Great. But did you measure how many customers said that? Did you track how they currently work around the absence of X? Did you observe what they actually do versus what they said they'd do?
Requirements gathering feels qualitative: interviews, surveys, feedback sessions. But good requirements gathering is deeply quantitative:
- Frequency: How often does this problem come up? Is it one loud customer or a thousand quiet ones?
- Impact: When users hit this pain point, what happens? Do they churn? Complain? Find a workaround?
- Validation: Once you ship the requirement, did usage match expectations? Did it solve what you thought it would?
Without data, requirements become a game of who speaks loudest. The most articulate stakeholder wins, regardless of whether their request represents real user needs. Data democratizes input. It lets you hear from the silent majority who never email product managers but vote with their behavior.
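Quantifying feedback can be as simple as counting who said what. Here's a small sketch that separates raw request volume from distinct-customer reach, the exact "one loud customer or a thousand quiet ones" question. The feedback items and customer names are invented for illustration.

```python
from collections import Counter

# Hypothetical tagged feedback from interviews and support tickets
feedback = [
    {"customer": "acme", "request": "bulk export"},
    {"customer": "acme", "request": "bulk export"},   # same customer, again
    {"customer": "globex", "request": "dark mode"},
    {"customer": "initech", "request": "bulk export"},
]

# Raw volume: how many times was each request heard?
mentions = Counter(f["request"] for f in feedback)

# Reach: how many *distinct* customers asked for it?
reach = Counter(r for _, r in {(f["customer"], f["request"]) for f in feedback})

print(mentions.most_common())  # bulk export mentioned 3 times...
print(reach.most_common())     # ...but by only 2 distinct customers
```

A request mentioned three times by one customer and a request mentioned once each by a hundred customers look identical if all you track is "the customer said they want X."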
Usage Tracking
This one is almost tautological, but it's worth stating: you cannot understand how your product is used without tracking how it's used.
Sounds obvious. Yet I've worked on products where the answer to "how many people use feature Y?" was "I don't know, let me ask around." Where major features were built on assumptions about user workflows that had never been validated. Where deprecation decisions were made based on developer intuition, not usage data.
Usage tracking answers fundamental questions:
- What features are actually used? Not what you built, not what's in the docs, but what do users actually click on, day after day?
- Where do users get stuck? Drop-off points, error rates, rage clicks. The friction you can't see by looking at your own polished demo.
- What's the user journey? How do people actually move through your product? Is it the happy path you imagined, or something completely different?
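Instrumenting usage doesn't require heavy infrastructure to start. Here's a toy sketch: record one event per user action, then answer the first two questions above with a counter and a crude funnel. The event names, users, and in-memory list are all hypothetical; a real product would ship events to an analytics pipeline or database instead.

```python
from collections import Counter

events = []  # in-memory stand-in for an analytics sink

def track(user: str, action: str) -> None:
    """Instrument a user action: one line, one event."""
    events.append({"user": user, "action": action})

# Simulated activity
track("u1", "open_editor"); track("u1", "export_pdf")
track("u2", "open_editor")
track("u3", "open_editor"); track("u3", "export_pdf")
track("u4", "open_editor")

# "What features are actually used?" -- count actions across all users
usage = Counter(e["action"] for e in events)
print(usage.most_common())

# "Where do users get stuck?" -- who opens the editor vs. who ever exports
opened = {e["user"] for e in events if e["action"] == "open_editor"}
exported = {e["user"] for e in events if e["action"] == "export_pdf"}
print(f"open -> export drop-off: {1 - len(exported) / len(opened):.0%}")  # 50%
```

Even this toy version surfaces the kind of fact teams guess about: half the users who open the editor never export anything.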
Without usage data, you're designing for imaginary users. You might get lucky. But you're fundamentally guessing about people whose behavior you've never observed.
The Trap of Over-Measurement
Now, a caveat. Data-driven doesn't mean data-obsessed.
I've also seen teams drown in dashboards. Fifty metrics, none of them actionable. Analysis paralysis where every decision requires a three-week study. That's not data-driven, that's data-burdened.
The goal is signal, not noise. Pick a few metrics that actually matter. For this blog, I care about:
- Total views (is anyone reading?)
- Top posts (what resonates?)
- Traffic sources (how do people find me?)
That's it. Three things. Anything more would be a distraction for a personal site.
The right amount of measurement depends on the stakes. A personal blog? Keep it simple. A product serving millions? You need more rigor. But even then, ruthless prioritization of what to measure beats comprehensive measurement of everything.
Start Somewhere
If you're not measuring anything right now, start small:
- Running a website? Add basic analytics. Cloudflare, Plausible, even simple server logs.
- Building a product? Instrument one key user action. Just one.
- Leading a team? Track one metric that matters. Shipping velocity, bug rate, customer satisfaction. Pick one.
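"Even simple server logs" really is enough to start. Here's a minimal sketch that counts pageviews per path from access-log lines in the common log format; the sample lines and paths are made up for illustration.

```python
from collections import Counter

# Hypothetical access-log lines (common log format)
log_lines = [
    '1.2.3.4 - - [01/Jan/2025:10:00:00 +0000] "GET /posts/measuring HTTP/1.1" 200 5120',
    '5.6.7.8 - - [01/Jan/2025:10:01:00 +0000] "GET /posts/measuring HTTP/1.1" 200 5120',
    '9.9.9.9 - - [01/Jan/2025:10:02:00 +0000] "GET /posts/vibes HTTP/1.1" 200 4096',
]

# The request line sits between the first pair of quotes; the path is its
# second whitespace-separated field ("GET /path HTTP/1.1")
views = Counter(line.split('"')[1].split()[1] for line in log_lines)

for path, count in views.most_common():
    print(f"{count:>5}  {path}")
```

Ten lines of script against logs you already have beats a measurement plan you never implement.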
You can always add more later. But you can't learn from data you never collected.
And remember: data informs decisions; it doesn't make them. Pair your metrics with human judgment. Numbers tell you what is happening. Talking to users tells you why.
Measure. Learn. Adjust. It's not glamorous, but it's how you stop optimizing for the wrong things.
Better yet, with AI-assisted development, the cost of execution has dropped and the speed of iteration has increased. Leverage that. Build, measure, learn, and iterate faster than ever before.
Enjoyed this post?
If this brought you value, consider buying me a coffee. It helps me keep writing.