
Product Metrics That Matter: A PM's Guide to KPIs
How to identify your North Star metric, distinguish leading from lagging indicators, and avoid vanity metrics that distract from real impact.
Why Metrics Matter
What you measure shapes what you build. Good metrics focus teams on outcomes that matter. Bad metrics drive behavior that looks good but doesn't help users or the business.
The best PMs are metric-obsessed but metric-skeptical. They track data rigorously while questioning whether the metrics capture what actually matters. They know when to trust the numbers and when to trust their judgment.
The North Star Metric
Your North Star is the single metric that best captures the value your product delivers to users. It should be:
- Tied to user value
- Leading (predicts future success)
- Actionable (your team can influence it)
Examples
| Company | North Star |
|---|---|
| Airbnb | Nights Booked |
| Spotify | Time Spent Listening |
| Slack | Daily Active Users × Messages Sent |
Each captures the core value exchange between product and user.
Why It Matters
A good North Star aligns the whole organization. Marketing, product, engineering, and sales can all point to the same number. When tradeoffs arise, you ask: "Which option better serves the North Star?"
Leading vs. Lagging Indicators
| Type | Description | Examples |
|---|---|---|
| Lagging | Outcomes that happened | Revenue, churn, NPS |
| Leading | Predict future outcomes | Activation rate, engagement frequency, feature adoption |
Building Your System
Good metric systems include both:
- Lagging indicators show whether you're succeeding overall
- Leading indicators show what to work on
If activation is dropping, fix onboarding. If retention is strong but acquisition is weak, invest in marketing.
Input vs. Output Metrics
Output metrics measure results: conversion rate, revenue, NPS.
Input metrics measure activities that drive outputs: experiments run, features shipped, user interviews conducted.
How to Use Them
Teams should own output metrics but track inputs to diagnose problems. If conversion dropped, look at inputs:
- Did traffic quality change?
- Did we ship something broken?
- Did competitor actions shift the landscape?
Don't over-rotate on input metrics. Shipping more features doesn't guarantee better output metrics. Activity isn't impact.
Vanity Metrics to Avoid
Vanity metrics look impressive but don't indicate health:
- Total downloads
- Registered users
- Page views
They're easy to grow but don't translate to value.
Why They're Dangerous
- A million downloads means nothing if 900K users churned in week one
- A billion page views means nothing if users bounce immediately
- Social proof numbers (followers, likes) often don't correlate with business outcomes
Focus on metrics that represent meaningful engagement or value exchange.
Retention: The Metric That Matters Most
Retention—users coming back over time—is arguably the most important metric category:
- Strong retention = your product delivers ongoing value
- Weak retention = you're leaking users faster than you acquire them
How to Measure
Measure retention cohorts: What percentage of users from January are still active in February, March, April?
| Retention Level | What It Means |
|---|---|
| 50%+ plateau | Strong retention |
| <10% plateau | Weak retention |
Retention issues are existential. You can't outgrow a leaky bucket. Fix retention before investing heavily in acquisition—otherwise you're burning money.
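The cohort math above can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical activity log that maps each user to the months they were active (all names and numbers invented for the example):

```python
# Hypothetical activity log: user id -> set of months ("YYYY-MM") the user was active
activity = {
    "u1": {"2024-01", "2024-02", "2024-03"},
    "u2": {"2024-01", "2024-02"},
    "u3": {"2024-01"},
    "u4": {"2024-02", "2024-03"},
}

def cohort_retention(activity, cohort_month, later_months):
    """Percent of the cohort (users first active in cohort_month)
    still active in each later month."""
    cohort = [u for u, months in activity.items() if min(months) == cohort_month]
    return {
        m: 100.0 * sum(1 for u in cohort if m in activity[u]) / len(cohort)
        for m in later_months
    }

# January cohort is u1, u2, u3; u1 and u2 return in February, only u1 in March
print(cohort_retention(activity, "2024-01", ["2024-02", "2024-03"]))
```

Plotted over enough months, the curve either plateaus (some group of users stays for good) or decays to zero, and the plateau level is what the table above refers to.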
Engagement Metrics
Engagement measures how actively users interact:
- DAU/MAU ratio
- Session frequency
- Actions per session
- Time spent
DAU/MAU Ratio ("Stickiness")
Compares daily active users to monthly active users. A ratio of 50% means the average user is active on half the days in a month.
| Product Type | Typical Range |
|---|---|
| Social media | 20-30%+ |
| B2B SaaS | 15-25% |
Don't chase engagement mindlessly. Some products are valuable precisely because they're low-engagement—you use them when you need them, not all day. Match your engagement expectations to your product's job.
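One common way to compute stickiness over a window is average DAU divided by MAU. A toy sketch, with an invented per-day activity map:

```python
def stickiness(daily_active_counts):
    """DAU/MAU over a window: average daily actives divided by
    unique users active at any point in the window."""
    all_days = list(daily_active_counts.values())
    mau = len(set().union(*all_days))                 # unique users across the window
    avg_dau = sum(len(d) for d in all_days) / len(all_days)
    return avg_dau / mau

# Toy 30-day window: u1 active every day, u2 every other day, u3 once
days = {f"day{i}": ({"u1", "u2"} if i % 2 == 0 else {"u1"}) for i in range(30)}
days["day0"] |= {"u3"}
print(f"{stickiness(days):.0%}")
```

Note this assumes "active" is already defined; in practice the choice of qualifying event (any login vs. a meaningful action) changes the number significantly.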
Conversion and Activation
Conversion metrics track progress through funnels:
- Signup → Onboarded
- Trial → Paid
- Visitor → Registered
Each step where users drop off is an optimization opportunity.
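Finding the biggest drop-off is just step-to-step division. A minimal sketch with made-up funnel counts:

```python
def step_conversions(funnel):
    """Step-to-step conversion rates through an ordered funnel
    of (step_name, user_count) pairs."""
    return [(step, n / n_prev)
            for (_, n_prev), (step, n) in zip(funnel, funnel[1:])]

funnel = [("Visitor", 10_000), ("Registered", 2_500),
          ("Onboarded", 1_500), ("Paid", 300)]

for step, rate in step_conversions(funnel):
    print(f"-> {step}: {rate:.1%}")
```

In this invented example, Onboarded → Paid converts at 20%, so that step is where optimization effort would pay off first.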
The Activation Moment
Activation is users reaching the "aha moment" where they experience core value:
| Company | Activation Moment |
|---|---|
| Dropbox | Uploading a file |
| Slack | 2,000 messages in a workspace |
Identify your activation moment and measure the percentage reaching it.
If acquisition is strong but activation is weak, you're bringing in the wrong users or failing at onboarding. Debug the funnel before scaling acquisition.
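Once the activation event is chosen, measuring the rate is a simple count over an event log. A sketch with a hypothetical log and a Dropbox-style activation event:

```python
# Hypothetical event log: user id -> list of event names
events = {
    "u1": ["signup", "upload_file"],
    "u2": ["signup"],
    "u3": ["signup", "upload_file", "share"],
}

ACTIVATION_EVENT = "upload_file"  # assumed "aha moment" for this example

activated = sum(1 for evts in events.values() if ACTIVATION_EVENT in evts)
rate = activated / len(events)
print(f"Activation rate: {rate:.0%}")
```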
Revenue Metrics
| Metric | What It Measures |
|---|---|
| MRR | Monthly Recurring Revenue |
| ARPU | Average Revenue Per User |
| LTV | Lifetime Value |
| CAC | Customer Acquisition Cost |
LTV/CAC Ratio
This is crucial. If it costs $100 to acquire a customer worth $50, you lose money scaling.
| Ratio | Assessment |
|---|---|
| 3:1+ | Healthy |
| <1:1 | Unsustainable—you're paying users to use your product |
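One widely used simplification estimates LTV as margin-adjusted ARPU divided by monthly churn (since expected lifetime in months is roughly 1 / churn). A sketch with invented inputs:

```python
def ltv(arpu_monthly, gross_margin, monthly_churn):
    """Simplified LTV: margin-adjusted monthly revenue per user
    times expected lifetime (1 / monthly churn)."""
    return arpu_monthly * gross_margin / monthly_churn

cac = 100.0  # assumed blended acquisition cost
value = ltv(arpu_monthly=10.0, gross_margin=0.8, monthly_churn=0.03)
print(f"LTV = ${value:.0f}, LTV/CAC = {value / cac:.1f}")
```

With these made-up numbers the ratio lands below 3:1, which by the table above would argue against scaling acquisition spend yet.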
Expansion Revenue
Track expansion revenue (existing customers paying more) vs. new revenue. Great products grow within accounts over time. If expansion is weak, your product might solve the initial problem but not grow with user needs.
Setting Metric Goals
Goals should be ambitious but achievable:
- 10% improvement → usually within reach through optimization
- 2x improvement → usually requires something new
Using OKRs
Structure metric goals with OKRs (Objectives and Key Results):
- Objective: Qualitative ("Delight new users")
- Key Results: Metrics ("Increase day-1 retention from 40% to 50%")
Track weekly and adjust if you're off track.
Baseline before setting goals. If you don't know your current retention, you can't set a meaningful target. Spend time instrumenting and understanding current performance before committing to improvements.
Metric Review Cadence
Weekly
Review key metrics. Build a dashboard with North Star, leading indicators, and health metrics. Spot anomalies early and investigate root causes.
Monthly/Quarterly
Do deeper analysis:
- Are cohorts improving?
- How do segments differ?
- What experiments moved metrics?
This is where you develop strategic insight, not just react to wiggles.
Annual
Question the metrics themselves:
- Is the North Star still right?
- Are we measuring what matters?
Metrics that made sense at Series A might be wrong at Series D.