Most teams obsess over their NPS number without really understanding what it tells them. They run the survey, get a score, put it in a dashboard, and move on. The number goes up, they celebrate. It goes down, they panic. Neither reaction is particularly useful.

NPS is a good metric. But only if you know how to read it.

The basics, quickly

NPS comes from one question: "How likely are you to recommend us to a friend or colleague?" on a 0–10 scale. Respondents split into three groups:

  • Promoters (9–10): your loyal fans
  • Passives (7–8): satisfied but not enthusiastic
  • Detractors (0–6): unhappy, and potentially telling others about it

The formula: NPS = % Promoters − % Detractors. Passives don't count. The range runs from −100 to +100.
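The formula above is easy to sanity-check in code. A minimal sketch (the function name and sample scores are illustrative, not from any particular survey tool):

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 survey responses."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)   # 9-10
    detractors = sum(1 for s in scores if s <= 6)  # 0-6
    # Passives (7-8) appear in the denominator but cancel out of the numerator.
    return round(100 * (promoters - detractors) / len(scores))

# 4 promoters, 3 passives, 3 detractors out of 10 responses:
# 40% - 30% = +10
print(nps([10, 9, 9, 10, 8, 7, 7, 6, 4, 2]))  # → 10
```

Note that passives still dilute the score: adding more 7s and 8s pulls the result toward zero even though they never subtract directly.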

What the numbers mean

  Score     What it means
  Below 0   More detractors than promoters. You have a real problem.
  0–30      Decent, but there's a lot of room to grow. Don't get comfortable here.
  30–70     Good. This is where most well-run businesses land.
  70+       Exceptional. Hard to sustain at scale. If you're here, protect it.

Above 0 is better than average. Above 50 is excellent for most industries. That's the short version.

Benchmarks are less useful than you think

I see teams spend hours researching industry benchmarks. In SaaS, the average is roughly 30–45. Healthcare runs around 60. Airlines are low. Consulting is high.

But here's the thing: your real benchmark is your own score last quarter. A company with a 25 that's been climbing from 10 is in better shape than a company sitting at 50 that used to be 65. The direction matters more than the number.

Regional differences exist too. European respondents tend to score 5 to 10 points lower than North American ones. Stricter consumer expectations, not worse service. Compare yourself to yourself.

Why scores fluctuate

NPS is a lagging indicator. Today's score reflects the last few months of customer experience, not yesterday.

Common causes of drops:

  • A rough product release that affected a lot of people at once
  • A price increase, especially one that felt sudden or unfair
  • Support quality slipping. Detractors skew heavily toward people who recently had a bad support experience.
  • Bad survey timing. Surveying right after an outage or right after a discount skews everything.

Track the trend, not the snapshot. Monthly direction is what matters.

Five things that actually improve NPS

1. Follow up with detractors fast

Reach out within 48 hours. Don't be defensive. Just ask what went wrong and what you're doing about it. Even when you can't fix the problem right away, a genuine response makes a difference. Teams that do this consistently see meaningful improvement.

2. Read the open-text responses

The score is a symptom. The comments are the diagnosis. Group them by theme, find the 2–3 root causes that keep coming up, and focus there. That's your improvement roadmap.
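One low-effort way to start the grouping is simple keyword tagging. A sketch, assuming hypothetical theme keywords — swap in the terms that actually recur in your own comments:

```python
from collections import Counter

# Hypothetical theme buckets; tune the keywords to your own feedback.
THEMES = {
    "pricing": ["price", "expensive", "cost"],
    "support": ["support", "ticket", "response"],
    "bugs": ["bug", "crash", "broken"],
}

def tag_themes(comments):
    """Count how many comments mention each theme (one comment can hit several)."""
    counts = Counter()
    for comment in comments:
        text = comment.lower()
        for theme, keywords in THEMES.items():
            if any(k in text for k in keywords):
                counts[theme] += 1
    return counts

comments = [
    "Support tickets take days to get a response.",
    "Too expensive for what it does.",
    "The app crashes constantly and support never helps.",
]
print(tag_themes(comments).most_common())
```

Keyword matching is crude — it misses synonyms and sarcasm — but it is usually enough to surface the 2–3 dominant themes before you invest in anything more sophisticated.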

3. Pay attention to passives

7s and 8s are your biggest opportunity. They're already satisfied, they just aren't enthusiastic yet. Ask them directly: what would it take to move from "fine" to "great"? Small product or service tweaks can shift a large passive group upward.

4. Make it a team sport

Customer success can't fix product bugs. Marketing can't fix a support backlog. NPS improvement only works when every team that touches the customer experience is working from the same data toward the same goals.

5. Put your promoters to work

9s and 10s are already loyal. Thank them. Ask for reviews. Invite them into beta programs. Promoters who feel seen become even stronger advocates.

The point

A "good" NPS score is one that's moving in the right direction. The number itself is just a starting point. What makes NPS valuable is the discipline of asking, listening, and doing something about it. Consistently.