Want to turn your NPS surveys into something more useful than a dashboard ornament? This guide is for anyone who’s tired of hand-waving around “customer loyalty” and wants real insights from Net Promoter Score (NPS) data in SurveyMonkey. Whether you’re on product, support, or just the person who got stuck with survey duty—let’s put those scores to work.
What’s NPS (and Why Bother)?
You probably know this already, but quick refresher: NPS (Net Promoter Score) is that “How likely are you to recommend us?” question, rated 0–10. It’s easy to run, but just staring at the number won’t magically tell you what to fix.
Why use NPS at all?
- It’s simple, and customers actually answer it.
- It’s a decent pulse check—but only if you dig into the “why,” not just the number.
- Stakeholders love a simple metric (even if they sometimes misuse it).
But—NPS isn’t gospel. It’s noisy, it can be gamed, and it won’t tell you exactly what’s broken. Treat it as a jumping-off point, not the final word.
Step 1: Get Your NPS Data Out of SurveyMonkey
First up: you need the data in hand, not just the summary chart.
If you’re using SurveyMonkey, here’s how to get what you need:
- Find Your Survey: Go to your SurveyMonkey dashboard and select your NPS survey.
- Export the Data: Click “Analyze Results.” Look for the “Export” button (usually top-right). Choose CSV or XLSX.
- Grab the Right Fields: Make sure you’re exporting:
  - The NPS score (the 0–10 answer)
  - Any open-ended follow-up (the “Why did you give that score?” question)
  - Any metadata (timestamps, customer segments, etc.)
Pro Tip: If you’re running regular NPS surveys, keep exports organized by date. It’ll save your sanity later.
Step 2: Calculate the Actual NPS (Don’t Just Trust the Summary)
SurveyMonkey will give you an NPS number, but double-check it—especially if you’re slicing data by segments later.
Here’s how NPS is calculated:
- Promoters: 9–10
- Passives: 7–8
- Detractors: 0–6
NPS = (% of Promoters) − (% of Detractors)
Quick sanity check:
- Count the number of responses in each group.
- Do the math yourself (Excel or Google Sheets is fine).
- If you’re segmenting by region, product, etc., repeat for each group.
Why bother? Sometimes filters or exports in SurveyMonkey can miss data, or you’ll spot weird outliers the dashboard doesn’t show.
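The math above takes only a few lines to script. Here’s a minimal Python sketch of the sanity check, assuming you’ve already pulled the 0–10 answers out of your export (the sample scores below are invented for illustration):

```python
def nps(scores):
    """NPS = (% promoters) - (% detractors), rounded to one decimal."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)   # 9-10
    detractors = sum(1 for s in scores if s <= 6)  # 0-6 (7-8 are passives)
    return round(100 * (promoters - detractors) / len(scores), 1)

# Made-up sample: 5 promoters, 2 passives, 3 detractors out of 10
scores = [10, 9, 9, 8, 7, 6, 3, 10, 0, 9]
print(nps(scores))  # → 20.0
```

If this number disagrees with the SurveyMonkey dashboard, check whether a filter is silently excluding responses.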
Step 3: Segment Your Results (This Is Where the Insights Hide)
A single NPS number is almost useless. Break it down by something that actually matters to your business.
Useful ways to segment:
- By customer type: New vs. long-term, paying vs. free
- By product line: If you offer more than one thing
- By region or market: Sometimes culture or language matters
- By channel: How they signed up or interact
How to do it:
- Use metadata from your survey (e.g., a hidden field for account type).
- Or, match NPS responses to your CRM/customer database if you can.
What to look for: Big gaps between groups. For example, if new users are all detractors but veterans are promoters, your onboarding probably stinks.
What not to do: Don’t go overboard slicing and dicing into tiny groups—you’ll just end up with noise.
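Per-segment NPS is just the same calculation grouped by a key. A sketch below—the segment labels and the minimum group size are assumptions; swap in whatever metadata your export actually has, and note how groups that are too small to trust get flagged instead of scored:

```python
from collections import defaultdict

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores), 1)

def nps_by_segment(responses, min_size=30):
    """responses: iterable of (segment, score) pairs.

    Groups smaller than min_size come back as None -- too noisy to read into.
    """
    groups = defaultdict(list)
    for segment, score in responses:
        groups[segment].append(score)
    return {
        seg: nps(scores) if len(scores) >= min_size else None
        for seg, scores in groups.items()
    }
```

Picking `min_size` is a judgment call; 30 is a common rule of thumb, not a law.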
Step 4: Dig Into the Open-Ended Feedback
This is the goldmine—if you actually read it.
How to approach it:
- Start by scanning all the comments. Look for themes. Don’t get distracted by that one angry outlier.
- Group feedback into buckets, like:
  - Product bugs or feature requests
  - Price complaints
  - Customer service praise or rants
  - “Just felt like giving a score, no comment” (ignore these)
Manual or automatic?
- If you have a few dozen responses, just read them and jot down tallies.
- If you have hundreds, consider a text analysis tool—or at least use search/find for keywords.
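If you go the keyword route, a rough tally is a few lines of Python. The buckets and keywords below are hypothetical—tune them to how your customers actually talk:

```python
from collections import Counter

# Hypothetical theme buckets; replace with your own product's vocabulary.
BUCKETS = {
    "bugs": ["bug", "crash", "broken", "error"],
    "pricing": ["price", "expensive", "cost"],
    "support": ["support", "service", "slow response"],
}

def bucket_comments(comments):
    """Count how many comments touch each theme (one comment can hit several)."""
    tally = Counter()
    for comment in comments:
        text = comment.lower()
        for bucket, keywords in BUCKETS.items():
            if any(kw in text for kw in keywords):
                tally[bucket] += 1
    return tally
```

This is deliberately crude—it will miss synonyms and sarcasm—but it’s enough to rank themes by frequency before you read the top buckets by hand.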
Don’t: Rely on word clouds. They’re flashy and mostly useless.
Do: Quote real feedback in your report. It helps decision-makers see the human side.
Step 5: Tie Scores and Comments to Business Actions
Now, connect the dots. High NPS but everyone complains about pricing? That’s a red flag. Low NPS with lots of “support was slow” comments? There’s your action item.
Ways to make it actionable:
- For each big theme, assign an owner (product, support, etc.).
- Prioritize issues by how often they come up—not just how loud the loudest customer is.
- Set a follow-up. Did you fix something? Run another NPS survey and see if it moves the needle.
What to ignore: Don’t chase every single feature request. Focus on issues that come up over and over (even if it’s not the “sexiest” problem).
Step 6: Present Your Findings (Without the Fluff)
Most NPS “reports” are just a number and a smiley face. You can do better.
- Show the breakdowns: Promoters, Passives, Detractors—by segment.
- Highlight real quotes: One from each group, if possible.
- Summarize top themes: Not every comment, just the big buckets.
- Recommend actions: Be clear. “Fix onboarding flow” beats “Improve customer experience.”
Skip: Slides full of charts nobody reads. One page is usually enough.
Step 7: Track Trends Over Time—But Don’t Obsess
A single NPS snapshot is just that—a moment in time. Trends are more useful, but don’t get caught up in small wiggles.
- Compare periods (month to month, quarter to quarter).
- Watch for big swings, not tiny bumps.
- If you made a change (launched a new feature, changed pricing), see if NPS or the feedback shifted.
Caveat: NPS can bounce around for random reasons (seasonality, who answered that week, etc.). Don’t let one bad survey send you into a panic.
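One way to watch trends without obsessing is to compute NPS per month alongside the response count, so you can see when a “swing” is really just a handful of answers. A sketch, assuming your export has an ISO-style date column (`YYYY-MM-DD`):

```python
from collections import defaultdict

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores), 1)

def monthly_nps(responses):
    """responses: iterable of (date_string, score). Returns {month: (nps, n)}."""
    by_month = defaultdict(list)
    for date, score in responses:
        by_month[date[:7]].append(score)  # 'YYYY-MM' bucket
    return {month: (nps(s), len(s)) for month, s in sorted(by_month.items())}
```

Reporting `n` next to each score is the cheap insurance here: a 40-point “drop” on 5 responses is noise, not news.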
What Works, What Doesn’t, and What to Skip
Works:
- Segmenting by real business factors.
- Reading (and categorizing) open-ended responses.
- Using NPS as a trigger for action, not just a KPI.
Doesn’t:
- Treating NPS as the One Metric That Matters. It’s not.
- Overanalyzing small changes or tiny groups.
- Chasing every suggestion.
Skip:
- Word clouds.
- Overly complex dashboards.
- Comparing your score to random industry “benchmarks” (these are often bogus).
Keep It Simple—Then Iterate
Analyzing NPS isn’t rocket science, but it’s easy to overcomplicate. Focus on finding trends, listening to real feedback, and acting on what matters. Then, run another survey a few months later and see if you actually moved the needle.
Keep it honest, keep it practical, and remember—no amount of PowerPoint polish will fix a broken product or bad service. Start small, stay skeptical, and let the real customer voice drive your next move.