How to use Walnut analytics to identify high-converting demo elements

If you’re reading this, you probably spend way too much time wondering if your product demos are actually working, or if prospects are just zoning out halfway through. You want to know which parts of your demo make people buy, and which parts make them click away. This guide is for anyone running sales demos with Walnut who’s tired of guessing what’s working.

We’ll walk through, step by step, how to use Walnut’s analytics to find out which demo elements turn browsers into buyers. You’ll get real talk about what’s useful, what’s just noise, and how to actually use the data to improve your demos. No fluff—just what you need to know.


Step 1: Get Familiar With What Walnut Tracks

Before you dive in, take a minute to understand what Walnut’s analytics actually measure. Here’s what you’ll see (at least as of early 2024):

  • Demo views: Who watched, when, and how many times.
  • Step-by-step drop-off: Where viewers stop engaging or exit entirely.
  • Time spent per section: How long folks pause on each part of your demo.
  • Button and element clicks: Which interactive elements get attention.
  • Conversion events: If you’ve set up triggers (like “Book a call” or “Request access”), Walnut can track when these happen.

What to ignore: Don’t get distracted by “vanity metrics” like total views. If 100 people start your demo but never finish, that’s not a win.

Pro Tip: Set up your demo so each key feature or value prop is its own step or section. That way, you can tell exactly where people pay attention—or tune out.


Step 2: Set Clear Goals for Your Demo

Don’t bother sifting through analytics if you don’t know what you’re looking for. Ask yourself:

  • What does a “conversion” mean for this demo? (Booking a meeting, signing up, requesting a trial?)
  • Which features are must-see for your ideal customer?
  • Are there demo sections you think are important but aren’t sure about?

Jot down these goals and keep them handy. Otherwise, you’ll end up chasing random numbers instead of useful insights.


Step 3: Map Demo Steps to Measurable Elements

Walnut works best if your demo is structured logically. Go through your demo and map out:

  • Each step or screen (e.g., “Dashboard overview,” “Reporting feature,” “Integrations page”)
  • Key interactive elements (e.g., “Try it now” button, “Upload file” feature)
  • The conversion trigger (where you want the viewer to take action)

Why bother? Because vague demo steps make for vague analytics. The more clearly you can connect a demo step to a real product benefit, the easier it’ll be to spot what actually converts.
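If it helps to make this concrete, you can write the map down as plain data before you ever open the dashboard. This is just a sketch; the step names, element names, and field names below are illustrative, not anything Walnut prescribes.

```python
# Illustrative sketch: record each demo step, its key interactive elements,
# and whether it's the conversion trigger. All names here are made up.
DEMO_MAP = [
    {"step": "Dashboard overview", "elements": ["date-range picker"], "is_conversion": False},
    {"step": "Reporting feature",  "elements": ["Export to PDF"],     "is_conversion": False},
    {"step": "Book a call",        "elements": ["Book a call button"], "is_conversion": True},
]

# Which step(s) should the analytics treat as the goal action?
conversion_steps = [s["step"] for s in DEMO_MAP if s["is_conversion"]]
print(conversion_steps)
```

Having this written down means that when a number in the dashboard looks odd, you can immediately say which product benefit that step was supposed to demonstrate.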


Step 4: Pull Up the Analytics Dashboard

Time to get your hands dirty. In Walnut, open up the analytics dashboard for your demo. Focus on these reports:

  • Step-by-step drop-off: Shows where people leave or lose interest.
  • Time spent per section: Longer isn’t always better—sometimes it means confusion.
  • Click maps: Visual heatmaps of where viewers interact (and where they don’t).
  • Conversion event tracking: Who actually completed the goal action.

Don’t try to look at everything at once. Start with high-level patterns:

  • Where are the biggest drop-offs?
  • Which sections do people linger on before converting?
  • Are there parts people rush through (or skip entirely)?
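If you'd rather sanity-check the dashboard's drop-off numbers yourself (or Walnut lets you export raw view data), the math is simple: for each step, count how many viewers got at least that far. The field names below (viewer_id, step_index, step_name) are assumptions; adapt them to whatever your export actually contains.

```python
# Hypothetical sketch: per-step reach and drop-off from exported demo-view
# events. Field names are assumptions about the export format.

def step_dropoff(events):
    """Return [(step_name, viewers_reached, share_of_all_viewers), ...]."""
    furthest = {}   # viewer_id -> highest step index they reached
    names = {}      # step index -> step name
    for e in events:
        vid, idx = e["viewer_id"], e["step_index"]
        furthest[vid] = max(furthest.get(vid, 0), idx)
        names[idx] = e["step_name"]
    total = len(furthest)
    report = []
    for idx in sorted(names):
        reached = sum(1 for f in furthest.values() if f >= idx)
        report.append((names[idx], reached, reached / total))
    return report

events = [
    {"viewer_id": "a", "step_index": 1, "step_name": "Dashboard overview"},
    {"viewer_id": "a", "step_index": 2, "step_name": "Reporting feature"},
    {"viewer_id": "b", "step_index": 1, "step_name": "Dashboard overview"},
]
for name, reached, share in step_dropoff(events):
    print(f"{name}: {reached} of all viewers ({share:.0%})")
```

The step where the share drops most sharply between consecutive rows is your biggest leak.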

Honest take: Don’t panic if your “coolest” feature gets ignored. That’s feedback, not failure. Better to know now than keep pushing a dud.


Step 5: Identify High-Converting Demo Elements

Here’s what you’re actually looking for:

  • Sections where conversion rates spike: Maybe after showing a reporting dashboard, more people click “Book a call.”
  • Features that get lots of engaged clicks (and not just random pokes): If everyone tries your “Export to PDF” feature, that’s a signal.
  • Steps with low drop-off + high conversions: These are your money-makers.

What’s less useful:

  • Elements that get clicks but don’t lead to conversions. That might mean curiosity, not actual value.
  • Sections with high time spent but no follow-up action. Viewers could be confused, not impressed.

Example Patterns to Spot

  • If 70% of viewers drop off before reaching your “Integrations” slide, it’s not pulling its weight.
  • If most conversions happen after the “Automation” feature, that’s your star—consider moving it earlier in the demo.
  • If people spend a long time on one step but don’t click anything, your messaging might be unclear.
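One rough way to combine "low drop-off" and "high conversion" into a single ranking: weight each step's conversion rate by how many viewers actually reach it. The numbers below are made up; plug in figures read off your own Walnut dashboard.

```python
# Rough heuristic sketch: rank demo steps as "money-makers" by weighting
# conversion by reach. All per-step numbers here are invented examples.
steps = [
    # (name, share of viewers who reach it, share of those who later convert)
    ("Dashboard overview", 1.00, 0.08),
    ("Automation",         0.80, 0.25),
    ("Integrations",       0.30, 0.10),
]

def score(reach, conv):
    # A step that converts well but that nobody sees shouldn't win outright
    return reach * conv

ranked = sorted(steps, key=lambda s: score(s[1], s[2]), reverse=True)
for name, reach, conv in ranked:
    print(f"{name}: reach {reach:.0%}, converts {conv:.0%}, score {score(reach, conv):.3f}")
```

In this made-up example, "Automation" wins despite lower reach than the opening step, which is exactly the "consider moving it earlier" signal described above.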

Step 6: Test Changes (and Don’t Overthink It)

Once you’ve spotted your high and low performers, tweak your demo:

  • Move high-converting elements earlier to capture attention.
  • Cut or condense sections that cause drop-off.
  • Reword or clarify steps that confuse people (long pauses with no clicks are a red flag).
  • Add or highlight features that actually get engagement—not just the ones you’re proud of.

Pro Tip: Only change one or two things at a time. Otherwise, you won’t know what made the difference.

Let the new version run for a bit, then check the analytics again. Rinse and repeat. This isn’t glamorous, but it works.
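"Check the analytics again" is also where people fool themselves with small samples. A quick two-proportion z-test (standard library only) tells you whether the before/after difference is likely real or just noise. The counts below are invented; substitute your own.

```python
# Sketch: before/after check for a demo tweak using a two-proportion z-test.
# The conversion counts are made-up examples, not real Walnut data.
import math

def two_prop_z(conv_a, n_a, conv_b, n_b):
    """z statistic and two-sided p-value for conversions a vs b."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the normal CDF via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Old demo: 12 conversions from 200 views; new demo: 25 from 210 views
z, p = two_prop_z(12, 200, 25, 210)
print(f"z = {z:.2f}, p = {p:.3f}")
```

If p is well above 0.05, "one good week" is not yet evidence; let the new version keep running.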


Step 7: Share Insights With the Right People

Don’t keep this to yourself. Share what you find with your sales or product team:

  • Which demo sections drive action?
  • What features get ignored or misunderstood?
  • Where are people still getting stuck?

Skip the 20-slide PowerPoint. A couple bullet points with screenshots is plenty.

Why bother? This is how you get buy-in for cutting filler or doubling down on what works. It’s also how you avoid endless “gut feeling” arguments about what matters.


Step 8: Watch for False Positives (and Don’t Chase Every Metric)

Not every spike in the data means you’ve found a gold mine. A few things to watch out for:

  • Curiosity clicks: Just because people click something doesn’t mean it’s valuable. Look for clicks that lead to conversions, not just random poking around.
  • Bot traffic or internal testing: Filter out test views and internal users, or your numbers will get skewed.
  • Short-term bumps: Don’t declare victory after one good week—look for patterns over time.
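If Walnut's built-in filters don't cover your case, the same cleanup is easy to do on exported view data. A minimal sketch, assuming fields like viewer_email, is_test, and duration_seconds exist in the export (they may be named differently, or not exist at all):

```python
# Hypothetical sketch: strip internal and likely-bot views before computing
# metrics. Field names and the company domain are assumptions.
INTERNAL_DOMAINS = {"yourcompany.com"}  # replace with your own domain(s)

def is_real_prospect(view):
    email = view.get("viewer_email", "")
    domain = email.split("@")[-1].lower() if "@" in email else ""
    return (
        domain not in INTERNAL_DOMAINS          # drop internal users
        and not view.get("is_test", False)      # drop flagged test views
        and view.get("duration_seconds", 0) >= 5  # drop sub-5s bounce/bot hits
    )

views = [
    {"viewer_email": "buyer@prospect.io",   "duration_seconds": 120},
    {"viewer_email": "me@yourcompany.com",  "duration_seconds": 300},
    {"viewer_email": "bot@prospect.io",     "duration_seconds": 1},
]
real = [v for v in views if is_real_prospect(v)]
print(len(real))  # only the genuine prospect view survives
```

The 5-second threshold is an arbitrary starting point; tune it against what a real skim of your demo actually takes.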

Rule of thumb: If you can’t explain why an element is converting, dig deeper before making big changes.


Step 9: Automate What You Can, But Don’t Blindly Trust “AI Insights”

Walnut has started rolling out more automated “insight” features—think machine-generated suggestions about which demo elements matter. These can be helpful for spotting trends you might miss, but don’t turn your brain off. AI isn’t magic.

  • Use automated insights as a starting point, not the final word.
  • Always double-check if the “recommended changes” actually line up with your goals and customer feedback.

Pro Tip: The best insights still come from talking to real prospects. Use the analytics to guide those conversations, not replace them.


Step 10: Keep It Simple and Iterate

Don’t overcomplicate things. The goal is to find what works, double down on it, and trim the rest. Walnut’s analytics are powerful, but they’re just tools. Here’s what actually matters:

  • Structure your demo so it’s easy to measure.
  • Focus on conversion events, not vanity metrics.
  • Make small, testable changes.
  • Use the data to have smarter conversations with your team.

Most of all, don’t try to make your demo perfect on the first go. The best demos are always being tweaked, based on what actually works—not what you hope will work.

Bottom line: Keep it simple, iterate often, and trust the data over your pet theories. That’s how you turn demo insights into real results.