If you're sending cold emails or LinkedIn messages and just guessing what works, you're leaving money on the table. This guide is for founders, marketers, and SDRs who want to actually know which outreach messages land—and who are tired of spray-and-pray tactics. We'll walk through how to A/B test outreach sequences using Supersend, what to pay attention to, and what to ignore so you don't waste time on vanity metrics.
Why Bother A/B Testing Outreach?
Let's be honest: most outreach fails. Not because the product's bad, but because the messaging misses the mark. A/B testing lets you put real-world numbers behind your hunches about what works. If you're serious about go-to-market (GTM) strategy, you need to ditch the gut feeling and start experimenting.
A/B testing outreach isn't magic. It's just a way to get clear, unbiased feedback on your messaging, timing, and targeting. In Supersend, it's pretty straightforward—but only if you do it right.
Step 1: Set a Clear Goal Before You Touch a Tool
Don't just “test stuff.” Decide what you actually want to learn. Are you trying to:
- Improve reply rates?
- Get more meetings booked?
- Reduce bounce or unsubscribe rates?
- Test a new value proposition?
Pick one main metric. If you try to optimize everything at once, you’ll end up learning nothing.
Pro tip: For sales outreach, replies are usually a better signal than opens. Opens can be misleading (thanks, Apple Mail Privacy).
Step 2: Segment Your Audience (Don't Skip This)
You can’t run a fair test if you’re blasting different kinds of leads with the same message. Before you jump into Supersend, clean up your list:
- Group prospects by relevant traits (industry, job title, company size, etc.)
- Remove obvious mismatches (e.g., students when you sell to VPs)
- Make sure each segment is big enough to give you meaningful results (more on this later)
If your list is tiny, don’t expect statistically significant results—just look for big, obvious differences.
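As a rough sanity check on "big enough," here's a back-of-envelope calculation using the standard two-proportion power formula. This is a sketch, not anything Supersend provides; the function name and the 5%-to-10% example rates are illustrative:

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p_a: float, p_b: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate prospects needed per variant to detect a reply-rate
    move from p_a to p_b (two-sided test, normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p_a + p_b) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p_a * (1 - p_a) + p_b * (1 - p_b))) ** 2
    return math.ceil(numerator / (p_a - p_b) ** 2)

# Detecting a jump from a 5% to a 10% reply rate takes a few hundred
# prospects per variant; smaller lifts need far more.
print(sample_size_per_variant(0.05, 0.10))
```

The takeaway: small lifts need big lists, which is why a tiny list should only chase big, obvious differences.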
Step 3: Set Up Your Outreach Sequences in Supersend
Supersend lets you create and schedule automated outreach sequences across email, LinkedIn, and more. Here’s how to get your A/B test live:
1. Draft Two Variants.
   - Make one change at a time between Version A and Version B. (Subject line, call-to-action, intro line—pick one!)
   - Don't test ten things at once, or you won't know what moved the needle.
2. Create Two Sequences.
   - In Supersend, duplicate your sequence or create two separate ones: "Sequence A" and "Sequence B."
   - Make sure the only difference is the thing you're testing.
3. Split Your Audience.
   - Divide your contact list randomly into two groups.
   - Assign Group 1 to Sequence A, Group 2 to Sequence B.
   - Supersend makes this easy, but double-check you don't have bias (like all the CEOs in one group and interns in another).
4. Set Your Sending Schedule.
   - Keep send times and days consistent for both groups.
   - If you send one batch on Monday morning and the other on Saturday night, results will be skewed.
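If you want a reproducible random split before uploading your groups, it's a few lines of Python. A sketch only; the contact-dict shape here is an example, not Supersend's data model:

```python
import random

def split_audience(contacts: list, seed: int = 42) -> tuple:
    """Shuffle the contact list and split it into two halves.
    A fixed seed makes the split reproducible."""
    shuffled = contacts[:]  # copy so we don't mutate the caller's list
    random.Random(seed).shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

contacts = [{"email": f"lead{i}@example.com"} for i in range(200)]
group_a, group_b = split_audience(contacts)
print(len(group_a), len(group_b))  # 100 100
```

For small or lopsided lists, consider splitting within each segment (e.g., shuffle CEOs and interns separately) so seniority doesn't cluster in one group by chance.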
What to ignore: Don’t obsess over tiny formatting differences or emojis. Focus on core message changes.
Step 4: Run the Test (and Don’t Chicken Out Early)
Now, let it run. The urge to peek at early numbers can be strong—resist it. Give your test enough time and volume:
- Sample size matters: For cold email, aim for at least 100-200 prospects per variant (if you can).
- Timing: Run the test for a full cycle (at least a week, ideally two), depending on your cadence.
- Don’t tweak mid-test: Changing things midway ruins the experiment.
If your list is small, look for ~20-30 responses per variant before calling a winner. If you get 3 replies on A and 2 on B, you don’t have enough data to say anything useful.
Step 5: Measure What Actually Matters
Supersend gives you lots of stats. Most don’t matter. Focus on:
- Reply rate: Did people actually respond?
- Positive reply rate: Did you get meaningful conversations, not just “unsubscribe” or “not interested”?
- Meeting booked rate: If that’s your goal, track it.
What to ignore:
- Open rate: With privacy changes, it's unreliable.
- Click rate: Only matters if your CTA is a link, and even then, replies are usually a better signal.
Pro tip: Download your results as a CSV and do the math yourself if you don’t trust the tool (I wouldn’t blame you).
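If you do pull the CSV, the math is short. This sketch assumes hypothetical column names (`sequence` and `replied`), since the actual export format isn't specified here; adjust to whatever your file contains:

```python
import csv
from collections import Counter

def reply_rates(csv_path: str) -> dict:
    """Compute reply rate per sequence from an exported results CSV.
    Assumes columns named 'sequence' and 'replied' (hypothetical names)."""
    sent = Counter()
    replied = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            seq = row["sequence"]
            sent[seq] += 1
            if row["replied"].strip().lower() in {"yes", "true", "1"}:
                replied[seq] += 1
    return {seq: replied[seq] / sent[seq] for seq in sent}
```

Run it on your export and compare the two sequences directly, no dashboard required.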
Step 6: Analyze and Decide (Don’t Overthink)
Look for clear winners, not marginal differences. If A gets 12% replies and B gets 13%, that’s probably noise. But if A gets 8% and B gets 20%, you’ve got something.
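Whether a gap like that is noise can be checked with a pooled two-proportion z-test. A minimal stdlib sketch; the counts below are illustrative, not real campaign data:

```python
from statistics import NormalDist

def two_proportion_p_value(replies_a: int, sent_a: int,
                           replies_b: int, sent_b: int) -> float:
    """Two-sided p-value for the difference between two reply rates
    (pooled two-proportion z-test, normal approximation)."""
    p_a, p_b = replies_a / sent_a, replies_b / sent_b
    p_pool = (replies_a + replies_b) / (sent_a + sent_b)
    se = (p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b)) ** 0.5
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# 12% vs 13% on 200 sends each: large p-value, likely noise
print(two_proportion_p_value(24, 200, 26, 200))
# 8% vs 20% on 200 sends each: tiny p-value, a real difference
print(two_proportion_p_value(16, 200, 40, 200))
```

A p-value above ~0.05 means the gap could easily be chance; well below it, you've probably got a genuine winner.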
Ask yourself:
- Is the winning variant meaningful enough to roll out?
- Did you learn something actionable, or just get noise?
- Are there unexpected patterns (e.g., one variant worked way better for a certain industry)?
Don’t: Announce victory based on tiny differences, or keep testing just for the sake of testing.
Step 7: Iterate—But Don’t Chase Your Tail
If you get a clear winner, roll it out and move on to the next thing to test. Don’t fall into the trap of endless tweaking. Outreach is about consistent improvement, not chasing perfection.
If your test was inconclusive, pick a bigger difference for your next test. Sometimes it takes a few rounds to see what actually matters.
What Works (and What Doesn’t) in Real Life
Works:
- Testing radically different value props or CTAs
- Personalizing intro lines vs. generic intros
- Testing softer asks ("open to chatting?") vs. hard CTAs ("book a call here")
Doesn't work:
- Tweaking one word in your subject line hoping for miracles
- Obsessing over send time (unless your audience is in wildly different time zones)
- Testing stuff you can't scale (hyper-personalization you can't repeat at volume)
Common Mistakes to Avoid
- Changing too many things at once: You won’t know what worked.
- Ending the test early: One or two responses aren’t enough.
- Ignoring audience bias: Make sure groups are random and similar.
- Chasing open rates: Replies and meetings are where the action is.
Wrapping Up: Keep It Simple, Keep Moving
A/B testing outreach in Supersend isn’t rocket science, but it does take discipline. Pick one thing to test, measure real results, and don’t get distracted by shiny stats. The real winners are the folks who iterate, not the ones who overthink.
Start small, keep it simple, and let the data tell you what works. Then do it again. That’s how you actually optimize your GTM strategy—no hype, just results.