If you're sending cold emails and not getting replies, it's easy to blame your list, your offer, or just bad luck. But let's be honest: most folks aren’t testing their emails the right way, and that's why conversions stall. If you use Emelia for your cold outreach, this guide is for you. You'll learn how to actually test what works, spot real improvements (not just noise), and skip the rookie mistakes that waste time.
Why A/B Test Cold Emails, Anyway?
Cold email is a game of small wins. A subject line that bumps open rates by 3% can mean dozens more conversations. The trick is knowing why something works—not just guessing, then moving on. That’s where A/B testing comes in. But here’s the catch: most people either test the wrong thing, don’t wait for enough data, or get misled by lucky streaks.
Done right, A/B testing in Emelia is the closest thing to a “cheat code” for learning what actually gets replies (and what just clogs inboxes).
Step 1: Set Your Goal—And Be Specific
Before you even touch Emelia’s split testing features, nail down what you’re trying to improve. Vague goals lead to vague results.
- Open rates: Test subject lines and preview text.
- Reply rates: Test body copy, call-to-action (CTA), and personalization.
- Positive replies or meetings booked: Test value props, follow-up timing, and offer clarity.
Pro tip: Pick one metric to focus on at a time. If you try to boost everything, you’ll learn nothing.
Step 2: Design Your Variations (Don’t Change Everything at Once)
Here’s the biggest mistake: changing subject, intro, CTA, and signature all at once—then wondering why variation B “won.” Was it the subject line or the new sign-off? You’ll never know.
- Test one variable at a time: Only change the subject line, or just the opening sentence, for a single experiment.
- Keep everything else the same: No sneaky changes to tone, formatting, or offer.
- Write versions you actually believe in: Don’t A/B test a “bad” email just to see if it flops.
What to ignore: Wildly different templates. You’re looking for small, measurable improvements—not a total rebrand every week.
Step 3: Set Up the Test in Emelia
Emelia makes it simple to split test, but you need to think ahead. Here’s how to do it right:
- Create two (or more) versions within the same campaign.
  - For subject line tests, edit only the subject.
  - For body copy tests, keep the subject and everything else the same.
- Confirm your sending settings.
  - Use the same sender account(s) for all variations.
  - Don't mix lists or segments unless you want to muddy the results.
- Double-check for errors.
  - Make sure merge fields/personalization tokens work in both versions.
  - Send yourself a preview of each.
Pro tip: Label your variations clearly in Emelia—“Subject A: Direct” vs. “Subject B: Curiosity”—so you don’t forget which is which two weeks from now.
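Emelia assigns contacts to your variations for you, but if you ever need to split a list yourself (say, a pre-segmented CSV before import), a seeded random shuffle keeps the split unbiased and reproducible. This is a generic sketch, not Emelia's internal logic; the contact list and seed are made up:

```python
import random

def split_variations(contacts, n_variations=2, seed=42):
    """Randomly split a contact list into equal-ish groups, one per variation.
    A fixed seed makes the split reproducible if you re-run the script."""
    shuffled = contacts[:]  # copy so the original order is untouched
    random.Random(seed).shuffle(shuffled)
    return [shuffled[i::n_variations] for i in range(n_variations)]

contacts = [f"prospect{i}@example.com" for i in range(10)]
group_a, group_b = split_variations(contacts)
print(len(group_a), len(group_b))  # 5 5
```

Randomizing the split matters: if variation A gets the first half of an alphabetized list and B gets the second, any "win" might just reflect who's in each half.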
Step 4: Decide on Sample Size Before You Hit Send
Don’t trust results from just 20 or 50 emails. Cold outreach is noisy. A single “yes” can swing percentages wildly when your numbers are small.
- Aim for at least 100 sends per variation for anything meaningful.
- If your lists are small, run the test for multiple cycles before declaring a winner.
- Don’t stop the test early just because one version “looks good” on day two.
What doesn’t work: Declaring victory after a handful of opens or replies. That’s just luck, not learning.
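To see why 100 sends per variation is a floor rather than a guarantee, you can estimate the sample you'd need with a standard two-proportion power calculation. A rough sketch, using standard z-values for ~95% confidence and ~80% power (the baseline reply rate and target lift below are illustrative):

```python
import math

def sample_size_per_variation(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Approximate sends needed per variation to reliably detect a jump
    from reply rate p1 to p2 (~95% confidence, ~80% power)."""
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Detecting a reply-rate lift from 5% to 10% (a big, obvious win):
print(sample_size_per_variation(0.05, 0.10))  # ~434 sends per variation
```

Even a dramatic 5% to 10% jump takes hundreds of sends per variation to confirm; smaller lifts take thousands. That's why day-two "winners" are usually noise.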
Step 5: Watch the Right Metrics (And Don’t Get Distracted)
Emelia gives you a lot of numbers. Here's what to care about:
- For subject line tests: Open rate is king, but check deliverability too. A clever subject line that gets you flagged as spam is a fail.
- For body copy/CTA: Reply rate (or “positive reply rate,” if you’re tracking that) matters most.
- Watch for outliers: One big lead or a weird bounce can skew tiny tests.
Ignore: Vanity metrics like “delivered” or “clicked” (unless your CTA is a link). Focus on the stat that matches your goal from Step 1.
Step 6: Analyze Results Honestly
Once your sample size is decent, pull up Emelia’s stats and compare. But don’t just pick “the winner” and move on.
- Is the difference real? If variation A got 13 replies and B got 11 from the same number of sends, that's probably just noise. Look for clear gaps: ideally a relative difference of 20% or more.
- Did you get enough data? If you’re not sure, run the test longer or combine with the next batch.
- Check for weird patterns: Did all the positive replies come from one company or domain? That’s not a trend, it’s just a fluke.
Pro tip: Keep a simple spreadsheet or doc with your tests and results. This is your “what actually worked” journal. You’ll avoid repeating old mistakes.
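The "is the difference real?" check can be made concrete with a standard two-proportion z-test, using nothing but Python's standard library. The reply counts below mirror the 13-vs-11 example and are illustrative:

```python
import math

def two_proportion_p_value(replies_a, sends_a, replies_b, sends_b):
    """Two-sided p-value for the difference between two reply rates.
    A value above ~0.05 means the gap could easily be chance."""
    p_a, p_b = replies_a / sends_a, replies_b / sends_b
    p_pool = (replies_a + replies_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# 13 vs 11 replies out of 200 sends each: nowhere near significant.
print(round(two_proportion_p_value(13, 200, 11, 200), 2))  # ~0.67
```

A p-value around 0.67 means a gap that size would show up by pure chance most of the time, which is exactly why 13-vs-11 tells you nothing. Treat this as a sanity check, not a verdict: it assumes each send is independent, which the "same company or domain" fluke above can quietly break.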
Step 7: Roll Out the Winner—But Keep Testing
If you see a real improvement, make the winning version your new default. But don’t stop there. Cold email is a moving target—what works today can flop next month.
- Test a new tweak next campaign: Small, focused changes win over time.
- Recycle your old losers: Sometimes a version that failed in January works in June (markets change, trends change).
- Don’t chase tiny improvements forever: If two versions perform about the same, move on to a new variable.
Honest Takes: What Works, What Doesn’t
What Actually Moves the Needle
- Personalization: Using the prospect’s name, company, or a recent event reliably boosts replies. But make it real—generic “I love your work at [Company]” lines don’t fool anyone.
- Clear, direct CTAs: Don’t be coy. If you want a call, ask for one. “Would you be open to a quick chat this week?” beats “Let me know your thoughts.”
- Shorter is better: Wall-of-text emails get ignored. Keep it tight.
What Usually Doesn’t Help (Despite the Hype)
- Overly clever subject lines: “Quick Question” is overused and gets ignored. “Re:” in subject lines when there’s no thread is just spammy.
- Excessive follow-ups: More than 3-4 follow-ups starts to annoy people and can hurt your sender reputation.
- Heavy HTML or images: Cold emails work best as plain text. Fancy designs trigger spam filters.
Common Pitfalls to Avoid
- Testing too many things at once: You’ll never know what actually worked.
- Getting impatient: Good tests take time and enough volume.
- Forgetting to measure what matters: If your goal is meetings booked, don’t obsess over open rates.
- Not documenting results: You’ll end up repeating failed tests if you don’t keep notes.
Wrapping Up: Keep It Simple, Stay Curious
A/B testing cold emails in Emelia doesn’t need to be complicated. Pick one thing to test, keep your changes focused, and don’t get distracted by hype or “email hacks.” Most breakthroughs come from small, steady tweaks and learning from your own numbers—not chasing the latest trend.
Stick with it, document what you learn, and remember: the only bad test is the one you never run. Good luck out there.