Let’s be honest: most “outbound email optimization” advice is either way too technical or so fluffy it’s useless. If you’re here, you want real, practical steps for running A/B tests that actually tell you what works—and what doesn’t—in your cold email outreach. This guide is for people using Lemlist, who want to get better results without wasting time on pointless experiments or endless dashboard tweaking.
Whether you’re doing sales, partnerships, or just trying to get more responses, here’s how to set up A/B tests in Lemlist, what’s actually worth testing, and which mistakes to skip.
Why Bother with A/B Testing in Lemlist?
If you’re still sending the same email to everyone, you’re leaving money (or at least replies) on the table. But not all A/B tests are worth your time. The right tests help you:
- Figure out what actually gets replies, not just opens.
- Stop guessing about subject lines and calls to action.
- Improve your results in a way that’s easy to measure.
Lemlist makes it pretty simple, but the real trick is knowing what’s worth testing and how to read the results honestly.
Step 1: Get Clear on What to Test—and What to Ignore
Before you touch Lemlist, decide what you actually want to learn. Don’t just test for the sake of testing.
The Good Stuff to Test
- Subject lines: The first thing people see.
- Opening line or first sentence: Sets the tone.
- Call to action: What you’re actually asking for.
- Email length: Short and sweet vs. a bit more detail.
- Personalization tokens: Does using their company name or a recent event help?
Don’t Bother Testing (At Least Not Yet)
- Tiny tweaks: Swapping a single word rarely moves the needle.
- Multiple changes at once: You won’t know which change mattered.
- Fancy HTML vs. plain text: Unless your audience expects design, stick to plain text.
Pro tip: Start with the subject line or first sentence—these usually make the biggest difference.
Step 2: Prepare Your Contact List
A/B testing means nothing if your test groups are wildly different. Garbage in, garbage out.
- Keep it apples-to-apples: Make sure your contact list is as similar as possible—same industry, job level, company size, etc.
- Don’t mix old and new leads in the same test: Fresh contacts behave differently.
- Aim for at least 50 contacts per version: anything fewer and you’ll get random noise instead of real insight.
If your list is small, focus on one simple test at a time and wait until you have enough data.
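If your list lives in a spreadsheet or CSV before it goes anywhere near Lemlist, a quick script can confirm you actually have 50+ contacts per variant and that the segment is reasonably uniform. Here’s a minimal sketch in Python, assuming a hypothetical contacts.csv export with industry and job_title columns (rename them to match your own file):

```python
import csv
from collections import Counter

MIN_PER_VARIANT = 50  # the rule of thumb above, not a Lemlist requirement

# Hypothetical export: one row per contact, with the fields you segment on.
with open("contacts.csv", newline="", encoding="utf-8") as f:
    contacts = list(csv.DictReader(f))

print(f"{len(contacts)} contacts total, "
      f"roughly {len(contacts) // 2} per variant on a 50/50 split")
if len(contacts) < 2 * MIN_PER_VARIANT:
    print("Not enough for a meaningful A/B test yet; run one version and wait for more leads.")

# Eyeball how uniform the segment is: one dominant industry and seniority is what you want.
for field in ("industry", "job_title"):
    counts = Counter((row.get(field) or "").strip().lower() for row in contacts)
    print(field, counts.most_common(5))
```

If the top industry only covers a third of your list, test inside one segment rather than across all of them.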
Step 3: Set Up Your Campaign in Lemlist
Now the hands-on part. Lemlist’s tools are straightforward, but the interface isn’t always self-explanatory.
1. Create a new campaign.
   - Click “Campaigns” in the left sidebar.
   - Hit “+ New campaign.”
2. Import your contact list.
   - Upload your CSV or sync with your CRM.
   - Double-check your mapping, especially the personalization fields (a quick pre-upload check is sketched below).
3. Write your email sequence.
   - Go to the “Emails” tab in your campaign.
   - Build your first email (Version A).
   - Don’t add follow-ups yet; test your first email before you complicate things.
4. Add a variation for A/B testing.
   - Click “Add a variation” under your first email step.
   - You’ll see Version A and Version B side by side.
   - Edit Version B with your alternate subject line, intro, or CTA (whatever you’re testing).
Note: Lemlist splits the contacts between the variations automatically. No need to fuss over this.
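Before you upload, it also pays to catch rows that will break personalization (the mapping check mentioned in the import step). A rough pre-upload sketch; the email, firstName, and companyName column names are just examples, so swap in whatever fields your templates actually use:

```python
import csv

REQUIRED_FIELDS = ["email", "firstName", "companyName"]  # example mapping; adjust to your template

with open("contacts.csv", newline="", encoding="utf-8") as f:
    reader = csv.DictReader(f)
    missing_cols = [c for c in REQUIRED_FIELDS if c not in (reader.fieldnames or [])]
    if missing_cols:
        raise SystemExit(f"CSV is missing columns: {missing_cols}")

    # Flag rows where a personalization field is blank -- these turn into "Hi ,"-style emails.
    for line_no, row in enumerate(reader, start=2):  # start=2 because line 1 is the header
        blanks = [c for c in REQUIRED_FIELDS if not (row.get(c) or "").strip()]
        if blanks:
            print(f"Row {line_no} ({row.get('email') or '?'}): empty {blanks}")
```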
Step 4: Set Your A/B Test Details
- Decide how many variants: Just stick to two at first (A and B). More versions = more confusion, less useful data.
- Check the split: Lemlist will split your contacts 50/50 by default. Don’t change this unless you have a good reason.
- Double-check your personalization: Preview both versions for a few contacts to catch any “Hi {{FirstName}},” fails.
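Lemlist’s preview handles this inside the app, but if you draft your copy elsewhere, a small script can flag unresolved or empty tokens before you paste anything in. A sketch with made-up variants and contacts; the {{firstName}} and {{companyName}} tokens here are illustrative examples, not a spec of Lemlist’s token syntax:

```python
import re

TOKEN = re.compile(r"\{\{(\w+)\}\}")

def render(template: str, contact: dict) -> str:
    """Fill {{field}} tokens; leave a loud marker for anything missing or blank."""
    return TOKEN.sub(lambda m: contact.get(m.group(1)) or f"<MISSING:{m.group(1)}>", template)

# Made-up variants -- only the opener differs, since we're testing one variable at a time.
version_a = "Hi {{firstName}}, saw that {{companyName}} is hiring SDRs..."
version_b = "Hi {{firstName}}, quick question about how {{companyName}} does outbound..."

sample_contacts = [
    {"firstName": "Dana", "companyName": "Acme"},
    {"firstName": "", "companyName": "Globex"},  # the kind of blank field you want to catch
]

for contact in sample_contacts:
    for label, template in (("A", version_a), ("B", version_b)):
        print(label, "->", render(template, contact))
```

Anything that prints with a <MISSING:...> marker is a contact you fix or drop before launch.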
Step 5: Launch and Monitor
1. Start your campaign.
   - Hit “Launch campaign.”
   - Emails will start sending according to your schedule.
2. Don’t peek too early: Wait until at least half your emails have been sent before you look at results. Early data is noisy and unreliable.
3. Track your key metrics (a quick calculation sketch follows this list):
   - Open rate: Did your subject line grab them?
   - Reply rate: The only metric that really matters.
   - Bounce rate: If it’s high, your test is pointless; fix your list first.
4. Ignore click rates unless your CTA is actually to click a link (most outbound isn’t).
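Once the sends finish, pull the per-variant counts from the campaign report and sanity-check the rates yourself. A tiny sketch with made-up numbers; substitute your own counts:

```python
# Made-up counts copied by hand from a campaign report; replace with your own.
variants = {
    "A": {"sent": 60, "opened": 33, "replied": 5, "bounced": 2},
    "B": {"sent": 60, "opened": 29, "replied": 9, "bounced": 1},
}

for name, v in variants.items():
    delivered = v["sent"] - v["bounced"]
    print(f"Version {name}: "
          f"open rate {v['opened'] / delivered:.0%}, "
          f"reply rate {v['replied'] / delivered:.0%}, "
          f"bounce rate {v['bounced'] / v['sent']:.0%}")
    if v["bounced"] / v["sent"] > 0.05:
        print("  Bounce rate over ~5%: clean the list before trusting this test.")
```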
Pro tip: Only one variable per test. Otherwise, you’ll never know what caused the change.
Step 6: Review Results Honestly
Here’s where most people mess up. Don’t declare victory because one email got two more replies than the other. Look for real patterns.
- Did one version get at least 30% more replies? If yes, you’ve probably found a winner.
- If results are close, it’s a tie: Don’t overthink it. Try something new next time.
- Check for weirdness: Did one version go out on a Friday afternoon or after a holiday? Timing can skew results.
If you’re not sure, run the test again with a new batch. One-off results can be misleading.
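One way to stay honest about “close” results is a quick two-proportion check on reply counts. Here’s a stdlib-only sketch with made-up numbers; it won’t make a small sample trustworthy, it just flags when a gap could easily be noise:

```python
from math import erfc, sqrt

def compare_reply_rates(replies_a: int, sent_a: int, replies_b: int, sent_b: int) -> None:
    """Print reply rates, relative lift of B over A, and a rough two-sided p-value."""
    rate_a, rate_b = replies_a / sent_a, replies_b / sent_b
    lift = (rate_b - rate_a) / rate_a if rate_a else float("inf")

    # Two-proportion z-test: how big is this gap relative to normal sampling noise?
    pooled = (replies_a + replies_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (rate_b - rate_a) / se if se else 0.0
    p_value = erfc(abs(z) / sqrt(2))  # two-sided

    print(f"A: {rate_a:.0%}  B: {rate_b:.0%}  lift: {lift:+.0%}  p-value: {p_value:.2f}")
    if p_value > 0.2:
        print("Well within noise at this sample size: call it a tie and retest.")

# Made-up example: 6 vs. 9 replies out of 60 each.
compare_reply_rates(6, 60, 9, 60)
```

In this example the lift looks like +50%, but the p-value comes out around 0.4, meaning a gap that size shows up by chance all the time at 60 sends per version. That’s exactly the “it’s a tie, rerun it” case above.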
Step 7: Roll Out the Winner and Iterate
Once you’ve found a clear winner:
- Make that version your new default.
- Now test something else—maybe a new CTA or a different opener.
- Don’t run a dozen tests at once. One at a time means you actually learn what’s working.
And seriously, keep your experiments simple. The more moving parts, the harder it is to know what’s working.
What Actually Matters (and What Doesn’t)
Worth Your Time
- Testing different angles: Subject lines, CTAs, and personalization.
- Keeping your list clean: No tool or test can save you from a bad list.
- Following up: If no one replies, try a second email—don’t hammer them with daily messages.
Not Worth Obsessing Over
- Tiny formatting tweaks: Italics vs. bold won’t change your results.
- Open rates (alone): High opens but no replies? Your message is off, not your subject.
- Vanity metrics: Don’t fall for the dashboard dopamine hit.
Pro tip: A/B testing is for learning, not just “winning.” Sometimes you’ll learn that both versions stink and you need a new approach.
Common Mistakes to Avoid
- Testing too many things at once: You’ll never know what worked.
- Too few contacts: Small sample = random results.
- Getting discouraged by a “losing” test: It’s just data. Most tests “fail.”
- Changing your test mid-campaign: Set it up, leave it alone, and review after.
Keep It Simple and Keep Moving
A/B testing in Lemlist isn’t magic, but it’s the fastest way to stop guessing and start improving your outbound emails. Start small, test one thing at a time, and don’t get lost in the weeds. Most “optimization” is just about trying new ideas, keeping what works, and ditching what doesn’t.
You don’t need to be a data scientist. Just be curious, be honest with your results, and keep iterating. That’s how you get more replies—and better conversations—in less time.