If you’re sending outbound emails and just hoping for the best, you’re leaving results—and maybe your sanity—on the table. AB testing is the easiest way to figure out what actually works, but if you’ve never set it up, it can feel like a hassle. This guide is for anyone using Hothawk who wants to run real AB tests on their outbound emails, get usable data, and skip the usual headaches.
Why bother with AB testing in outbound emails?
Let’s be blunt: Most outbound emails don’t get read, let alone answered. AB testing lets you stop guessing which subject lines, copy, or calls to action actually move the needle. With a little setup, you’ll know what works for your audience—instead of just copying what some “guru” said on LinkedIn.
But, here’s the catch: AB testing only helps if you do it right and actually track the results. Otherwise, you’re just adding busywork.
Step 1: Get clear on what you want to test
Don’t try to test everything at once. AB testing works best when you isolate one variable—otherwise, you won’t know what made the difference.
Common things worth testing:
- Subject line
- Email opening line or hook
- Call to action (CTA)
- Length or tone of the email
- Personalization (e.g., "Hi, John" vs. "Hey there")
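To make "one variable" concrete, here's what a clean test pair looks like, sketched as plain Python data. The field names here are just for illustration, not Hothawk's actual variant fields:

```python
# A clean one-variable test: the two variants are identical except for the
# subject line. These field names are illustrative, not Hothawk's actual API.
variant_a = {
    "name": "Subject: Quick Question",
    "subject": "Quick question",
    "body": "Hi {{first_name}}, noticed your team is growing...",
}
variant_b = {
    "name": "Subject: Thought You'd Find This Useful",
    "subject": "Thought you'd find this useful",
    "body": variant_a["body"],  # same body on purpose: only the subject changes
}
```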
What to skip: Don’t bother testing background colors, fancy formatting, or your email signature. Nobody’s replying because of your signature.
Pro tip: If you’re new to this, start with the subject line. It’s the first (and sometimes only) thing people see.
Step 2: Set up your AB test in Hothawk
Assuming you already have a Hothawk account with outbound email campaigns ready, here’s how to set up an AB test that’s not a total mess:
2.1. Create your variants
- In your campaign dashboard, look for the option to “Add Variant” or similar. Hothawk’s UI changes from time to time, but it’s usually near the email editor.
- Write Version A (your control) and Version B (your challenger). Only change one thing between them—otherwise, the results are noise.
- Name your variants clearly. Don’t settle for “Email 1” and “Email 2.” Go with “Subject: Quick Question” vs. “Subject: Thought You’d Find This Useful.”
2.2. Choose your audience split
- Decide how much of your audience sees each version. The default is usually a 50/50 split, which is fine for most cases.
- If you have a huge list, you can get away with smaller test groups, but if you’re dealing with a few hundred contacts, just split it evenly.
- Hothawk will randomize who gets which email—don’t try to handpick segments unless you have a specific reason.
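Hothawk randomizes the split for you, but if you ever need to pre-split a list yourself (say, for a tool that doesn't, or to carve out a holdout group), a shuffle and a cut down the middle is all it takes. A minimal sketch:

```python
# Random 50/50 split of a contact list. The shuffle is what keeps the groups
# unbiased: never just cut an alphabetically sorted list down the middle.
import random

contacts = [f"person{i}@example.com" for i in range(400)]  # stand-in list

random.seed(42)          # optional: makes the split reproducible
shuffled = contacts[:]   # copy so the original order stays intact
random.shuffle(shuffled)

midpoint = len(shuffled) // 2
group_a, group_b = shuffled[:midpoint], shuffled[midpoint:]

print(len(group_a), len(group_b))  # 200 200
```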
2.3. Set up tracking
- Make sure open, click, and reply tracking are enabled. This should be on by default, but double-check. Otherwise, you’ll have nothing to measure.
- If you’re testing replies, set up a dedicated inbox or use tags in Hothawk to track responses. Clicks are easier—just use tracked links.
- Don’t obsess over open rates alone. Privacy features like Apple’s Mail Privacy Protection preload tracking pixels, so an “open” gets logged whether or not anyone actually read the email. Replies and clicks are better success metrics for outbound.
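If you're building your own tracked links rather than leaning on Hothawk's built-in click tracking, tagging each variant's URL is enough to tell the clicks apart later. Here's a sketch using the common UTM parameter convention (the parameter names are the usual analytics defaults, not anything Hothawk-specific):

```python
# Tag each variant's links so clicks stay attributable after the send.
# utm_campaign / utm_content are the common analytics convention; any stable
# parameter your analytics tool reads would do.
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def tag_link(url: str, campaign: str, variant: str) -> str:
    """Append campaign/variant parameters to a URL, keeping any existing ones."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({"utm_campaign": campaign, "utm_content": variant})
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_link("https://example.com/demo", "q3-outbound", "variant-b"))
# https://example.com/demo?utm_campaign=q3-outbound&utm_content=variant-b
```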
Heads up: If you’re sending to a small list, results will be noisy. Don’t declare a “winner” after 10 sends.
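Don't take that on faith, either; a quick simulation shows it. With a true reply rate of a flat 5% for both variants (a made-up number for illustration), batches of 10 sends swing all over the place while batches of 200 settle down:

```python
# Simulate observed reply rates when the TRUE rate is 5% for both variants.
# The wild swings at small n are exactly what fake "winners" are made of.
import random

random.seed(1)

def observed_rate(n_sends: int, true_rate: float = 0.05) -> float:
    replies = sum(random.random() < true_rate for _ in range(n_sends))
    return replies / n_sends

for n in (10, 50, 200):
    rates = [f"{observed_rate(n):.0%}" for _ in range(5)]
    print(f"{n:>3} sends per batch: {rates}")
# At 10 sends you can see anything from 0% to 20%+ by pure chance.
```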
Step 3: Launch your test and let it run
- Send your campaign as usual. Hothawk will handle the variant distribution automatically.
- Don’t peek at the results too early. Let enough emails go out before making any calls. For small lists, aim for at least 50–100 sends per variant if you can.
- Resist the urge to “help” a variant win by resending or tweaking mid-test. That just muddies the data.
What to ignore: Don’t get distracted by individual outliers (“Wow, this one person replied to Version B!”). You’re looking for trends, not anecdotes.
Step 4: Track and analyze results
- After your test has run (a few days to a week, depending on send volume), head to the Analytics or Reporting section in Hothawk.
- Compare the key stats for each variant:
- Open rate: Useful, but take with a grain of salt.
- Click rate: Good for CTAs with links.
- Reply rate: The gold standard for outbound. If you get replies, you’re doing something right.
- Hothawk will usually show you the winner, but poke at the raw numbers yourself. If Version B got 2 more clicks on a sample of 200, that’s not a slam dunk.
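If you want to put a number on that, a quick significance check settles it. Here's a sketch using Fisher's exact test from SciPy on that exact "2 more clicks out of 200" scenario (the click counts are invented to match the example above):

```python
# Is "2 more clicks out of 200" a real difference? Fisher's exact test is a
# simple way to check, and it behaves well with small counts.
from scipy.stats import fisher_exact

clicks_a, sends_a = 10, 200  # Version A: 10 clicks out of 200 sends
clicks_b, sends_b = 12, 200  # Version B: 12 clicks out of 200 sends

table = [
    [clicks_a, sends_a - clicks_a],  # [clicked, didn't click] for A
    [clicks_b, sends_b - clicks_b],  # [clicked, didn't click] for B
]
_, p_value = fisher_exact(table)

print(f"A: {clicks_a / sends_a:.1%}  B: {clicks_b / sends_b:.1%}")
print(f"p-value: {p_value:.2f}")  # well above 0.05 here, i.e. not a real winner
```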
How to know if it “worked”
- Look for meaningful differences, not just tiny changes.
- If the winner is obvious (e.g., Version A gets double the replies), that’s your new default.
- If it’s a toss-up, don’t overthink it—try a new test with a bigger difference next time.
Pro tip: Document your results somewhere outside of Hothawk, too. A simple spreadsheet is fine. You’ll thank yourself later when you want to see what you’ve already tried.
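If you want a starting point for that log, here's a minimal sketch that appends each finished test to a CSV you can open in any spreadsheet app. The file name and columns are just suggestions:

```python
# A minimal AB test log as a CSV. File name and columns are suggestions only.
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("ab_test_log.csv")

def log_test(variable, version_a, version_b, sends_per_variant, winner, notes=""):
    """Append one finished test to the log, writing a header row on first use."""
    is_new = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["date", "variable", "version_a", "version_b",
                             "sends_per_variant", "winner", "notes"])
        writer.writerow([date.today().isoformat(), variable, version_a,
                         version_b, sends_per_variant, winner, notes])

log_test("subject line", "Quick Question", "Thought You'd Find This Useful",
         200, "A", "A got roughly double the replies")
```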
Step 5: Roll out your winner (and keep testing)
- Once you have a clear winner, make it your standard for future sends.
- Don’t stop after one test. The best teams run small, ongoing tests instead of running one big experiment and calling it done.
- Over time, patterns will jump out. Maybe your audience loves blunt subject lines, or your best CTAs are always under seven words.
What not to do: Don’t run endless tests just for the sake of it. If you find a version that clearly outperforms the rest, use it until results dip.
Honest take: What works, what doesn’t, what to avoid
- Works: Subject line tests, short vs. long copy, different CTAs.
- Doesn’t work: Overcomplicating the test. If you’re changing three things at once, you’ll never know what mattered.
- To avoid: Chasing vanity metrics. Open rates are nice, but if nobody replies or clicks, who cares? Focus on real engagement.
Pro tips for better AB tests in Hothawk
- Keep your tests simple. One change at a time.
- Don’t chase statistical significance unless you’re sending thousands. For most outbound, a clear pattern is enough.
- Check deliverability. If one variant tanks, make sure it’s not getting caught in spam.
- Ignore the hype. There’s no “magic subject line” that works for everyone. Test with your actual audience.
Wrapping up: Start simple, don’t overthink it
AB testing in outbound email isn’t rocket science, but it’s easy to overcomplicate. Pick one thing to test, set it up in Hothawk, and pay attention to actual replies and clicks. Keep a record, repeat what works, and don’t let perfect be the enemy of done. The more you test, the better your emails will get—without needing a PhD in statistics.