How to A/B test email content and analyze results in Braze

So, you want to run an A/B test on your emails in Braze. Maybe you’re trying to bump up your open rates, or maybe you just want proof that that “funny” subject line works better. Either way, you want real answers, not just pretty charts. This guide is for the folks who care about what actually moves the needle, not just checking the “we test things” box.

We’ll walk through how to set up an A/B test in Braze, what to watch out for, and how to actually make sense of your results. No hype. No vague advice. Just the steps, the honest pitfalls, and some keep-it-simple advice.


Step 1: Start with a Real Hypothesis

Before you even log into Braze, stop and ask: What are you actually trying to learn? Too many A/B tests are just “let’s see what happens if…” That’s fine for exploration, but if you want meaningful results, get specific.

A good hypothesis might look like:

  • “A shorter subject line will increase open rates by at least 2 percentage points”
  • “Adding a CTA button above the fold will get more clicks than one at the bottom”

Write down your hypothesis. If you can’t explain why you’re testing something, you probably don’t need to test it.

Pro Tip: Don’t test tiny changes unless you have a massive list. Subtle tweaks are usually swallowed by noise.
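
To put a number on “massive”: here is a rough sample-size check you can run before committing to a test. A minimal sketch in plain Python (standard library only) using the textbook two-proportion power formula; the 20% baseline open rate and the lift sizes are illustrative assumptions, not anything Braze reports.

```python
# Rough sample size per variant for detecting a lift in open rate.
# Textbook two-proportion power calculation (alpha = 0.05, power = 0.8).
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(p_base, lift, alpha=0.05, power=0.80):
    """Contacts needed per variant to detect p_base -> p_base + lift."""
    p1, p2 = p_base, p_base + lift
    p_bar = (p1 + p2) / 2
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_b = NormalDist().inv_cdf(power)
    numerator = (z_a * sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / lift ** 2)

# Assumed 20% baseline open rate (swap in your own):
print(sample_size_per_variant(0.20, 0.02))  # ~6,500 per variant for a 2-point lift
print(sample_size_per_variant(0.20, 0.05))  # ~1,100 per variant for a 5-point lift
```

If the output is bigger than your list, chase a bolder change instead of a subtle one.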


Step 2: Set Up Your Email Campaign in Braze

Now that you know what you want to test, log in and set up your campaign. Braze calls these “Campaigns” (multi-step journeys live in a separate tool called Canvas, which works differently).

Creating an A/B Test in Braze

  1. Start a New Campaign
     • Go to “Campaigns” and click “Create Campaign.”
     • Choose “Email” as your channel.
  2. Define Your Audience
     • Pick your segment. Be realistic: if your test list is too small, you won’t get useful data.
     • Avoid overlapping segments if you run several tests at once—otherwise, your results will get muddy fast.
  3. Select “A/B Test”
     • On the message composer screen, you’ll see an option to add variants (A, B, C… up to 8).
     • Click “Add Variant” to create your test versions.
  4. Build Your Content Variants
     • Enter your subject lines and body content for each version.
     • Make sure the only major difference between A and B is the thing you’re testing. Otherwise, you won’t know what caused any change.
  5. Set Split Percentages
     • Decide what percent of your audience gets each version.
     • If you want to run a “winner takes all” follow-up, reserve a chunk (say, 20%) for the winner after the initial test.

What NOT to do: Don’t test more than one thing at a time per variant. Don’t run a test with 50 contacts and expect statistically significant results. And don’t let marketing FOMO push you into over-complicating it.
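
One extra safeguard: you can pull the campaign’s configuration back out and eyeball it programmatically. A hedged sketch in Python with the requests library, using Braze’s documented GET /campaigns/details endpoint; the REST URL, API key, and campaign ID are placeholders for your own, and the response fields used below (“messages”, “channel”, “name”) are worth confirming against the API docs for your Braze instance.

```python
# Sanity-check a campaign's variants via Braze's REST API.
import requests

BRAZE_REST_URL = "https://rest.iad-01.braze.com"  # your instance's REST endpoint
API_KEY = "YOUR-REST-API-KEY"                     # needs campaigns.details permission
CAMPAIGN_ID = "your-campaign-api-identifier"      # from the campaign's settings page

resp = requests.get(
    f"{BRAZE_REST_URL}/campaigns/details",
    headers={"Authorization": f"Bearer {API_KEY}"},
    params={"campaign_id": CAMPAIGN_ID},
    timeout=10,
)
resp.raise_for_status()
details = resp.json()

print(details.get("name"))
# List each message variant so you can eyeball what's actually configured.
for variant_id, message in details.get("messages", {}).items():
    print(variant_id, message.get("channel"), message.get("name"))
```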


Step 3: Choose Your Winning Criteria

Braze lets you pick how it decides a “winner.” This gets overlooked, but it matters.

  • Open Rate: Good for subject line tests.
  • Click Rate: Better if you care about what people do after opening.
  • Conversion: If you have events set up (like purchases or signups), this is the gold standard. (The sketch at the end of this step shows how to confirm those events are actually reaching Braze.)

Set your “Winning Metric” and “Evaluation Period.” Don’t make the period too short—give people time to open and interact, especially if you send on weekends or odd hours.

Honest Take: Conversion is what matters for most businesses. Opens are nice, but Apple’s Mail Privacy Protection (and similar features) auto-loads tracking pixels, which inflates open rates and makes them increasingly unreliable.
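
If conversion is your winning metric, here is that event check: a minimal sketch that fires a test event at Braze’s documented POST /users/track endpoint. The REST URL, API key, external_id, and the event name signup_completed are placeholder assumptions, not real identifiers.

```python
# Fire a test conversion event at Braze to confirm it's tracked.
import requests
from datetime import datetime, timezone

BRAZE_REST_URL = "https://rest.iad-01.braze.com"
API_KEY = "YOUR-REST-API-KEY"

resp = requests.post(
    f"{BRAZE_REST_URL}/users/track",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "events": [{
            "external_id": "test-user-123",  # a real test user in your workspace
            "name": "signup_completed",      # whatever your conversion event is called
            "time": datetime.now(timezone.utc).isoformat(),
        }]
    },
    timeout=10,
)
resp.raise_for_status()
print(resp.json())  # success responses include counts like "events_processed"
```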


Step 4: Schedule and Launch Your Test

  • Triple-check your variants for typos and broken links—nothing tanks credibility like a “Hi {{First Name}}” fail. (In Braze’s Liquid, personalization tags look like {{${first_name}}}; a default filter such as {{${first_name} | default: 'there'}} covers contacts with missing data.)
  • Schedule your send for when your audience is actually awake and checking email.
  • Hit “Launch.”

Once it’s out in the wild, resist the urge to peek and call a winner after 15 minutes. Real results take at least a day or two, sometimes longer.


Step 5: Analyze Results Without Fooling Yourself

This is where most A/B tests go sideways. Braze’s reporting is decent, but you need to apply some common sense.

Where to Find Your Results

  • Go to your campaign dashboard.
  • Look for the “Performance” tab.
  • Compare your variants on the metric you actually care about (opens, clicks, conversions). The sketch below pulls the same numbers via Braze’s REST API if you’d rather work in a spreadsheet or notebook.
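
Here is that API pull. A hedged sketch using Braze’s documented GET /campaigns/data_series endpoint; the endpoint and parameters are standard, but the per-variant field names inside the response vary by channel, so treat the parsing as an assumption to verify against the docs.

```python
# Pull per-day, per-variant email stats for a campaign.
import requests

BRAZE_REST_URL = "https://rest.iad-01.braze.com"
API_KEY = "YOUR-REST-API-KEY"
CAMPAIGN_ID = "your-campaign-api-identifier"

resp = requests.get(
    f"{BRAZE_REST_URL}/campaigns/data_series",
    headers={"Authorization": f"Bearer {API_KEY}"},
    params={"campaign_id": CAMPAIGN_ID, "length": 14},  # last 14 days
    timeout=10,
)
resp.raise_for_status()

for day in resp.json().get("data", []):
    # Field names below are typical for email stats; verify against the docs.
    for email_stats in day.get("messages", {}).get("email", []):
        print(
            day.get("time"),
            email_stats.get("variation_name"),
            email_stats.get("sent"),
            email_stats.get("unique_opens"),
            email_stats.get("unique_clicks"),
        )
```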

But don’t just eyeball the numbers.

Things That Trip People Up

  • Small Sample Size: If you’re testing on a tiny group, a few lucky opens can make a dud look like a winner. For most email tests, you want at least a few hundred people per variant, and far more if you’re chasing a small lift (see the sample-size sketch in Step 1).
  • Statistical Significance: Braze will sometimes say you have a “winner” early—but if your numbers are close, it’s probably just noise.
  • Time Zones & Send Timing: If you send at 4pm in New York, folks in California might not see it until later. Be consistent or segment by geography.

A Simple Way to Gut-Check Your Results

Don’t obsess over p-values unless you’re running huge lists. Instead:

  • Look for big differences (5%+ change in open/click/conversion rates).
  • If the “winner” only edges out the other by a fraction of a percent, call it a tie and try a bolder test next time (the sketch below turns both checks into a reusable function).
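
Here is that gut-check as code. The 300-contact floor and five-point gap are this guide’s rules of thumb, not Braze settings; the z-score line is a standard two-proportion test for when you do want a rough significance read.

```python
# The gut-check above as a function. Thresholds are rules of thumb;
# tune them to your volumes.
from math import sqrt

def gut_check(conv_a, n_a, conv_b, n_b, min_n=300, min_gap=0.05):
    """conv_* = successes (opens/clicks/conversions), n_* = sends per variant."""
    rate_a, rate_b = conv_a / n_a, conv_b / n_b
    gap = abs(rate_a - rate_b)
    if min(n_a, n_b) < min_n:
        return "sample too small: don't trust either number"
    if gap < min_gap:
        return f"gap of {gap:.1%}: call it a tie and run a bolder test"
    # Optional significance read: a standard two-proportion z-test
    # (|z| > 1.96 is roughly significant at the 5% level).
    pooled = (conv_a + conv_b) / (n_a + n_b)
    z = (rate_a - rate_b) / sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return f"gap of {gap:.1%}, z = {z:.2f}: looks like a real difference"

print(gut_check(110, 500, 85, 500))  # clear gap on a decent sample
print(gut_check(12, 60, 9, 60))      # too small to call
```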

Pro Tip: If your result feels “meh,” you’re not alone. Most A/B tests show little difference. That’s normal. Don’t chase tiny wins.


Step 6: Act on What You Learned

  • If you have a clear winner, use that version for your next full send.
  • Archive or document your results somewhere—not just in Braze, but in whatever your team uses to track learnings (a shared CSV works fine; see the sketch after this list).
  • Share what you learned, even if it’s “no difference.” That’s valuable too.
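
On the documentation point: if your team has no test log yet, even a shared CSV beats nothing. A trivial sketch; the file name and columns are just one way to slice it.

```python
# Append each finished test to a shared CSV so learnings outlive Braze's UI.
import csv
from datetime import date

def log_test(path, hypothesis, metric, variant_a, variant_b, outcome):
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([
            date.today().isoformat(), hypothesis, metric,
            variant_a, variant_b, outcome,
        ])

log_test(
    "ab_test_log.csv",
    "Shorter subject line lifts opens by 2+ points",
    "unique_opens",
    "A: 22.1% (n=6500)",
    "B: 20.0% (n=6500)",
    "winner: A",
)
```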

What to Ignore: People will want to read too much into one test. Don’t let one result drive all your strategy. Run a few tests, look for consistent patterns, and don’t be afraid to repeat or tweak experiments.


Step 7: Keep It Simple, Rinse, Repeat

A/B testing is a tool, not magic. If you’re running dozens of tests with tiny differences and small lists, you’re burning time for little payoff. Focus on big changes, clear hypotheses, and enough volume to get real answers.

You don’t need to reinvent your email strategy every week. Use A/B testing in Braze to learn what actually works for your audience, keep what’s effective, and move on. Most of all, don’t get lost in the weeds. Test, learn, repeat—and keep your eyes on what matters: sending better emails, not just running more tests.