A/B Testing
A/B testing lets you send two variants of a campaign to a small portion of your audience, measure which performs better, and then automatically (or manually) send the winning variant to the rest. Use it to systematically improve your open and click rates over time.
Overview
EmailSendX A/B testing works by splitting your audience into groups. Two groups (A and B) each receive a different variant of the campaign. After a test period, the variant with better performance wins and is sent to the remaining audience.
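As a mental model, here is a minimal Python sketch of that split. The hashing scheme and group boundaries are illustrative assumptions, not EmailSendX's actual implementation; they just show how a recipient list can be divided deterministically into A, B, and remainder groups.

```python
import hashlib

# Illustrative sketch only: this hashing scheme is an assumption to show
# the idea of a deterministic split, not EmailSendX's implementation.
def assign_group(email: str, test_fraction: float = 0.20) -> str:
    """Bucket a recipient into 'A', 'B', or 'remainder'.

    Hashing the address yields a stable, roughly uniform value in [0, 1],
    so the same recipient always lands in the same group.
    """
    digest = hashlib.sha256(email.lower().encode("utf-8")).hexdigest()
    position = int(digest[:8], 16) / 0xFFFFFFFF
    if position < test_fraction / 2:
        return "A"
    if position < test_fraction:
        return "B"
    return "remainder"

audience = ["ana@example.com", "ben@example.com", "cam@example.com"]
print({email: assign_group(email) for email in audience})
```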
What you can test:
- Subject lines — the most common and high-impact test. Different subject lines can have dramatically different open rates.
- From names — test sending from "Alex at Acme" vs "Acme Corp" vs just "Alex". Sender recognition affects open rates especially in crowded inboxes.
- Send times — test Tuesday morning vs Thursday afternoon. Audience engagement varies by day and time.
Statistical significance requires volume
A/B tests need enough recipients per variant for the result to be trustworthy. With only a couple of hundred recipients per group, a difference of a few percentage points can easily be noise. If your list is small, use a larger test split, or treat results as directional rather than conclusive.
Setting Up an A/B Test
When creating a new campaign, toggle Enable A/B Test on the campaign setup screen (before choosing your editor).
- Choose what to test: select Subject Line, From Name, or Send Time.
- Set the test split percentage: the percentage of your total audience that will be split between the two variants (see the sketch after this list for the arithmetic). Common choices:
  - 20% test (10% A + 10% B), 80% gets the winner — good for large lists
  - 40% test (20% A + 20% B), 60% gets the winner — better statistical confidence
  - 50/50 — no winner send; all results are used for learning only
- Set the test duration: how long to wait before evaluating results and sending the winner. Range: 1 hour to 7 days. A 4–24 hour window is typical for open rate testing.
- Choose winner metric: Open Rate (default, used for subject line tests) or Click Rate (better for testing content variations, though that's not a built-in A/B type yet).
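To make the split percentages concrete, here is a small Python sketch. The function name is hypothetical, not an EmailSendX API; it just computes the group sizes for a given list size and test percentage.

```python
# Hypothetical helper (not an EmailSendX API) showing the split arithmetic:
# a 20% test on 10,000 recipients means 1,000 per variant and 8,000 held
# back for the winner send.
def split_sizes(list_size: int, test_percentage: int) -> dict:
    per_variant = list_size * test_percentage // 100 // 2
    return {
        "variant_a": per_variant,
        "variant_b": per_variant,
        "remainder": list_size - 2 * per_variant,
    }

print(split_sizes(10_000, 20))  # {'variant_a': 1000, 'variant_b': 1000, 'remainder': 8000}
print(split_sizes(10_000, 40))  # {'variant_a': 2000, 'variant_b': 2000, 'remainder': 6000}
```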
Configuring Variants
After enabling A/B and completing setup, the campaign editor shows variant tabs: Variant A and Variant B.
Subject line test
Each variant tab has its own subject line and preview text fields. The email body is identical for both variants — only the subject differs. Enter your two subject line ideas:
- Variant A: "Your April newsletter is here"
- Variant B: "3 things you missed this month"
From name test
Each variant specifies a different from name (and optionally a different from email address). The subject line and body are shared.
- Variant A from: Acme Corp
- Variant B from: Alex from Acme
Send time test
Both variants are identical in content, but each is scheduled at a different time. Configure the two send times on the scheduling screen. The variant with the higher open rate after the test period wins, and the winner's time is used for the remainder send.
Test one variable at a time
If Variant A and Variant B differ in more than one way (for example, a different subject line and a different from name), you can't tell which change drove the result. Change exactly one thing per test and keep everything else identical.
Winner Selection
After the test period ends, EmailSendX evaluates the two variants and determines a winner.
Automatic winner selection
When automatic selection is enabled, EmailSendX compares the winning metric (open rate or click rate) between variants at the end of the test period. The variant with the higher metric is sent to the remaining audience immediately — no action needed from you.
If the variants are within 1% of each other (effectively a tie), EmailSendX sends Variant A to the remainder to ensure the campaign delivers on time.
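The selection rule boils down to something like the following sketch. It reads "within 1%" as a gap of less than one percentage point; the function is illustrative, not EmailSendX code.

```python
# Sketch of the automatic selection rule described above. It treats "within
# 1%" as a gap of less than one percentage point; names are illustrative.
def pick_winner(rate_a: float, rate_b: float, tie_margin: float = 1.0) -> str:
    """Rates are percentages, e.g. 28.4 for a 28.4% open rate."""
    if abs(rate_a - rate_b) < tie_margin:
        return "A"  # effective tie: fall back to Variant A so the send stays on time
    return "A" if rate_a > rate_b else "B"

print(pick_winner(28.4, 36.6))  # 'B'
print(pick_winner(30.2, 30.9))  # 'A' (gap is inside the tie margin)
```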
Manual winner selection
When manual selection is enabled, EmailSendX sends you an email notification when the test period ends, showing the results for both variants. You review the data and click Send Variant A or Send Variant B to trigger the winner send. The remainder holds until you make a selection.
The remainder will wait up to 7 days for your selection. After 7 days without action, EmailSendX automatically sends the leading variant.
Don't let it expire
If you choose manual selection, act on the results notification promptly. Letting the 7-day window lapse means the decision falls back to automation anyway, and the remaining audience receives the campaign up to a week late.
Reading A/B Test Results
Full A/B test results appear on the campaign's analytics page. Switch between variant views or see a side-by-side comparison like this one:
| Metric | Variant A | Variant B |
|---|---|---|
| Recipients | 500 | 500 |
| Opens | 142 (28.4%) | 183 (36.6%) |
| Clicks | 31 (6.2%) | 40 (8.0%) |
| Unsubscribes | 4 (0.8%) | 2 (0.4%) |
| Winner | — | Yes (higher open rate) |
After the winner is sent, the analytics page shows a combined view of all three sends (Variant A, Variant B, and the winner send to the remainder) with a total campaign performance summary.
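If you want to sanity-check a result like the sample table above, a standard two-proportion z-test needs nothing more than the open counts. This is ordinary statistics, not a built-in EmailSendX feature; the sketch below checks whether Variant B's 36.6% vs 28.4% open rate could plausibly be chance.

```python
from math import erf, sqrt

# Standard two-proportion z-test, applied to the sample table above
# (142/500 opens vs 183/500). This is ordinary statistics, not a built-in
# EmailSendX feature.
def two_proportion_z(opens_a: int, n_a: int, opens_b: int, n_b: int):
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(142, 500, 183, 500)
print(f"z = {z:.2f}, p = {p:.4f}")  # z ≈ 2.77, p ≈ 0.006: unlikely to be chance
```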
Building a learning log
A/B testing compounds in value over time. Consider maintaining a spreadsheet or note with the results of every A/B test:
- What you tested
- The two variants
- Open/click rates for each
- Winner and margin
- Conclusion and what to apply to future campaigns
Over a dozen or more tests, clear patterns emerge about what resonates with your specific audience.
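One lightweight way to keep that log is a plain CSV you append to after each test. The file name and columns below are just one suggestion, not a fixed format.

```python
import csv
from pathlib import Path

# One possible shape for the learning log: a plain CSV appended to after
# each test. File name and columns are a suggestion, not a fixed format.
FIELDS = ["date", "tested", "variant_a", "variant_b",
          "open_rate_a", "open_rate_b", "winner", "margin", "takeaway"]

def log_test(path: str, entry: dict) -> None:
    file = Path(path)
    is_new = not file.exists()
    with file.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow(entry)

log_test("ab_tests.csv", {
    "date": "2025-04-02", "tested": "subject line",
    "variant_a": "Your April newsletter is here",
    "variant_b": "3 things you missed this month",
    "open_rate_a": 28.4, "open_rate_b": 36.6,
    "winner": "B", "margin": "+8.2 pts",
    "takeaway": "curiosity-gap subject beat the plain label",
})
```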
Start optimizing your campaigns
A/B test your subject lines, from names, and send times to systematically improve open rates and click rates over time.