Email-Marketing Optimization — 7 Steps to Effective A/B Split Testing
A/B split testing is a great method for optimizing your email-marketing campaigns, and you may be surprised by how easy it is to implement. Basically, it involves testing two versions of your email on a small group of your subscribers, determining which version gets the better results, and then sending the winning version to the rest of your subscribers. You can also use this method to test elements of website landing pages.
You can use the information you obtain from A/B split testing to fine-tune subsequent email campaigns, while continuing to test additional variables. In this way, you'll achieve steady improvements in the results of your email-marketing campaigns, one test at a time.
Here are seven essential steps for implementing an effective A/B split test and improving your email-marketing return on investment:
- Choose what you want to test. There are countless variables in an email campaign where a seemingly small change can have a big impact on your results. The following list gives you an idea of some of the email-campaign variables you may want to test:
- Email subject lines
- From lines
- Calls to action (e.g., location, size, color, copy)
- Headlines
- Message layouts
- Design
- Images
- Offers and incentives
- Adding a video
- Fonts (type, colors, and sizes)
- Personalization
- Best weekdays to send
- Send time (e.g., morning versus late afternoon)
- Choose what measure of success you’ll use to determine the winner. Success measurements for email campaigns include:
- Open rates
- Click-through rates
- Conversion rates
- Revenue
When choosing this measure, be sure to consider the results you're really after. Let's say, for example, that you're testing subject lines and lead generation is the primary goal of your email campaign. One subject line may produce a higher open rate, yet the other may actually generate more qualified leads.
- Develop a research question to test. This is a good way to help you stay focused on the outcome of your A/B split test. Some examples of research questions include:
- Which call to action produces the highest conversion rates?
- Which version provides a higher click-through rate — with a video or without a video?
- Which day will generate a higher open rate — Monday or Tuesday?
- Which headline results in the highest click-through rate?
- Test only one variable at a time. This is essential for accurate results. For example, if you're testing subject lines and you send version A to one group in the morning and version B to another group in the afternoon, you won't know whether it was the time of day or the subject line that had the bigger effect on open rates.
- Randomly divide your list into two equal segments. Ideally, your two test groups would be similar in demographics, behavior, income, age, gender, and so on, but you can still get valid results by simply dividing your list at random, because randomization tends to balance those factors across the groups (see the sketch after this list).
- Be sure your sample size is large enough to make a statistically valid decision. How big should your samples be? That depends largely on how big your email list is and how small a difference you need to detect. If you have a very large list, you may be able to use only 10% of it for testing; if you have a small list, you may need to use the entire list to get statistically valid results from an A/B split test. If statistics isn't your thing, don't spend too much time trying to figure it out. Use online tools, such as the Split Test A/B Test Marketing Calculator, or contact the experts at FulcrumTech for help; the sketch after this list also shows one way to estimate the numbers.
- Make A/B split testing faster and easier by using readily available testing tools. Many email service providers (ESPs) now offer useful tools for running A/B split tests on emails and websites. For example, these tools can:
- Randomly select your two samples.
- Track results and generate a report with a side-by-side comparison.
- Select the winning version and automatically send it to the rest of your list.
If you have questions about finding and choosing the right tools to use, FulcrumTech can help in this area, too.
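If your ESP doesn't handle the split for you, the random division and sample-size estimation described above are easy to script yourself. The following is a minimal sketch in Python, assuming your list is simply a collection of email addresses; the function names are ours for illustration, and the sample-size calculation uses the standard two-proportion formula rather than anything specific to a particular ESP or calculator.

```python
import math
import random
from statistics import NormalDist


def sample_size_per_group(baseline_rate, expected_rate, alpha=0.05, power=0.80):
    """Rough per-group sample size for detecting a lift between two proportions.

    Uses the standard two-proportion estimate; treat the result as a ballpark figure.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # e.g., 1.96 for 95% confidence
    z_power = NormalDist().inv_cdf(power)           # e.g., 0.84 for 80% power
    p_bar = (baseline_rate + expected_rate) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * math.sqrt(baseline_rate * (1 - baseline_rate)
                                       + expected_rate * (1 - expected_rate))) ** 2
    return math.ceil(numerator / (baseline_rate - expected_rate) ** 2)


def random_split(subscribers, per_group):
    """Shuffle the list and carve out two equal, non-overlapping test groups.

    Everyone left over is the holdout that later receives the winning version.
    """
    shuffled = list(subscribers)      # copy so the original list isn't reordered
    random.shuffle(shuffled)
    group_a = shuffled[:per_group]
    group_b = shuffled[per_group:2 * per_group]
    holdout = shuffled[2 * per_group:]
    return group_a, group_b, holdout


# Example: a 20% baseline open rate and a hoped-for lift to 23%
n = sample_size_per_group(0.20, 0.23)
print(f"Recipients needed per test group: {n}")
```

With a 20% baseline open rate and a hoped-for 3-point lift, this formula suggests roughly 3,000 recipients per test group at 80% power, which is in line with the 5,000-per-group samples used in the example below.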
Email Marketing A/B Split Test Example
The following is a typical example of an A/B split test for an email campaign. Let's say you want to test the effectiveness of two different subject lines, so your research question would be:
Which subject line will deliver a better open rate — version A or B?
Then you would:
- Create a control email (Email #1) with version A of the subject line.
- Create a second test email (Email #2) with version B of the subject line.
- Send each email to an equal-sized, random sample from your total list of 50,000 subscribers, on the same day and at the same time.
Results:
- Email #1 — Subject line A
Sent to 5,000 subscribers
Open rate = 20%
- Email #2 — Subject line B
Sent to 5,000 subscribers
Open rate = 23%
In this case, open rate is the measure of success. When the two open rates are analyzed for statistical significance, you find that the improvement from subject line B over subject line A is indeed statistically significant at a 95% confidence level. At that point, send the email with the winning subject line to the rest of your email-marketing list.
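If you'd like to verify that kind of claim yourself, the usual check for a difference between two open rates is a two-proportion z-test. Here is a minimal sketch in Python (standard library only) that re-runs the numbers from the example above; the function name is ours, and the calculation is illustrative rather than a description of any particular tool.

```python
import math
from statistics import NormalDist


def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for the difference between two proportions
    (e.g., opens out of emails delivered for versions A and B)."""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value


# Email #1: 20% of 5,000 opened; Email #2: 23% of 5,000 opened
z, p = two_proportion_z_test(successes_a=1000, n_a=5000, successes_b=1150, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 means significant at the 95% confidence level
```

Because the resulting p-value comes out well below 0.05, the 23% open rate is a statistically significant improvement over 20% at the 95% confidence level, matching the conclusion above.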
Need help developing a more effective online marketing strategy for your organization? Email us or give us a call at 215-489-9336, and the email-marketing experts at FulcrumTech will get you started today.
Check out these useful online tools for A/B split testing of email campaigns: