
A simple explanation of A/B Split testing and why you should do it

So you want to run an email campaign and you're excited to get started, but you're stuck deciding which subject line will get you the highest open rates, or which day of the week to send the campaign. A simple way to learn what truly works for your business and subscribers is A/B split testing.

So what is A/B split testing?

It's very much what it sounds like: you send different versions of your campaign to separate, comparable groups of your subscribers to see which version gets the best results (such as opens, click-through rate, conversions or shares). For example, say you have a list of 50,000 subscribers and want to know whether it is better to send the email on Monday or Friday. You could create two separate campaigns, send version A to 25,000 of your subscribers on Monday, and send version B to the other 25,000 on Friday.
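The split itself should be random, so both groups are comparable. Here is a minimal Python sketch of that step; the function name and the example email addresses are illustrative, not part of any particular email platform:

```python
import random

def split_ab(subscribers, seed=42):
    """Randomly split a subscriber list into two equal groups:
    group A receives version A, group B receives version B."""
    shuffled = list(subscribers)
    random.Random(seed).shuffle(shuffled)  # fixed seed keeps the split reproducible
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

# Example: 50,000 subscribers split into two groups of 25,000
subscribers = [f"user{i}@example.com" for i in range(50_000)]
group_a, group_b = split_ab(subscribers)
```

Shuffling before splitting matters: slicing the list in its original order could put, say, your oldest subscribers all in one group and skew the result.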

This way, you can step back and analyze which campaign was better received by your subscribers. By comparing the two campaigns, you can identify what worked and what didn't, and carry those insights into future campaigns (with minor variations where necessary) so that you consistently get good results.

Image 1: Representation of an A/B mailer test. The mailer sent on Monday performed better, with higher open rates than the one sent on Friday.

Another way of doing A/B testing is to use only a subset of your subscriber list. For example, with a list of 50,000 recipients, you could mark 2,000 of them for testing: send campaign A to 1,000 and campaign B to the other 1,000. You then analyze which was more successful and send the winning campaign to the remaining 48,000 to get the best results.
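The subset approach above can be sketched in a few lines of Python. This is a simplified illustration, assuming open rate is the metric you care about; the function names and numbers are only examples:

```python
import random

def plan_subset_test(subscribers, test_size=2000, seed=7):
    """Set aside a small test pool, split it between versions A and B,
    and keep the remainder for whichever version wins."""
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)
    test_pool, remainder = pool[:test_size], pool[test_size:]
    half = test_size // 2
    return test_pool[:half], test_pool[half:], remainder

def pick_winner(opens_a, sent_a, opens_b, sent_b):
    """Compare open rates and return the better-performing version."""
    return "A" if opens_a / sent_a >= opens_b / sent_b else "B"

subscribers = [f"user{i}@example.com" for i in range(50_000)]
group_a, group_b, remainder = plan_subset_test(subscribers)
# Suppose version A got 240 opens and version B got 180, out of 1,000 each:
winner = pick_winner(240, 1000, 180, 1000)  # -> "A"
# You would then send version A to the 48,000 addresses in `remainder`.
```

Note that with small test pools the difference between versions can be noise, so the larger the test pool you can afford, the more trustworthy the winner.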

Different kinds of A/B split tests

The possible A/B split tests are limited only by your imagination. The most common tests involve timing: when (time, day, date) you dispatch your campaign. Another common test concerns the format or design of the mailer, such as the text-to-image ratio, content placement, calls to action, colours, etc. Here we'd note that fewer images are generally better for your mailers (since many email clients block images by default), but A/B split testing is a great way to find the ratio that works for you.

Another important but often overlooked test involves the subject line. The subject line is one of the single biggest factors in whether a reader opens your mail at all, and we've seen that a great subject line goes a long way toward improving a campaign's open rates. Other A/B test options include font size, the placement and size of calls to action, colour schemes, recipient location or demographics, subscriber behaviour and history, and so on. However you can segment your data, you have an option for this kind of test campaign.

If you want to truly optimize the effectiveness of your mailers, A/B testing is something you should definitely be doing. Knowing which formats and elements work best unlocks greater engagement from your customers. It all comes down to understanding what your customers like and tailoring each campaign to their preferences and requirements, leading to more productive conversations.


-Team Octane

