What is A/B/n Testing?
You may already be familiar with A/B testing (also called a split test), where two variants of a webpage or popup are tested against each other to see which gets the better conversion rate. In this test, visitors are randomly shown one of the two campaigns with a 50:50 split to determine which variant works best.
A/B/n testing is a more comprehensive extension of this test, where the "n" stands for the number of variants. Usually, at least 3 variants of the campaign are tested, with traffic divided equally among them. At the end of the test, you can see how many conversions each variant generated and determine which performed best.
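To make the mechanics concrete, here is a minimal sketch (illustrative only, not the product's actual implementation) of how visitors could be randomly assigned to n variants with an equal split:

```python
import random

def assign_variant(variants):
    """Randomly assign a visitor to one of the variants with equal probability."""
    return random.choice(variants)

# With three variants, each receives roughly a third of the traffic.
variants = ["A", "B", "C"]
counts = {v: 0 for v in variants}
for _ in range(30000):
    counts[assign_variant(variants)] += 1
# Each count ends up close to 10000 (a third of 30000).
```

Because assignment is random rather than round-robin, individual counts fluctuate slightly, but over many visitors the split converges to equal shares.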
Why is A/B/n testing used and why is it important?
A/B/n testing helps you gather useful insight into what works for your customers. Whether it is specific text, images, colors, or a different CTA, an A/B/n test replaces guesswork with empirical evidence about which variant brings in more engagement and conversions.
Not only that, an A/B/n test also reveals which variant performs worst even though it received an equal share of the traffic. Using this data, you can determine which content, CTAs, and images don't work for your customers and improve your overall marketing strategy.
A/B/n Testing in On-site Messaging Popups
Creating your first A/B/n Test
The first step in creating an A/B/n test is to create a control campaign that serves as the base variant. Once you have set up the campaign skeleton, decide on the number of variants you need. We recommend starting with 2 and adding more if required.
Steps to create your A/B/n test variants:
- Head to the All Campaigns section from the main menu.
- Click on the ellipsis (three dot menu) on the campaign which you want to A/B/n test.
- From the menu, click on A/B Test Campaign.
- Two variants of the control campaign will be created.
- Click on the Edit button and customize both the variants.
- You can also set the percentage of traffic each variant gets. Click on the percent option right next to the variant name and change the split ratio between the created variants.
To create more than 2 variants, click on the ellipsis (three dot menu) on any of the variants and choose the Copy as new variant option. You will get the same Edit and percentage options, with traffic now distributed among 3 campaigns. Similarly, you can create additional variants.
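The custom percentage split described above boils down to weighted random assignment. A hypothetical sketch (the variant names and weights here are made up for illustration):

```python
import random

def assign_weighted(split):
    """Pick a variant according to its traffic percentage.

    `split` maps variant name -> traffic percentage; the
    percentages are assumed to sum to 100.
    """
    names = list(split)
    weights = [split[name] for name in names]
    return random.choices(names, weights=weights, k=1)[0]

# e.g. a 50/30/20 split across three variants
split = {"Control": 50, "Variant B": 30, "Variant C": 20}
counts = {name: 0 for name in split}
for _ in range(10000):
    counts[assign_weighted(split)] += 1
# Counts land near 5000 / 3000 / 2000 respectively.
```

Keeping the weights proportional rather than equal lets you, for example, send most traffic to a proven control while cautiously testing a risky new variant.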
A/B/n Test Results, Analysis and Comparison
There are three levels at which you can analyze the split test:
- Consolidated stats for the entire campaign where you can check View, CTR, Emails captured and other KPIs.
- Segregated results for each of the A/B/n variants. This will provide insight into why one variant worked better than others.
- Finally, you can check the comparative A/B/n test result where you will see the conversion comparison of all the variants. This gives you a top-level view of which variant worked the best with the share of the traffic it received.
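The comparative view essentially reduces each variant to its conversion rate. A sketch of that calculation, using hypothetical numbers (the variant names and figures below are not real results):

```python
def conversion_rate(conversions, views):
    """Conversion rate as a percentage of views, rounded to 2 decimals."""
    return round(100 * conversions / views, 2) if views else 0.0

# Hypothetical results from a three-variant test
results = {
    "Control":   {"views": 4100, "conversions": 164},
    "Variant B": {"views": 3950, "conversions": 237},
    "Variant C": {"views": 4020, "conversions": 181},
}

rates = {name: conversion_rate(r["conversions"], r["views"])
         for name, r in results.items()}
# Highest conversion rate wins the comparison.
winner = max(rates, key=rates.get)
```

Comparing rates rather than raw conversion counts matters because the variants rarely receive exactly the same number of views.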
Promoting a winner
Once the testing has finished, it's time to select the winner. Using the analysis of each variant, you can determine the winner and promote it to be the main campaign.
To set a variant as the winner, click on the ellipsis (three dot menu) in the variant row and click on Promote as Winner. Once done, this variant will become the primary campaign and will receive 100% of the traffic.
Pitfalls to avoid in A/B/n Testing
There are three things to avoid while building your A/B/n testing strategy:
- Keep the number of variants as low as possible to avoid diluting your traffic.
- Try to change only 1 variable at a time when A/B testing. Changing multiple variables at the same time makes it hard to tell which change actually drove the result.
- Don't stop testing. While a certain variation may have worked for one campaign, it may not be as significant for another. Therefore, you should keep testing and refining the results for future campaigns.