Variant B’s conversion rate (1.14%) was 14% higher than variant A’s conversion rate (1%). The difference is statistically significant at the 95% confidence level, so you can be reasonably sure variant B genuinely performs better than variant A.
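The 14% figure above is a *relative* lift, not the difference in percentage points. A quick sketch of the arithmetic (the rates are taken from the example above):

```python
# Relative lift between two conversion rates, as in the example above.
conv_a = 0.01    # variant A's conversion rate (1%)
conv_b = 0.0114  # variant B's conversion rate (1.14%)

relative_lift = (conv_b - conv_a) / conv_a
print(f"{relative_lift:.0%}")  # prints "14%"
```

Note that the absolute difference is only 0.14 percentage points; dividing by variant A's rate is what gives the 14% relative improvement.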
A/B testing is a way of testing two or more variations of something to see which is more successful. It’s often applied to webpages, adverts or emails, where success is measured in the number of conversions each version generates. While it’s easy to have a quick, superficial look at the results of these tests, it’s worth taking the time to look a bit closer.
Considering the statistical significance of your results means you can be sure you’re drawing accurate and meaningful conclusions. A statistical significance calculator tells you whether the difference you’re seeing is likely to be down to random chance or whether it reflects a genuine effect.
For surveys, SurveyMonkey will automatically calculate the statistical significance of your results. It’s all contained within the Analyze section of your survey.
When it comes to A/B testing something like a live webpage, you could look at the number of visitors and conversions for each variant and work things out with a run-of-the-mill calculator, but it’s not that simple: there’s plenty of room for error. We’ve created the above calculator to make this much easier. Just pop in the number of visitors and conversions for each variant and click “calculate” to see whether the difference between the two is statistically significant.
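Under the hood, calculators like this typically run a two-proportion z-test on the two conversion rates. The exact method behind the calculator above isn’t documented here, so treat the following as an illustrative sketch rather than its actual implementation (the visitor and conversion numbers are made up):

```python
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(visitors_a, conversions_a, visitors_b, conversions_b,
                    two_sided=True):
    """P-value for the difference between two conversion rates,
    using a standard two-proportion z-test."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference).
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    tail = 1 - NormalDist().cdf(abs(z))
    return 2 * tail if two_sided else tail

# Hypothetical example: 10,000 visitors per variant, 100 vs 140 conversions.
p = ab_test_p_value(10_000, 100, 10_000, 140)
print(f"p-value: {p:.4f}")  # significant at the 95% level if p < 0.05
```

With these made-up numbers the p-value comes out well under 0.05, so the difference would count as statistically significant.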
The significance testing calculator also asks you to select either a one-sided hypothesis or a two-sided hypothesis. A one-sided hypothesis is one that looks at whether one option performed better than the other, but not vice versa. For instance, it might look at whether variant A performed better than variant B. It doesn’t consider whether variant B performed better than variant A.
Meanwhile, a two-sided hypothesis accounts for both possibilities. Since you rarely know in advance which variant will come out on top, it’s generally what we’d recommend for A/B testing.
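The choice matters because, for the same data, a two-sided test produces a p-value twice as large as the one-sided test, so a result can look significant one-sided but not two-sided. A small sketch with a hypothetical z-score (the value 1.8 is made up for illustration):

```python
from statistics import NormalDist

# Hypothetical z-score from an A/B test where variant B outperformed variant A.
z = 1.8

one_sided = 1 - NormalDist().cdf(z)              # tests only "B beats A"
two_sided = 2 * (1 - NormalDist().cdf(abs(z)))   # tests a difference either way

print(f"one-sided p = {one_sided:.3f}")  # prints "one-sided p = 0.036"
print(f"two-sided p = {two_sided:.3f}")  # prints "two-sided p = 0.072"
# Here the one-sided test clears the 0.05 bar, but the two-sided test doesn't.
```

This is exactly why the two-sided test is the safer default: it demands stronger evidence before declaring a winner.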
Let’s imagine you run an e-commerce site that sells trainers. You’re redesigning your website and have a few options to choose from. Naturally, you want to find out which design performs better by looking at which sells more. An A/B test is perfect for this sort of thing.
At the end of the testing period, you can see the overall visitors and conversions for each. While option A looks impressive, with a large number of conversions overall, you want to be sure. (You’re putting a lot of money behind the website revamp after all!) So you use the calculator to check whether option A results in a markedly higher proportion of sales per visitor.
This is just one example. There are all sorts of different scenarios where A/B testing can be useful. And remember that “conversions” aren’t always actual sales. A conversion can be a customer completing any action that you want to track, like signing up for a newsletter or filling out a contact form.
Try sending a survey to your customers to find out what they’re looking for.