Ten Tips for More Effective A/B Testing

How to maximise the success rate of your A/B testing.


Thanks to tools like Visual Website Optimizer and Optimizely, setting up an A/B test is relatively easy these days. However, creating an A/B test that delivers great results can be a trickier proposition.

A/B testing doesn’t need to be trial and error. A scattergun approach is likely to result in frustrating tests that show no significant improvements over time.

Here are ten tips on how to make your A/B testing more effective:
  1. Have a hypothesis before you start - A/B testing should always be based on a solid hypothesis.

    There’s no point in doing random A/B testing without knowing what problem you are trying to solve. Don’t waste your time constantly changing the colour of your call-to-action button if the problem lies elsewhere.

    Look at your web analytics, or better yet speak to your customers to find out what problems they might be facing when using your website. Try to focus on one area that you want to improve. A more strategic A/B test is far more likely to yield positive results.
  2. Calculate your sample size before you start - Sample size will have a profound impact on the effectiveness of your test. Don’t put yourself in a position where you are running an A/B test for months on end because there aren’t enough visitors to declare a statistically significant winner; a quick way to estimate the numbers you need is sketched after this list.

    Look at your website data before you start and identify the key touchpoints on your visitors’ path to conversion. Start with these pages and work your way out.
  3. Give it time - An A/B test is only as good as the sample size it has to work with. Some tests will require patience on your part. If one variation takes an early lead, resist the urge to assume that it will go on to be declared a winner.

    Don’t take statistically significant improvements as gospel. I’ve seen plenty of instances where results have changed drastically even after an A/B testing tool has declared one variation a winner; a simple simulation of this ‘peeking’ effect is included after this list.

    [Figure: A/B test report – two variations tracked over several weeks]


    If you were to look at the example above after the first week, you might be tempted to believe that the variation tracked in green was going to be the winning variation. However, the results over a longer period show that there was actually very little to differentiate the two variations.
  4. Only test what you can change - Tools like Visual Website Optimizer and Optimizely are great for making quick visual changes to a page design thanks to their WYSIWYG interfaces. However, actually implementing those changes on your website can often be far more complicated and require extensive development effort.

    Always make sure what you test can actually be implemented. Don’t waste your time coming up with an amazing design only to find out that it will never go live even if it shows an improvement.
  5. Drop the ego - Don’t go into a test assuming that your variation is obviously better and that it’s only a matter of time until it’s declared the winner – your users might not agree with you.

    Try to detach yourself from your design once the test is running. It’s all too easy to jump to conclusions early on when you see that your variation is winning. Don’t damage the integrity of the test by being possessive.
  6. Think about more than looks - A/B testing shouldn’t be focused on finding the prettiest design. Some of the biggest improvements can come from making changes to your headlines and web copy. Not only are these tests easy to set up, but the winning changes can also be implemented quickly after the test finishes.
  7. Think about marginal gains - At the 2004 Olympics the British cycling team earned a modest haul of four medals – two gold, one silver, one bronze. Four years later at the 2008 Olympics, Team GB dominated the sport, taking home an impressive 14 medals – eight gold, four silver, two bronze.

    What was the secret behind this massive improvement? The aggregation of marginal gains.

    The man behind the success was Dave Brailsford, the team’s Performance Director, who revolutionised the way the team trained and prepared for competitions.

    “The whole principle came from the idea that if you broke down everything you could think of that goes into riding a bike, and then improved it by 1%, you will get a significant increase when you put them all together” – Dave Brailsford (2012)

    Added together, all those small improvements led to big gains on the track.

    ‘Aggregation of marginal gains’ is not a new idea, but it’s a philosophy worth adopting when approaching A/B testing and conversion rate optimisation. Making slight improvements to the many small factors that influence your conversion rate will lead to big gains when added together – anything from removing distracting social share buttons, to simplifying a form in the checkout process, to reducing the number of outbound links on a page. A quick calculation showing how these gains compound follows the list below.
  8. Don’t expect big results from small changes - Small tweaks to your design will generally yield small returns.

    That said, starting small and working your way towards bigger changes is not a bad way to work. There could be certain aspects of your design that are working for your users and that would be lost in a complete overhaul. Without testing the small things first, you may never know what users liked in the first place.
  9. Don’t go too big - The main issue with quantitative testing methods like A/B testing is that you will often be in the dark as to why users preferred one variation over another. If you are making sweeping changes to your design, you can lose sight of the ‘why’ even after a winning variation has been found.

    If you are making big changes to your design, it’s probably worth considering multivariate testing instead of A/B testing, as it will give you a better understanding of which aspects of a new design worked and which didn’t.
  10. Manage expectations - Not every test is going to show massive improvements, so you need to be realistic in managing the expectations of the stakeholders involved.

    It’s worth remembering that the value of a conversion rate improvement is entirely relative to the scale of your business.

    Make a 1% improvement for a website like Amazon and you are talking about an annual revenue increase in the hundreds of millions. However, a 1% improvement for a website with only a few hundred visitors a month probably won’t make a huge difference to the bottom line.
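
To put tip 2 into practice, here is a minimal sketch of a sample size estimate based on the standard two-proportion formula. The baseline rate, target rate, significance level and power below are assumptions – swap in your own numbers:

```python
# Rough per-variation sample size for an A/B test (two-proportion z-test).
from statistics import NormalDist

def sample_size_per_variation(p_baseline, p_target, alpha=0.05, power=0.80):
    """Visitors needed in EACH variation to detect p_baseline -> p_target."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_power = NormalDist().inv_cdf(power)
    variance = p_baseline * (1 - p_baseline) + p_target * (1 - p_target)
    return ((z_alpha + z_power) ** 2 * variance) / (p_baseline - p_target) ** 2

# Example: detecting a lift from a 4% to a 5% conversion rate
print(round(sample_size_per_variation(0.04, 0.05)))  # ~6,700 visitors per variation
```

If the page in question only receives a thousand visitors a month, that test would need to run for over a year – exactly the situation tip 2 warns against.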
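
Tip 3’s warning about early leads can be demonstrated with a small, purely illustrative simulation: an ‘A/A test’ in which both variations share an identical 5% conversion rate. Even though there is no real difference, checking for significance after every batch of visitors declares a false ‘winner’ far more often than the nominal 5% error rate suggests:

```python
# Simulating the 'peeking' problem: both variations are identical, yet
# repeatedly checking for significance still produces false winners.
import random
from statistics import NormalDist

def two_sided_p(conv_a, conv_b, n):
    """Two-proportion z-test p-value for two groups of equal size n."""
    pooled = (conv_a + conv_b) / (2 * n)
    se = (2 * pooled * (1 - pooled) / n) ** 0.5
    if se == 0:
        return 1.0
    z = (conv_a - conv_b) / (n * se)
    return 2 * (1 - NormalDist().cdf(abs(z)))

random.seed(1)
RATE, TRIALS = 0.05, 1_000                 # identical 5% rate for both variations
CHECKPOINTS = range(1_000, 10_001, 1_000)  # peek every 1,000 visitors per variation

early_calls = 0
for _ in range(TRIALS):
    a = b = visitors = 0
    for target in CHECKPOINTS:
        while visitors < target:
            a += random.random() < RATE
            b += random.random() < RATE
            visitors += 1
        if two_sided_p(a, b, visitors) < 0.05:  # a tool would call a winner here
            early_calls += 1
            break

print(f"Identical variations declared a 'winner': {early_calls / TRIALS:.0%}")
```

With a single look at the end of the test the false positive rate would sit near 5%; peeking ten times typically pushes it several times higher, which is why an early ‘statistically significant’ lead should not be taken as gospel.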
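
Finally, the compounding behind tip 7’s marginal gains is easy to check. The 4% baseline and the ten separate 1% improvements below are purely illustrative figures:

```python
# Ten independent 1% improvements compound multiplicatively, not additively.
baseline = 0.04                    # assumed 4% baseline conversion rate
after_ten_tweaks = baseline * 1.01 ** 10
print(f"{after_ten_tweaks:.2%}")   # 4.42% – roughly a 10.5% relative lift overall
```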

Conclusion

Making mistakes is fine when it comes to A/B testing. There is still a lot you can learn from a failed hypothesis. A/B testing is not about massaging your own ego or proving you were right; it’s about finding the design and experience that works best for your users.

It’s important that you learn from your mistakes, and by following some of these tips you should start to see more significant results from your tests.

Contact us for more information about our A/B testing and conversion optimisation services.

