Why Run A/A Tests?
A/B testing is an immensely valuable process for making data-driven decisions about everything from web pages to feature releases. A hunch that making the CTA button larger would improve your conversion rate is all well and good, but if you split your userbase into two groups and the group that saw the larger button converted 5% more often, that is a very different (and much better) thing.

But an A/B test can be a complicated process. How can you tell that your testing process is operating properly? This is where A/A tests come in. By running two identical variants through your A/B testing software or other process, you can verify that the testing tool works as expected. With an A/A test, you can answer these questions:

- Are users split according to the percentages you planned?
- Does the data generally look how you expect it to?
- Do your results show no statistically significant difference 95% of the time (or whatever rate your chosen confidence level implies)?
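That last point can be checked directly. Below is a minimal sketch of the idea in Python: it simulates many A/A experiments where both groups have the same true conversion rate, runs a standard two-proportion z-test on each, and counts how often the result comes out "significant" at the 95% level. The sample sizes, conversion rate, and number of trials are illustrative assumptions, not values from any particular tool.

```python
import random
import math

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test p-value for equality of two conversion rates."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (conv_a / n_a - conv_b / n_b) / se
    # two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def run_aa_test(n_users=10_000, true_rate=0.05):
    """Split identical traffic 50/50 and test for a (nonexistent) difference."""
    half = n_users // 2
    group_a = sum(random.random() < true_rate for _ in range(half))
    group_b = sum(random.random() < true_rate for _ in range(half))
    return two_proportion_p_value(group_a, half, group_b, half)

random.seed(42)
trials = 1000
false_positives = sum(run_aa_test() < 0.05 for _ in range(trials))
print(f"'Significant' A/A results: {false_positives / trials:.1%}")  # should hover near 5%
```

If a healthy testing pipeline reports significance much more than about 5% of the time on identical variants, something upstream (randomization, tracking, or the statistics) deserves a closer look.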