Why Your A/B Tests Fail
Today I wrote a blog post on ConversionXL about what we learned from A/B tests run in our A/B testing suite, Convert Experiments. Here's a short quote from that article.
It’s only after collecting and analyzing as much research as possible, and doing some basic hypothesis testing with wireframes, that the agencies get into the actual A/B testing process.
Now, you might not believe it, but there’s a fairly common issue with A/B testing tools: a “blink” (or flicker) occurs while the tool decides which variation to show the visitor, and it can really skew test results.
This is mostly attributed to slow site speed or poor test setup, but it’s an important thing to consider. We have seen that these blinks can lower conversions by up to 18%.
Much of this has to do with how the tool serves the variation (is it served client side or server side?). Certain tools are also vulnerable, allowing competitors to peek at your experiments using this script. Some A/B testing tools that I’m sure don’t have this blink problem are Sitespect, Google Analytics Content Experiments, and Convert.com (ours).
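For context, client-side tools typically work around the blink with an “anti-flicker” guard: hide the page before first paint, reveal it once the variation has been applied, and fall back to a timeout so visitors never stare at a blank screen. Here’s a minimal sketch of that idea; the `onVariationApplied` hook and the 1.5-second cutoff are hypothetical placeholders, not the API of any of the tools named above.

```ts
// Hypothetical cutoff; tune to your site's actual load speed.
const ANTI_FLICKER_TIMEOUT_MS = 1500;

// Hide everything before first paint so the visitor never sees
// the original content flash before the variation swaps in.
document.documentElement.style.opacity = "0";

function revealPage(): void {
  document.documentElement.style.opacity = "";
}

// Fail-safe: reveal the page after the timeout even if the
// testing tool never loads or never applies a variation.
const failSafe = window.setTimeout(revealPage, ANTI_FLICKER_TIMEOUT_MS);

// Hypothetical hook: a client-side tool would call something like
// this once it has decided which variation to render.
function onVariationApplied(): void {
  window.clearTimeout(failSafe);
  revealPage();
}
```

Note the trade-off: the guard removes the blink but adds perceived latency while the page sits hidden, which is exactly why slow site speed makes the problem worse.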
Enjoy the full article on ConversionXL here.