Many of my potential website critique clients ask me about “A/B testing” – more specifically, if they should be doing that instead of getting a critique. 90% of the time my answer is no – a self-serving answer, I suppose, but let’s dig into it a bit.
First off, what IS A/B testing anyway?
Put simply, A/B testing (pronounced “ay bee testing”) is just testing two different website setups to see which performs better. It’s also sometimes called “split testing,” because you use software tools to divide your web traffic in half; one half sees version A of your site, and the other half sees version B of the site, and you track the performance to figure out what works better.
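Under the hood, split-testing tools typically assign each visitor to a bucket deterministically, so the same person sees the same version on every visit. Here's a minimal sketch of that idea (the function name and experiment label are my own illustration, not any particular tool's API):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "buy-button-color") -> str:
    """Deterministically assign a visitor to variant A or B.

    Hashing the visitor ID together with the experiment name gives a
    stable 50/50 split: the same visitor always lands in the same bucket.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# A repeat visitor sees the same version of the page every time.
assert assign_variant("visitor-123") == assign_variant("visitor-123")
```

Real tools add plenty on top of this (cookies, traffic allocation, goal tracking), but the core assignment logic is about this simple.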
A simple example, but common: you want to see what works better at getting people to buy, an orange button or a blue button.
Generally, A/B tests should be as specific as possible, so in my example the color is the only change; perhaps in future tests you try changing the text, or adding a small “secure payment” image next to the button.
So for example’s sake, let’s say you find the orange button gets 1% more signups. You go with orange, and then you move on to A/B test other things, like your page headlines; some sites even A/B test pricing options. It’s all up for grabs once you’ve got software that can run the permutations.
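It's worth knowing how much traffic it takes to trust a result that small. A standard way to check is a two-proportion z-test on the two conversion rates; this is a minimal sketch using made-up numbers, not a recommendation of any particular tool:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 5.00% vs 5.05% conversion (a ~1% relative lift), 10,000 visitors per arm:
z, p = two_proportion_z_test(500, 10_000, 505, 10_000)
# p comes out far above 0.05 -- this much traffic can't tell the buttons apart.
```

In other words, even 20,000 visitors isn't nearly enough to confirm a 1% relative improvement, which is a big part of why these tests take so long.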
The Big Problem with A/B Testing
OK, A/B testing has a lot of problems, most notably:
- It is VERY time consuming and requires a lot of planning and tracking to do it properly.
- It can become quite expensive, so if you are not in a commodity business it starts to seem like a wasted effort, especially in businesses where things can change year-to-year.
However, my big problem with A/B testing is that it is very easy to end up optimizing the wrong thing.
Like in our example above, a client who spends a couple of thousand dollars testing buy buttons to increase their conversion rate by 1% might have been better off working with me on the organizational structure and language of their site, which could have increased the number of people who even reach that buy button by perhaps 10% or 25%.
Everyone is different, so without seeing your current circumstances, I cannot give you specific advice, but in general, about 90% of small and medium sized businesses are not ready for A/B testing. You are not Amazon trying to sell toilet paper, where all the buyer cares about is whether it’s fairly priced and will hold up to the job. You’ve got a special product, a one-of-a-kind service, a unique value proposition that needs to be properly explained and presented to a very thoughtfully selected clientele.
Next time someone tells you that you should start A/B testing, make sure that you’ve done your homework leading up to that test, so you don’t end up optimizing for the wrong thing.