A/B testing and multivariate testing are among the best ways to improve results for your Web site. Only a small percentage of Web site owners invest in this effort, yet there are few things that offer a bigger return on investment. Sometimes the things that matter to users and increase conversions will surprise you.
For example, usability guru Jakob Nielsen once told me that when you highlight a portion of your Web page by putting it in a box of a different color, perhaps with some fancy image in it, users are less likely to look at it because they assume it's an ad. It's an extension of the concept of banner blindness. When users aren't thinking about making an immediate purchase, their reaction to ads is "danger, danger, ad alert, abandon all hope ye who enter here."
Many, many Web site owners use visual emphasis elements to draw the user's eye to the place on the page they want to emphasize. The efforts to do that may be having exactly the opposite effect. This isn't the kind of thing you can know for sure based on your gut instinct, even if you have years of experience in Web site design.
But there's an answer. You test it. You can try a simple A/B test, where one version of your key content is completely blended into the rest of the background of your site, and the other version uses color, styling, and/or images to highlight it. A true A/B test alternates the element it shows to users and is set up so you can track the results.
This means the odd-numbered visitors (first, third, fifth, and so forth) see scenario A. The even-numbered visitors see scenario B. Then you need to mark these scenarios somehow so your analytics package can tabulate data on what happens with the campaign. Once you've run the campaign long enough for the results to be statistically significant, you can then see which one performed better.
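The alternation and tagging described above can be sketched in a few lines. This is a minimal illustration, not a real analytics API; the function names, the campaign label, and the record format are all assumptions.

```python
from itertools import count

# Illustrative sketch: alternate visitors between scenario A and scenario B,
# then tag each assignment so an analytics package can tabulate the results.

_visitor_counter = count(1)  # 1st, 2nd, 3rd visitor, and so forth

def assign_scenario(visitor_number: int) -> str:
    """Odd-numbered visitors see scenario A; even-numbered visitors see B."""
    return "A" if visitor_number % 2 == 1 else "B"

def tag_for_analytics(visitor_number: int) -> dict:
    """Build the record you would attach to the pageview for later tabulation.
    The 'campaign' value is a made-up label for this example."""
    return {
        "visitor": visitor_number,
        "scenario": assign_scenario(visitor_number),
        "campaign": "highlight-box-test",
    }
```

In practice the visitor counter would live in your serving layer (or you'd hash a visitor ID instead of strictly alternating), but the principle is the same: a deterministic split plus a tag your analytics can group by.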
Here you need to be careful to look beyond the immediate surface. For example, if scenario A produces fewer clickthroughs than scenario B, but produces more conversions than scenario B, which scenario is better?
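Before you even weigh clickthroughs against conversions, you need to know whether the difference you observed is real at all. One common way to check is a two-proportion z-test on the conversion rates; the sketch below uses only the standard library, and the visitor and conversion counts are fabricated example data.

```python
import math

# Hedged sketch: a two-proportion z-test for the difference between two
# conversion rates. Not a substitute for a proper stats package, but it
# shows the shape of the calculation.

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) for the difference between
    scenario A's and scenario B's conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: A converted 120 of 2,000 visitors, B converted 150 of 2,000.
z, p = two_proportion_z(120, 2000, 150, 2000)
```

If the p-value is large, the honest conclusion is "keep running the test," not "pick the current leader."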
Another key question: What if the user converts on the next visit to your site, instead of the current one? Did you set yourself up to measure that as well? This latter question becomes important to sites where users are likely to require multiple visits to buy.
Multivariate testing is an extension of A/B testing. As you might expect, this involves testing more than two scenarios at once. One of the big issues you need to manage with multivariate scenarios is making sure you have enough data for the test to be statistically significant.
To quickly understand multivariate testing, if there is one variable with two states, A or B, there are two possible outcomes: A or B, which is 2 to the 1st power. If there are two variables with two states, again A or B, then there are four outcomes: AA, AB, BA, and BB, which is 2 to the 2nd power. So, if you're going to test eight variables at once, that's 2 to the 8th power, and leads to 256 possible scenarios.
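The combinatorics above are easy to verify in code. This sketch simply enumerates every combination of A/B states for a given number of variables:

```python
from itertools import product

# Sketch of the scenario math: n two-state variables yield 2**n scenarios.

def all_scenarios(n_variables: int) -> list[str]:
    """Enumerate every combination of A/B states across the variables.
    For example, 2 variables yield AA, AB, BA, BB."""
    return ["".join(combo) for combo in product("AB", repeat=n_variables)]
```

One variable gives 2 scenarios, two variables give 4, and eight variables give the 256 mentioned above; the list length doubles with each variable you add, which is exactly why sample size becomes the limiting factor.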
There is also a version of multivariate testing that will take eight variables and then test only a subset of the combinations, resulting in fewer than 256 total scenarios. Assuming the variables don't interact, you can take this smaller set of scenarios and extrapolate to determine the best combination of the variables.
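Under the no-interaction assumption, the extrapolation works by estimating each variable's effect independently from the scenarios you did test, then picking the better state for each variable. The sketch below does this with three variables and four tested scenarios (instead of all 2 to the 3rd power, or 8); the conversion rates are fabricated for illustration, and the function name is my own.

```python
# Hedged sketch: extrapolating the best combination from a reduced scenario
# set, assuming the variables do not interact. Each scenario is a string of
# A/B states, one character per variable; values are observed conversion rates.

def best_combination(results: dict[str, float]) -> str:
    """For each variable position, average the conversion rate over scenarios
    where that variable was in state A versus state B, and keep the winner."""
    n = len(next(iter(results)))
    best = []
    for i in range(n):
        rates = {}
        for state in "AB":
            sample = [rate for scen, rate in results.items() if scen[i] == state]
            rates[state] = sum(sample) / len(sample)
        best.append(max(rates, key=rates.get))
    return "".join(best)

# Four tested scenarios (a balanced half of the 8 possible), made-up rates:
tested = {"AAA": 0.050, "ABB": 0.061, "BAB": 0.042, "BBA": 0.055}
```

Here each variable appears in each state exactly twice, so the averages are balanced. As the next paragraph notes, the whole approach stands or falls on the no-interaction assumption.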
This can all be done through outsourced services. The problem: Variables do interact. It may be a mistake to use that assumption in setting up your test, so care is needed when putting that approach into play.
Ultimately the best way for you to approach testing will have a lot to do with your Web site, your marketplace, and the nature of the people coming to your Web site.
The bottom line: Test all the time. Your best guess for a Web site design is still just a guess. You don't -- and can't -- know what your visitors will respond to more favorably. Testing provides the answer.