How a Preview Image Increased a Landing Page's Conversion Rate by 359%

A recent A/B test I ran examined how a preview image on a landing page affects the page’s conversion rate. 

The results of the test were conclusive – the landing page that included a preview image of the offer had a 359 percent higher conversion rate than the landing page without the preview image.

Google’s new “Google Experiments” for A/B testing was used for this test (read more on setting up Google Experiments here).

Here’s an outline of the steps followed to carry out the A/B test.

Following the Rules

My test followed the rules prescribed in "8 Rules of A/B Testing – The Art in Marketing Science".

  • Hypothesis: A landing page with an image will have a higher conversion rate than a landing page with no image.
  • Tested Variable: The existence of a preview image.
  • Success Metric: Conversion rate as measured by the number of form submissions over the number of visits to each page.
  • Volume and Statistical Significance: My test ran for two weeks and accumulated 680 visits to the two pages.
  • Test Group and Split: Since the landing page without the preview image existed before the test, I used it as the control group and ran an uneven test (70/30 split) favoring the control.
  • Randomization: Google Experiments was used to run the test and make sure that the traffic driven to these pages was randomly distributed between the two test pages.
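The success metric and the uneven split above can be sketched in a few lines of Python. The submission count below is illustrative only, not the actual test data:

```python
def conversion_rate(submissions, visits):
    """Success metric: form submissions divided by visits to the page."""
    return submissions / visits

# Illustrative numbers only -- the test accumulated 680 total visits,
# split 70/30 in favor of the control page.
total_visits = 680
control_share, treatment_share = 0.70, 0.30

expected_control_visits = total_visits * control_share      # ~476
expected_treatment_visits = total_visits * treatment_share  # ~204

# Hypothetical example: 34 submissions across 680 visits.
print(conversion_rate(submissions=34, visits=680))  # 0.05
```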

7 Steps to Carrying Out a Landing Page A/B Test

Here's how the test was carried out, along with tips on how you can run something similar. The champion was crowned based on the conversion rates measured on the pages.

1. Create the Variations

[Image: the test variation with the preview image]

When creating the test variations you can “clone” the first variation and then change the one variable you want to test. In this test, I created the variation with the preview image (the variable), then cloned the page and simply deleted the image.

2. Set up Google Experiment

[Image: creating a new Google Experiment]

Next, follow the steps outlined in Google Experiments (now part of your Google Analytics account under the Content menu). The main step in this process is to make sure that the pages are set up and working and to determine the traffic split.

3. Put the Measurements in Place

If you're using Google Experiments to also measure the results, you will need to set up Goals and use those as the success metric. In this test, since I measured conversion rate, I used Optify to determine the winner.

4. “Test the Test”

Before you start driving traffic to the landing pages, make sure the content swap is working and your test is set up correctly.

5. Drive Traffic to Your Pages

With Google Experiments you don’t need to worry about where to send the traffic; Google takes care of the routing. Simply build your campaign to drive as much traffic as you can to the core page and let the software do the rest.

[Image: building a campaign to drive traffic in Google Experiments]

6. Monitor Performance and Decide When to Stop

Over the course of the test you will need to monitor the performance of your variations to make sure neither of them is underperforming badly enough to hurt your users’ experience and damage your brand equity. Use this Excel template to know when your results are statistically significant, so you can stop the test and declare the champion.

7. Declare the Winner

At the end of the test you'll need to declare a “champion,” a variation that performed better based on the success metric you’ve set up at the beginning of the test.

A common mistake with A/B testing is to change the success metric after you see the results to fit your predisposed biases. Using Google Experiments prevents you from doing so because you have to set the success metrics before you begin the test.

With that said, there will be other key performance indicators (KPIs) collected during the test that can still be valuable, so make sure you don’t ignore them.

[Image: landing page results – declaring the winner]

The Treatment Page, the landing page with the preview image, had a conversion rate of 24.8 percent while the Control Page, the landing page without the preview image, had a conversion rate of 5.4 percent. Clearly, the Treatment Page performed better, but is it safe to declare it as the champion?
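The 359 percent figure in the title follows directly from these two rates; a quick check:

```python
control_cr = 0.054    # control page: no preview image
treatment_cr = 0.248  # treatment page: with preview image

# Relative lift of the treatment page over the control page.
lift = (treatment_cr - control_cr) / control_cr
print(f"{lift:.0%}")  # 359%
```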

Statistical Significance

The fact that one variation showed better performance on your success metric doesn’t automatically make it the champion. You need to make sure that the results are statistically significant, and not just due to chance.

You can use Avinash Kaushik’s Excel template to verify your results are statistically significant. I plugged in my test numbers and determined the results are statistically significant.
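If you'd rather not use a spreadsheet, the same check is a standard two-proportion z-test. The conversion counts below are back-calculated from the reported rates and the 70/30 split of 680 visits, so they are approximations, not the raw test data:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Approximate counts: 680 visits split 70/30, rates near 5.4% and 24.8%.
z = two_proportion_z(conv_a=26, n_a=476,   # control: ~5.4%
                     conv_b=51, n_b=204)   # treatment: ~25%
print(z > 1.96)  # True -> significant at the 95% confidence level
```

A z-score above 1.96 corresponds to statistical significance at the 95 percent confidence level, which matches the template's conclusion for these numbers.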

[Image: verifying the results are statistically significant]

We Have a Champion … Now What?

It’s now clear that the treatment page performed much better, supporting my original hypothesis that a landing page with a preview image will have a higher conversion rate. But now what?

First, end the test and stop serving the underperforming page. Once you have a winner, you should apply what you learned from the test to other, similar places on your website. In my case, I should make sure that from now on I include a preview image on all my landing pages.

Next, document your findings and make sure that they are being recorded and implemented in your organization. When your successor takes over, they will be able to build on the expertise you gained rather than having to start all over again.

Finally, you should set up your next test. Testing is a core element of every marketing activity. When one test ends, another one begins.