One often-ignored aspect of landing page testing is coherency. Coherency is an overall sense of your design "hanging together." It's a congruity and harmonious consistency in the relation of all landing page parts to the whole.
It's clear to most Internet surfers within a split second of clicking on a link whether the destination page has coherency. It's also clear when coherency is lacking. Visitors respond to incoherent pages with a variety of gut reactions, and none of them are flattering. In the extreme, such pages can be experienced as tacky, cheesy, bewildering, or obnoxious. Unfortunately, you've probably seen hundreds of such examples.
Low-coherency landing pages affect visitors on an emotional level, and no amount of logic will convince them to linger. That's a pity, especially when the content is relevant to their needs, but visitors can't get past the cognitive dissonance of the inconsistent presentation to focus on the intended message.
Incoherent and unprofessional pages also undermine most consumers' confidence in the product. The visual design of the page is particularly important. This includes consistent color palettes; professional graphics in the same visual style; consistent font sizes, colors, and families; and the amount and layout of the writing.
Coherency is an emergent property of the unified whole. All of the supporting elements that contribute to it must work together. Coherency-related elements should be grouped into a single unified look-and-feel that governs the visual experience of your landing pages.
The need for high coherency is an excellent reason to consider whole-page redesigns (especially in low-traffic environments, where you can only test a few versions of the page). This allows you to fix all known visual problems in one shot. Often, the details of good coherence have already been formally codified in your company's visual design brief document. Much of the actual implementation can be encapsulated in Cascading Style Sheets.
Conversely, if fine-granularity test elements (including individual graphics and buttons) are used, the tester can unwittingly decrease the coherency of the landing page. This happens in two primary ways: mixing and unexpected juxtaposition.
Let's assume all the elements you decide to test on your original page have significant problems. You spend considerable time writing your test plan document and coming up with better alternatives for each of the original elements, and you succeed: each new element is, in isolation, better than its original counterpart. In fact, when they're all collected together in a coherent new whole, they become even more synergistic and powerful.
Unfortunately, if you're running a typical multivariate test, the new elements will be mixed and matched at random with other elements that were part of your original design. When this kind of mixing occurs, the new elements may actually suffer by their combination with poor-quality original elements. They'll be judged by the company they keep, not on their own merits.
In such cases, the new elements will look worse than they really are. They may even seem worse than their original counterparts, because the mixing produced a wider range of quality differences on the page. In other words, the original design elements may have been mediocre, but they were all roughly equally mediocre. Introducing a new element into this mix brings the quality differences into even sharper contrast, making the overall design seem worse.
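To make the mixing concrete, here is a minimal Python sketch of a full-factorial multivariate test over three hypothetical page elements (the element names and variants are illustrative, not from any specific test plan). It shows that most of the randomly assigned combinations pair a new element with original ones, so only one visitor segment ever sees the coherent all-new design:

```python
# A sketch of how a full-factorial multivariate test recombines page
# elements. Element names and variants are hypothetical examples.
from itertools import product

elements = {
    "headline":   ["original headline", "new headline"],
    "hero_image": ["original image", "new image"],
    "cta_button": ["original button", "new button"],
}

# Each visitor is assigned one combination at random.
combinations = list(product(*elements.values()))
all_new = tuple(variants[1] for variants in elements.values())

# Combinations that contain at least one new element but are NOT the
# coherent all-new design: the new elements are "judged by the company
# they keep" in these mixed variants.
mixed = [c for c in combinations
         if c != all_new and any(v.startswith("new") for v in c)]

print(len(combinations))  # 8 combinations in total
print(len(mixed))         # 6 of them mix new and original elements
```

Of the eight combinations, only one is all-original and one is the coherent all-new page; the remaining six expose new elements alongside the weaker originals they were meant to replace.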
Unexpected juxtaposition can be another source of decreased coherency. Since all elements are shuffled randomly in the testing process, it's critical to consider in advance how nearby combinations of elements may appear, and to anticipate potential problems.
For example, let's assume you have a landing page that includes a call-to-action button, long descriptive text, and a second call-to-action button with different text. Your test plan contains alternatives for the last two elements.
In your brainstorming, you decide that since most people won't read the long descriptive text, the alternative is to remove it altogether (a reasonable variant to test). You also figure that having a consistent call to action is important for the strength of your messaging, and decide to test a copy of the first call to action as an alternative to the second one (also a reasonable course of action in isolation).
However, based on this test plan, a quarter of your audience will see a version of the page that includes two back-to-back copies of the first call-to-action button (without any intervening text). Most visitors will think that your landing page is broken, or that your Web designer accidentally inserted an extra copy of the button. In either case, they'll probably have a negative reaction to it.
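The arithmetic behind that quarter can be sketched in Python. The variant labels below are hypothetical stand-ins for the two tested elements (the descriptive text and the second call-to-action button), each with two versions assigned with equal probability:

```python
# Hypothetical two-element test plan from the example above: each
# element has two variants, so visitors split evenly across 2 x 2 = 4 pages.
from itertools import product

descriptive_text = ["long descriptive text", None]            # None = text removed
second_cta       = ["different CTA copy", "copy of first CTA"]

variants = list(product(descriptive_text, second_cta))

# The broken page: text removed AND the second button duplicates the
# first, leaving two identical buttons back to back.
broken = [(text, cta) for text, cta in variants
          if text is None and cta == "copy of first CTA"]

print(len(variants))                # 4 equally weighted variants
print(len(broken) / len(variants))  # 0.25 -> a quarter of visitors see the broken page
```

Each change is reasonable on its own; it is only the one combination of the two that produces the broken-looking page.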
Both changes, the removal of the descriptive text and the repetition of the first call to action in place of the second, will be penalized in the test results. This example was fairly obvious, but this type of test design mistake is common among inexperienced testers.
When you're writing your test plan and deciding what to test, don't ignore the strong potential impact of coherency. Remember, the test isn't just about the individual page elements, but also how well they play together.