The Secret to A/B Testing for Customer Experience

June 18, 2015
Contributor: Heather Pemberton Levy

Use these three approaches to make the most of A/B and multivariate testing to deliver better experiences across key digital platforms.

The matrix of options open to a marketer for improving the online customer experience is truly bewildering – devices, content, navigation, position and even color variations can all impact task completion and visitor behavior. Often, expectations from A/B and multivariate tests are unrealistic, inspired by spurious anecdotes.

Marketers using A/B testing often use an inconsistent process, leading to minimal improvement, according to Martin Kihn, research director for Gartner for Marketing Leaders, and Magnus Revang, research director at Gartner. Improve your success by using three methods to develop an optimal testing program.

Use A/B or multivariate?

First, note that A/B and multivariate tests are not the same thing, though the terms are often used interchangeably. A/B testing simply compares an A version against a B version of a customer experience using two different sets of users. For example, you might show 50% of site visitors a red Buy Now button and the other 50% a green Buy Now button. By comparing a desired user action (such as clicks or purchases) between the two versions, one can be declared the winner. This is the “Champion and Challenger” model of testing.
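To make the mechanics concrete, here is a minimal sketch in Python of how a Champion and Challenger split might work; the hashing approach, function names and strict 50/50 split are illustrative assumptions, not a description of how any particular testing tool is built.

import hashlib

def assign_variant(user_id: str) -> str:
    """Deterministically assign a visitor to the A (champion) or B (challenger) group.

    Hashing the user ID keeps the assignment stable across visits
    and splits traffic roughly 50/50.
    """
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    return "A" if bucket < 50 else "B"

# Tally the desired action (e.g. clicks on the Buy Now button) per variant.
clicks = {"A": 0, "B": 0}
visits = {"A": 0, "B": 0}

def record_visit(user_id: str, clicked: bool) -> None:
    variant = assign_variant(user_id)
    visits[variant] += 1
    if clicked:
        clicks[variant] += 1

# The variant with the higher conversion rate is the candidate champion,
# pending a statistical significance check.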

Multivariate testing involves testing multiple versions of multiple elements at the same time, using matrix algebra to determine the winning combination. Both models use statistical analysis to determine the validity of the results.
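To see why multivariate tests demand more traffic, the short sketch below enumerates every combination of a few hypothetical page elements; the element names and variations are invented for illustration.

from itertools import product

# Hypothetical page elements and the variations under test.
elements = {
    "button_color": ["red", "green"],
    "button_copy": ["Check Availability", "Buy Now"],
    "form_position": ["top", "sidebar"],
}

# A multivariate test evaluates every combination of every element,
# so the number of test cells grows multiplicatively (here 2 x 2 x 2 = 8).
combinations = list(product(*elements.values()))
for combo in combinations:
    print(dict(zip(elements.keys(), combo)))
print(f"{len(combinations)} combinations to test")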

A number of testing tools allow marketers to change content and design on the front end without needing to change code. A marketer can test everything from simple visuals or copy on calls to action such as “Check Availability” versus “Buy Now” to more involved tests such as the placement of navigation elements on a website.

Regardless of the test used, keep in mind these three important approaches when setting up a testing program:

1. Use the scientific method

Anthropologists and astrophysicists use rigorous testing protocols to ensure they have a clear line of sight into what they are trying to prove or disprove. They simply have too many variables and too little time and money to not be hyper-focused. Digital tests should be no different.

  • Start with a clear question or goal and formulate a hypothesis.
  • Run the test, unchanged, until a statistically significant number of users have been exposed.
  • Test different versions at the same time to control for external variables.

So the simple goal of “We want more newsletter signups” can lead to the hypothesis “People aren’t seeing the sign-up form” or “If the form were more prominent, more people would sign up.” Now you have focus on what to test (variations of sign-up form visibility) and clarity on what success looks like (more sign-ups).
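To make “statistically significant” concrete for this newsletter example, here is a minimal sketch of a two-proportion z-test; the visitor and sign-up counts are invented for illustration, not drawn from any real test.

from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the two-sided p-value comparing sign-up rates of two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Illustrative numbers only: 10,000 visitors per variant,
# 300 sign-ups with the current form vs. 360 with the more prominent one.
p_value = two_proportion_z_test(300, 10_000, 360, 10_000)
print(f"p-value: {p_value:.4f}")  # a value below 0.05 would typically be called significant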

2. Keep tests manageable

For content and visual tests, stack the odds in your favor by limiting the number of variables tested and maximizing the size of the audiences involved. Several factors affect the time it takes to reach significance, including the number of visitors, the conversion rate (how many people are doing what you want them to do), the number of variations tested and the statistical confidence required for the test. To avoid long testing cycles, keep tests small and manageable.
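As a rough sketch of how those factors trade off, the standard sample-size estimate below assumes 95% confidence and 80% power (both illustrative choices) and shows how quickly the required traffic grows as the lift you want to detect shrinks.

from math import ceil, sqrt

def visitors_per_variant(baseline_rate: float, lift: float,
                         z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Rough sample size per variant for 95% confidence and 80% power.

    baseline_rate: current conversion rate (e.g. 0.03 for 3%)
    lift: absolute improvement you want to detect (e.g. 0.006 for +0.6 points)
    """
    p1, p2 = baseline_rate, baseline_rate + lift
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / lift ** 2)

# Smaller lifts need far more traffic to detect:
print(visitors_per_variant(0.03, 0.006))  # roughly 14,000 visitors per variant
print(visitors_per_variant(0.03, 0.003))  # roughly four times as many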

3. Create a process to implement winners

Once you have found a new champion from testing, work with developers and stakeholders to quickly implement the changes in production. After all, if you found a better design that converts 5% more visitors, wouldn’t you want it implemented right away? Prompt implementation also informs future tests and prevents performance issues from affecting future testing.

Running frequent, manageable tests in a disciplined manner, each aimed at proving or disproving a specific hypothesis, and quickly rolling out the winners will help marketers get the most out of their digital assets.
