Blog Post - Christopher Longman, Mar 24 2015

AB Testing: how to optimise your website

Why test…

Within many companies there are many opinions on how the website should look, what the functional purpose of a link or button should be, what the user journey should look like and, critically, the statement that ‘whatever is released MUST have a positive impact on sales’.

But how do you know what will work best? Testing takes away the guessing game: you no longer have to go on gut feel. You can test different versions against multiple goals on a small volume of traffic, so the risk is reduced.

Deciding what to test…

Firstly decide on:

  • What metrics are relevant to test against?
  • What does success look like for your business?
  • What does success for the test look like?

Sales and conversion aren’t always the right metrics to test against. At Salmon we have run tests to understand whether changing the text of a button increased click-through rate (CTR) as well as conversion. This allowed us to separate changes that improved sales from changes that improved the user journey.

Analyse the data you already have. What is going to have the biggest impact on your KPIs? Homepage, lister pages, product pages and checkout pages tend to have the biggest impact on sales. If you don’t run a sales website, what is your equivalent of a product page? If only 0.5% of your users see a page and only 0.001% of conversions on the site are influenced by it, that page is not a high priority. However, if you know that 50% of users view a product page but only 0.5% of those users convert, that page becomes a high priority: half your audience sees it and there is plenty of headroom to improve.
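To make that prioritisation concrete, here is a minimal sketch of how you might score pages by opportunity. It is illustrative only: every figure, including the site-wide conversion target, is a made-up assumption.

// Illustrative sketch only: score pages by how many users they touch and
// how far their conversion rate falls short of a target. All figures,
// including the target, are hypothetical.
interface PageStats {
  name: string;
  trafficShare: number;    // fraction of users who see the page
  conversionRate: number;  // fraction of those users who convert
}

const pages: PageStats[] = [
  { name: "rarely-seen page", trafficShare: 0.005, conversionRate: 0.0001 },
  { name: "product page",     trafficShare: 0.5,   conversionRate: 0.005 },
];

const targetConversion = 0.02; // assumed site-wide target

const ranked = [...pages].sort((a, b) =>
  b.trafficShare * Math.max(0, targetConversion - b.conversionRate) -
  a.trafficShare * Math.max(0, targetConversion - a.conversionRate)
);

console.log(ranked.map(p => p.name)); // ["product page", "rarely-seen page"]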

Analysing data in this way might show, for example, that sorting and size are the most important filters to concentrate on, while colour is less important.

Come up with hypotheses about what users are trying to do and what matters most to them. The internet is a fast-moving environment where people expect to find information quickly and easily. If a process is too complex or a message is unclear, you increase the chance that a person disengages from your brand before they have even started a purchase.

Meet with the business stakeholders with a pre-defined agenda. Start with the data, then gather ideas from the key stakeholders (insight/analytics, content, design, user experience, development, management) on how to solve the problem the data has highlighted. Some problems can be discussed until you’re blue in the face, so make sure solutions are being offered rather than more problems. Keep an eye on the time: if you have four items on the agenda and only an hour to discuss possible solutions, each item gets just 15 minutes.

How to test…

You will need an AB testing tool. Web analytics tools are good at telling you how much is happening, but not at quantifying whether the changes you’ve made have had a positive or negative impact. There are a number of AB testing tools on the market with different offerings and price ranges. All of them require some code (a JavaScript snippet) to be added to the site, and some need more work than others to make the most of their features.
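For illustration, the snippet is usually a small loader added to every page. This sketch shows the general shape only; the CDN URL is a placeholder, not a real vendor endpoint.

// Hypothetical sketch of how a testing tool's snippet gets onto the page.
const snippet = document.createElement("script");
snippet.src = "https://cdn.example-ab-tool.com/snippet.js"; // placeholder URL
snippet.async = true; // avoid blocking the page load
document.head.appendChild(snippet);

In practice many tools recommend loading their snippet synchronously at the top of the head, trading a little page speed to avoid visitors briefly seeing the original content before the variation is applied.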

Agree on the designs that will be tested and develop previews of the test in the AB testing tool. As an example, one of our clients tested two different button designs against each other (the winner is revealed at the end of this post).
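Under the hood, a variation is typically a small script the tool runs for visitors assigned to the variant. As a simplified sketch (the #add-to-basket selector and the new copy and colour are hypothetical):

// Simplified variation sketch: split visitors 50/50, then restyle the
// button for the variant group. Real tools persist the assignment (e.g.
// in a cookie) so a returning visitor always sees the same version.
const inVariant = Math.random() < 0.5;

if (inVariant) {
  const btn = document.querySelector<HTMLButtonElement>("#add-to-basket"); // hypothetical selector
  if (btn) {
    btn.textContent = "Add to basket";           // alternative copy under test
    btn.style.backgroundColor = "rebeccapurple"; // alternative colour under test
  }
}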

Decide how much traffic will be included in the test. This will depend on:

a) how big a risk the variation(s) pose

b) how quickly you want to see results.

If the answer to a) is that the risk is low and the answer to b) is that you want results quickly, then you’ll want 90-95% of traffic viewing the variant.
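A back-of-the-envelope calculation shows why the traffic share matters. This sketch uses the standard two-sample size formula at roughly 95% confidence and 80% power; the baseline conversion rate, uplift and traffic figures are all assumptions.

// Rough sketch: visitors needed per variant to detect a given relative
// uplift. All input figures below are assumptions for illustration.
function sampleSizePerVariant(baseline: number, relativeUplift: number): number {
  const delta = baseline * relativeUplift;  // absolute difference to detect
  const variance = baseline * (1 - baseline);
  const zFactor = 7.85;                     // (1.96 + 0.84)^2: ~95% confidence, ~80% power
  return Math.ceil((2 * zFactor * variance) / (delta * delta));
}

const perVariant = sampleSizePerVariant(0.005, 0.10); // 0.5% baseline, 10% uplift
const dailyVisitors = 20000;                          // hypothetical traffic
const testShare = 0.95;                               // 95% of traffic in the test
const days = Math.ceil((perVariant * 2) / (dailyVisitors * testShare));
console.log(`${perVariant} visitors per variant, roughly ${days} days`);

With a low baseline conversion rate, even a test taking most of the site’s traffic needs weeks of data, which is worth knowing before anyone asks why the results aren’t in yet.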

Push the test live and monitor.

Results…

When a test first goes live (within the first 10 days or so) there can appear to be a negative impact. This is not unusual and is often simply down to insufficient data. Take the confidence levels reported by the tool into account before making a knee-jerk reaction and switching the test off. Most tests take at least two weeks to return a significant result.
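The confidence level your tool reports boils down to a significance test on the two conversion rates. Here is a hedged sketch of a two-proportion z-test; the counts are made up.

// Sketch of a two-proportion z-test: is the variant's conversion rate
// genuinely different from the control's? All counts below are made up.
function zScore(convControl: number, nControl: number,
                convVariant: number, nVariant: number): number {
  const pControl = convControl / nControl;
  const pVariant = convVariant / nVariant;
  const pooled = (convControl + convVariant) / (nControl + nVariant);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / nControl + 1 / nVariant));
  return (pVariant - pControl) / se;
}

const z = zScore(120, 24000, 150, 24000); // hypothetical ten-day totals
console.log(`z = ${z.toFixed(2)}`);       // |z| > 1.96 means ~95% confidence

In this made-up example the variant shows a 25% relative uplift, yet z comes out around 1.8, short of 95% confidence: exactly the situation where switching the test off early would be a mistake.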

If a positive result is shown, push the winning variant to 100% of traffic while development happens in the background to roll the change into a release and make it permanent.

Consider releasing tests for different segments, e.g. platforms (desktop, tablet, mobile, other), once the original test has completed. These follow-up tests can deliver marginal gains as you segment the audience further.
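A segmented follow-up test is the same mechanism gated on an audience condition. A simplified sketch follows; real tools offer proper audience targeting, and the user-agent check here is illustrative only.

// Simplified segmentation sketch: only enrol mobile visitors in this
// follow-up test.
const isMobile = /Mobi|Android/i.test(navigator.userAgent); // crude check, for illustration

if (isMobile) {
  console.log("enrol this visitor in the mobile-only test"); // placeholder for enrolment
}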

Salmon actively supports clients to make the most of their testing strategy. Our Insight team is on hand to help you select the tool that best meets your needs and budget. We have expertise in delivering an optimisation strategy involving AB testing and can provide best practice advice.

Get in touch today to find out how you could benefit from AB testing.

PS: And if you’re wondering which of the button options was more successful, it was the purple one!