
A/B and multivariate testing for success: Part II

Nathan Cocks
  8 minute read

A/B testing is a key part of optimising your conversion rate and website performance, but how exactly do you do it?

In Part I we explained what testing is, why it’s important and the tools you need to get started. So if you haven’t read that, it’s worth going back and starting there.

Here in Part II we’re going to explore the ins and outs of how to go about testing your content.

Big vs small changes – Which is best?

One of the biggest mistakes people make when starting on their testing journey is thinking too small.

Unless you know for a fact your content is performing exceptionally well, you should start with A/B experiments that test drastic changes to your website.

Starting out by testing drastic changes has two key advantages:

  • Testing big changes helps you quickly establish a strong conversion foundation.
    Save subtle changes for polishing strong conversion foundations. When you are starting out, you want to work out what a strong foundation looks like as quickly as possible. Don't be afraid to shake things up in your test designs; it could lead to conversion performance you hadn't previously thought possible.
  • Tests with drastic changes resolve faster than those with subtle changes.
    The amount of data (i.e. visitors) you need before you can be confident your test results will hold in a live environment is primarily determined by the size of the performance difference between the test variation and the original page design. All other things being equal, the bigger the performance change, the less data you need to reach statistical significance. To illustrate, assume we want to run two separate tests on a landing page with a conversion rate of 2%.

    The first test is a total overhaul of the page content and design, while the second is far less drastic, only changing some key copy throughout the page.

    We run both tests until we reach 95% statistical confidence in the outcome and get the following results:

    Test one:
    Base conversion rate: 2.00%
    Test variation conversion rate: 3.00%
    Performance increase: 50.00%

    Test two:

    Base conversion rate: 2.00%
    Test variation conversion rate: 2.40%
    Performance increase: 20.00%

    But how much traffic did we need to reach 95% confidence for each test? The maths of statistical confidence and sample size gives an answer that may surprise you.

    Test one:
    2,900 visitors

    Test two:
    20,000 visitors

    Despite the first test delivering two and a half times the lift of the second, the second test would require almost seven times as much traffic before you could be as confident in the outcome as the first.

    Small changes need more time in testing. In the time it takes to be confident in the results of testing a small change, you could have tested bigger changes several times and achieved more significant results.
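The sample-size effect described above can be sketched with the standard two-proportion formula. This is an illustrative calculation only: the exact visitor counts depend on assumptions the article doesn't state (statistical power, one- vs two-sided test, the calculator used), so the figures it produces won't exactly match the 2,900 and 20,000 quoted above, but the ratio between the two tests comes out in the same ballpark.

```python
import math

def sample_size_per_variant(p1, p2, alpha_z=1.96, power_z=0.8416):
    """Approximate visitors needed per variant to detect a shift in
    conversion rate from p1 to p2 (two-sided 95% confidence, 80% power)."""
    pbar = (p1 + p2) / 2
    numerator = (alpha_z * math.sqrt(2 * pbar * (1 - pbar))
                 + power_z * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

big = sample_size_per_variant(0.02, 0.03)     # total overhaul: 2.0% -> 3.0%
small = sample_size_per_variant(0.02, 0.024)  # copy tweak:     2.0% -> 2.4%
print(big, small, round(small / big, 1))
```

Whatever the exact assumptions, the shape of the result is the same: halving the lift you are trying to detect roughly quadruples the traffic you need, because required sample size scales with the inverse square of the difference.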


It bears repeating: the above is not a criticism of multivariate testing, or even of testing subtle changes. But these activities should only happen once you are confident your content foundations are already performing well. Don't waste time testing for marginal improvements until you have exhausted your options for finding major ones. Start by testing big changes with A/B testing and, once you feel your foundations are right, move on to testing more subtle changes or even multivariate testing if your traffic levels allow for it.

How to conduct a website test: Step by step

Conducting an A/B Test

1. Identify Your Goal
Determine the specific goal of your A/B test. Are you trying to increase clicks, conversions, or engagement? Having a clear goal will guide your testing process. In addition, it helps you focus on testing changes that are more likely to create the kind of performance improvement you are hoping for.


2. Define your audience
Who are you trying to improve performance for? It may be for everyone who accesses the page you want to test, or you may be specifically focused on a subset of users such as past purchasers, or people coming from a specific marketing campaign. Most testing platforms these days include a variety of targeting tools to help you target your testing appropriately.
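Testing platforms also need to make sure each visitor in your chosen audience sees the same variation every time they return, or your results will be muddied. They typically do this by bucketing visitors deterministically. A minimal sketch of the idea, with hypothetical identifiers, assuming a hash of the user ID and experiment name decides the bucket:

```python
import hashlib

def assign_variation(user_id, experiment, variations=("control", "variant_b")):
    """Deterministically bucket a visitor so the same user always
    sees the same variation for a given experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variations[int(digest, 16) % len(variations)]

# The same visitor gets the same variation on every visit
print(assign_variation("visitor-123", "homepage-redesign"))
print(assign_variation("visitor-123", "homepage-redesign"))
```

Because the hash is effectively uniform, traffic splits roughly evenly across variations without the platform having to store an assignment for every visitor.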


3. Create your variation(s)
Now you know both who you are targeting and what you want them to do, it is time to design your test variations. Design variations that you believe will support your identified goal(s). Don't fall into the trap of testing things just because they represent a change. Testing isn't just about making changes, it is about making changes that benefit your business. Don't test anything unless you can make a convincing argument for why it may perform better than what you already have. Anything else is just wasting your time.


4. Implement and run the test
This is the easy part. Implement your test within your favoured test platform and leave it to run until it hits statistical significance. Depending on the specifics of the test you are running and the amount of traffic it is exposed to, this can happen in a matter of hours, days, weeks or even months. Again, when starting out it is best to test big changes so you can more quickly identify improvements and start seeing benefits sooner rather than later.
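Your platform will report statistical significance for you, but it is worth seeing what sits behind that number. A minimal sketch of the underlying check, a two-proportion z-test, using made-up counts in the same range as the earlier example:

```python
import math

def significance(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion
    rates; returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical result: 40/2000 (2.0%) on the original vs 60/2000 (3.0%) on the variation
z, p = significance(40, 2000, 60, 2000)
print(round(z, 2), round(p, 4))
```

A p-value below 0.05 corresponds to the 95% confidence threshold used in the examples above; until the test reaches that point, the observed difference could plausibly be noise.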


5. Implement the Winner
If one version outperforms the original, implement the winning version as the new standard.


6. Learn and Iterate
Use the insights gained from the test to inform future optimisation efforts. Continue testing and refining to achieve ongoing improvements.

Conducting a multivariate test

Conducting a multivariate test works in almost exactly the same way as an A/B test. The difference comes in how you define your test variations.

When defining your test variations with a multivariate test you don't take a whole-of-page approach. Instead, you identify the areas of the page you believe have an impact on conversion and create test variations of those individual elements.

Your multivariate testing platform will then mix and match all the different element variations on your behalf.

As with A/B testing, it’s important to ensure you’re testing changes you believe will have a positive benefit on your site. It's arguable that it’s even more important with multivariate testing because it frequently takes much longer to get a result. There are few things worse than running a test for months only to find no improvements were made.
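The reason multivariate tests take longer is that the number of page versions grows multiplicatively with each element you vary, and every version needs enough traffic on its own. A quick illustration with hypothetical element counts:

```python
import math

# Hypothetical multivariate test: each element's variation count
element_variations = {"headline": 3, "cta_button": 2, "hero_image": 2}

# The platform tests every combination of elements
combinations = math.prod(element_variations.values())
print(combinations)  # 3 x 2 x 2 = 12 distinct page versions
```

Twelve versions splitting the same traffic means each one accumulates data twelve times more slowly than a single page would, which is why multivariate testing is usually reserved for high-traffic pages.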

What should I test?

The most important parts of a page to test are dependent on a range of factors including your test goals, the purpose of the page you are testing, and its current design.

However, the following ideas will help you get started whether you are conducting an A/B test or a multivariate test. Keep in mind that when conducting an A/B test (at least to start) you should aim to test drastically different designs, which will likely involve changes to a number of different page elements. Of course, if you want help making sure you are testing changes with a high chance of success, don't hesitate to reach out to us.

Headlines

  • Shorter, succinct headlines vs longer more descriptive headlines
  • Headline vs no headline
  • Different headline styles
  • Headline position
  • Subheadings vs no subheadings
  • Switching headline vs subheadline order


Call to action buttons

  • CTA button text (e.g. "Add to cart" vs "Buy now")
  • CTA button style (size, colour, shape)
  • CTA position (top of the page, after introductory text, etc.)
  • Including two or more CTA buttons vs one


Page design

  • Light vs dark schemes
  • Navigation position
  • Font size, type and colour
  • High contrast vs low contrast


Text

  • Long vs short
  • Bullet points vs longer form text
  • Removing less essential text vs keeping it in


Social proofs

  • Review ratings vs testimonials
  • Text testimonials vs videos


Media

  • Video vs static images vs audio
  • Test different media against each other (e.g. if you have two videos that could work on your target page, test which one is most effective)

Conclusion

Remember, the key to effective A/B testing lies in setting clear goals and designing test variations that are likely to meet them. So, whether your business is new to the online space or you have been around for a long time, A/B and multivariate testing have the potential to revolutionise your online performance.

Let's work together