A/B Testing Industrialization: Guidelines

72%* of companies consider User Experience optimization a priority. To put theory into practice, we need a robust methodology. Let's walk through the main steps of these A/B testing guidelines.

STEP 1 – PERFORMANCE ANALYSIS

Discovering what really matters is the first gate to effective and efficient tests.

FIND YOUR PEOPLE

Before taking any action, we need to identify a Project Leader for the testing implementation and reporting process, as well as Data Owners for the technical examination and testing setup.

Tips for starters:

  • Integrate the reporting process with your Performance Steering / Business committee.

DETERMINE YOUR OBJECTIVES

It is essential to define business objectives, KPI baselines and targets before auditing website performance, so that we can identify the real problems and areas for improvement:

  • What could be the business objectives?

Lead Acquisition, Lead Conversion etc.

  • What could be the goals of my website?

To acquire qualified traffic, to convert traffic into leads etc.

  • What could be the KPIs?

Volume of Visits, Volume of Leads, Conversion Rate (CR), Cost Per Lead etc.

  • What could be the targets?

  • A 20% increase in volume; a 5% improvement in CR, etc.

Tips for starters:

  • Focus on one main goal; being distracted by several goals can lead to low efficiency.

SETUP YOUR ANALYSIS

To begin with, we need to determine the analysis approach and the corresponding analysis tools. There are two main analysis approaches we can leverage:

  • Web Analytics Analysis

Web analytics analysis helps us quickly identify the problematic points of the customer path or conversion funnel across the website by looking at conversion-oriented metrics. For instance, with Google Analytics we can visualize the conversion funnel of each page, and by looking at the entrance and exit rates of the principal pages we know where to explore further.

  • Mouse Tracking Analysis

– Mouse tracking analysis leverages behavior-oriented data to find the blocks/elements on specific pages that could be improved.

– Use simple tools (e.g. Hotjar) to record sessions and visualize users' browsing patterns with heatmaps, to spot UX problems directly.

– Also use form analysis to improve online form completion rates by discovering which fields take too long to fill, which are left blank, and why visitors abandon the form and the page.

After selecting the proper analysis approach (one of the above, or a combination of the two), we should conduct a pre-analysis technical examination and check the tracking tags to make sure the website is in a healthy enough condition to run the tests.

Tips for starters:

  • Run a technical analysis on every page: cross-browser, cross-device, speed tests, etc.
  • Check whether the tracking tags are correctly implemented: one tag from Google Analytics and one from the heatmap tool (a quick automated check is sketched below).
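
As a quick illustration of that check-up, here is a minimal Python sketch that fetches a few pages and looks for the tracking snippets. The page URLs and the marker strings are assumptions; adapt them to your own site and to the exact snippets your tools provide.

```python
# Minimal sketch: check whether tracking snippets are present in each page's HTML.
# The page URLs and marker strings below are illustrative assumptions.
from urllib.request import urlopen

PAGES = [
    "https://www.example.com/",          # hypothetical pages to audit
    "https://www.example.com/contact",
]

MARKERS = {
    "Google Analytics": ("gtag(", "google-analytics.com", "googletagmanager.com"),
    "Heatmap tool (e.g. Hotjar)": ("hotjar",),
}

for url in PAGES:
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    for tool, needles in MARKERS.items():
        found = any(needle in html for needle in needles)
        print(f"{url}: {tool} {'found' if found else 'MISSING'}")
```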

ANALYSE YOUR PERFORMANCE

Once the analysis environment is set up, allow 1 or 2 weeks (depending on your audience volume) to collect data. Then construct performance reports based on scope (site-wide, page-level or even CTA-level), conversion funnels, etc. to explore potential areas of improvement; a minimal funnel report is sketched below.
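
To make the funnel part concrete, here is a toy sketch of a funnel report; the step names and visit counts are placeholder numbers you would export from your analytics tool.

```python
# Toy funnel report: compute step-to-step drop-off and overall conversion rate.
# The steps and visit counts are placeholder data, not real figures.
funnel = [
    ("Landing page",   10_000),
    ("Product page",    4_200),
    ("Contact form",      900),
    ("Thank-you page",    310),
]

for (step, visits), (_, next_visits) in zip(funnel, funnel[1:]):
    drop_off = 1 - next_visits / visits
    print(f"{step:<15} -> drop-off {drop_off:.0%}")

overall_cr = funnel[-1][1] / funnel[0][1]
print(f"Overall conversion rate: {overall_cr:.1%}")
```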

Tips for starters:

  • Segment the reporting by browser, device (desktop/mobile), and product (each with its own conversion funnel).

STEP 2 – TEST PRIORITIZATION

With limited resources, prioritize all the potential tests and choose the most impactful ones.

WHERE TO TEST

First of all, we need to prioritize the page(s) to test on, using our analysis results from STEP 1. Keep in mind two principles, illustrated by the prioritization model example below:

  • Prioritize pages with high traffic significance: Most-visited pages, top landing (entry) pages, pages with expensive visits
  • Prioritize pages with high potential for conversion improvement: Top exit pages, pages with high funnel drop-off rates

Page Prioritization Model

Click Here to use the free Page Prioritization Model (Tab – Page Prioritization).
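
To show the idea behind such a model, here is a hypothetical scoring sketch that mirrors the two principles above (traffic significance and improvement potential). The pages, the 1-5 scores and the weights are made-up assumptions, not the exact formula of the linked spreadsheet.

```python
# Hypothetical page prioritization score: weighted sum of 1-5 scores for
# traffic significance, % exit and funnel drop-off. All numbers are made up.
pages = [
    # (page, traffic score, % exit score, drop-off score), each scored 1-5
    ("/metiers/marketing-digital", 5, 4, 4),
    ("/contact",                   3, 2, 3),
    ("/thank-you",                 2, 1, 1),  # high raw exit, but scored low on purpose (see tip below)
]

WEIGHTS = (0.4, 0.3, 0.3)  # assumed weights for traffic, exit, drop-off

def priority(scores):
    return sum(w * s for w, s in zip(WEIGHTS, scores))

for page, *scores in sorted(pages, key=lambda p: priority(p[1:]), reverse=True):
    print(f"{page:<30} priority score {priority(scores):.1f}")
```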

Tips for starters:

  • Separate the page prioritization process by device: Desktop / Mobile.
  • Pay attention to the category of each page when scoring the KPIs: for example, a high % Exit on a “thank-you” page (normally a sign of high improvement potential) should be scored low rather than high, since that page is already the end of the conversion funnel.

WHAT COULD BE TESTED

After prioritizing the page(s) on which to run the tests, the next step is to brainstorm and list all the test hypotheses, for example: “adding a contact form at the bottom of page /metiers/marketing-digital will increase the CR from visits to leads”.

WHAT TO TEST

After listing all the test hypotheses, it is time to evaluate and prioritize them. Here is a helpful scoring system to rank all the possible test hypotheses in terms of their:

  • Noticeability: is the change above the fold? Noticeable within 5 seconds?
  • Impact on performance: is the test running on a high-traffic page with a high exit rate? Is it likely to increase the CR?
  • Analysis support: is the before/after analysis of the test supported by web analytics or mouse tracking data?
  • Practicality: can the test be implemented with reasonable effort?
Test Prioritization Model

Click Here to use the free Test Prioritization Model (Tab – Test Page 1).

The test hypotheses with the highest total scores will be run first; a toy scoring example follows below.
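
As a toy illustration, the same kind of scoring can be applied to the hypotheses themselves; the hypotheses and the 1-5 scores below are invented for the example.

```python
# Illustrative test-hypothesis ranking: score each hypothesis 1-5 on the four
# criteria above (noticeability, impact, analysis support, practicality).
hypotheses = {
    "Add contact form at bottom of /metiers/marketing-digital": (5, 5, 4, 3),
    "Change the colour of the main CTA button":                 (4, 2, 3, 5),
    "Rewrite the page headline":                                (5, 3, 3, 4),
}

for name, scores in sorted(hypotheses.items(), key=lambda kv: sum(kv[1]), reverse=True):
    print(f"{sum(scores):>2}  {name}")
```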

Tips for starters:

  • Limit the number of A/B tests (whether on the same page or on different pages) to two per testing period (around 3 weeks).
  • To use the Prioritization Model: the first tab is for Page Prioritization, before choosing the detailed tests; then create a separate tab for each chosen page for Test Prioritization (e.g. tab Test Page 1).
  • Communicate with the people responsible for testing operations, analytics, and technical set-up to make sure the tests are runnable.

STEP 3 – TESTING

Good test set-up and planning make a good test.

SELECT YOUR TEST TYPE

The first question to ask ourselves: should we do A/B/N Split Testing or Multivariate (MVT) Testing?

Here are some criteria which might help to choose between A/B Split Testing and Multivariate Testing:

  • Overall objectives:

– To test a partial change of the website, it is better to use A/B/N Split Testing

– To test a novel interface with combined element changes, use Multivariate (MVT) Testing

  • Traffic volume: Multivariate Testing is only applicable to high-traffic sites (applicable for AXA HK, with ~1,000 daily unique visits); a rough duration comparison follows below
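
A rough way to feel the difference is to compare how long each test type needs to collect a decent sample per variation. The per-variation sample size and the daily traffic figure below are illustrative assumptions.

```python
# Rough duration comparison: an MVT multiplies the number of variations,
# so it needs proportionally more traffic than a simple A/B split.
def days_needed(n_variations, sample_per_variation=2_000, daily_visits=1_000):
    # assumed: every variation needs ~2,000 visitors; ~1,000 unique visits/day
    return n_variations * sample_per_variation / daily_visits

print(f"A/B test (2 versions):        ~{days_needed(2):.0f} days")
print(f"MVT test (2*2*2 = 8 versions): ~{days_needed(8):.0f} days")
```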

CHOOSE YOUR TESTING TOOL

Knowing which type of test to implement, it is also essential to choose a testing tool; here are some recommended third-party A/B testing tools:

  • Visual Website Optimizer
  • AB Tasty
  • Adobe Target
  • Google Analytics (requires separate pages/URLs; no built-in function for creating variations)

SETUP YOUR TESTS

In general, the testing process consists of splitting website traffic between the different versions, measuring and comparing their performance, then adopting the best-performing version; the sketch below illustrates how traffic can be split deterministically.
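
Conceptually, the split has to be deterministic so that a returning visitor always sees the same version. Here is a minimal sketch of that idea (not the implementation of any particular tool):

```python
# Deterministic traffic split: hash a stable visitor id into a bucket so the
# same visitor always gets the same variation. Purely illustrative.
import hashlib

def assign_variation(visitor_id: str, ratio: float = 0.5) -> str:
    bucket = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16) % 10_000
    return "control" if bucket < ratio * 10_000 else "variation_b"

print(assign_variation("visitor-42"))  # same id -> same version on every visit
```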

To set up testing campaign on the chosen tool, we could go step by step:

  1. Select test campaign type (take A/B Testing as example)
  2. Enter test page URL (the page selected in previous step)
  3. Create variation versions (based on the tests prioritized)
  4. Add conversion goals (to track visits, clicks, form submissions etc.)
  5. Estimate duration of the campaign and allocate traffic
  6. Add the automatically generated tracking code to all pages of your website

Tips for starters:

  • Estimate the test duration and allocate traffic based on the traffic volume and the number of variations, so that each variation gets a large enough sample: a 50/50% allocation for 3 weeks is suggested (see the sample-size sketch below).
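
To sanity-check that suggestion, here is a hedged sample-size sketch using the standard two-proportion formula (95% confidence, 80% power); the baseline conversion rate, the expected uplift and the daily traffic are placeholder assumptions.

```python
# Sample-size and duration estimate for a 50/50 A/B split.
# Baseline CR, target CR and daily visits are illustrative assumptions.
from math import sqrt, ceil

def sample_size_per_variation(p1, p2, z_alpha=1.96, z_beta=0.84):
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

baseline_cr, target_cr, daily_visits = 0.03, 0.039, 1_000  # +30% relative uplift
n = sample_size_per_variation(baseline_cr, target_cr)
days = ceil(2 * n / daily_visits)  # two variations share the traffic 50/50
print(f"~{n} visitors per variation, i.e. roughly {days} days of testing")
```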

MEASURE YOUR TEST

To measure the results of the tests, there are at least three analyses to conduct during the test:

  • Technical check-up analysis (in the first days)
  • Mid-test analysis (to catch a sharp drop in performance caused by a failing variation)
  • “Sunset” analysis (to draw the conclusion; a minimal significance check is sketched below)
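
For the “sunset” analysis, a simple significance check can back up the conclusion. Here is an illustrative two-proportion z-test on made-up final numbers:

```python
# Illustrative "sunset" check: two-proportion z-test comparing the variation's
# conversion rate with the control's. All counts below are made up.
from math import sqrt, erf

def z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    p_a, p_b = conversions_a / visitors_a, conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

z, p = z_test(conversions_a=280, visitors_a=9_800, conversions_b=355, visitors_b=9_750)
print(f"z = {z:.2f}, p-value = {p:.3f} ->", "significant" if p < 0.05 else "not significant")
```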

CASE STUDY

A case study from Hyundai:

  • What was tested?

– New (SEO friendly) text versus control text
– Extra CTA buttons versus no extra buttons
– Large photo of the car versus thumbnails

  • A total of 8 combinations (3 sections, 2 variations each = 2*2*2) were generated for this multivariate test;
  • The winning variations increased conversion rate (request for test drive or brochure) by 62% from this multivariate test.
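
For readers curious how the eight combinations arise, here is a tiny sketch that crosses the three sections; the section and variant labels simply mirror the case study description.

```python
# Enumerate the 2*2*2 = 8 combinations of the multivariate test.
from itertools import product

sections = {
    "text":  ["control text", "SEO-friendly text"],
    "CTA":   ["no extra buttons", "extra CTA buttons"],
    "photo": ["thumbnails", "large car photo"],
}

combinations = list(product(*sections.values()))
print(len(combinations), "combinations")  # -> 8
for combo in combinations:
    print(" / ".join(combo))
```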

STEP 4 – INDUSTRIALIZATION

Learn from your results and start over the process.

ONCE IT’S DONE

With the “sunset” analysis of the tests, we draw conclusions and roll out the better-performing version. Here is a report example for test performance analysis:

report example for test performance analysis

ONCE AND FOR ALL

To industrialize the A/B testing effort, it is necessary to construct an operational timeline covering all the steps, from status-quo analysis, test prioritization and test setup through to test results reporting and decision-making, on a monthly basis:

  • Integrating the test-specific performance reporting with the Step 1 Performance Analysis, we can structure a monthly report consisting of:

Test performance analysis: a detailed “sunset” report on test efficiency and effectiveness, as demonstrated above

Website performance analysis: a general comparison of website/page performance before and after the tests

  • We also recommend an adaptable model as the operational timeline for all involved tasks across all steps:

A/B Testing Operational Timeline

Click Here to use the free A/B Testing Operational Timeline example.

Tips for starters:

  • Tools setup:

– Start at least 1 week before the timeline for the first-time set-up of both the analysis tools and the testing tool;
– Communicate in advance with the website developers about the tracking code/tagging;
– Set up each subsequent test period in the testing tool in line with the suggested timeline.

  • Adapt the timeline if necessary according to traffic conditions (e.g. if the traffic volume is exceptionally low, extend the testing period)

For more information, check on our website: Axys Consultants

Wenqian WANG (Digital & Analytics Consultant at Axys Consultants)