
A/B Testing - Website Experiments

Website Testing Using JavaScript

This guide covers how to run experiments (A/B tests) on your website using the MVMCloud Analytics JavaScript Tracker.

You will learn how to incorporate the A/B testing framework into your website, how to implement your experiments following best practices, and what to do when an experiment is complete.

Creating an experiment

Read the A/B testing guide to learn more about creating an A/B testing experiment.

Incorporating the A/B Testing JavaScript Framework

When you install the MVMCloud Analytics JavaScript tracking code on your website and subsequently create an A/B test, the generated code is automatically incorporated into your website.

Loading Tracking Sooner

By default, the MVMCloud Analytics tracking code that you install on your website is loaded asynchronously in the user's browser. This prevents the code from interfering with the page loading speed. However, when you are performing an A/B test, the test should start as quickly as possible to prevent any flickering of content from delaying it. To avoid this, the tracking code needs to be loaded synchronously during testing; the file that is loaded is tracker.js. To do this, modify the embedded code on your website as follows:

  • Move the MVMCloud Analytics tracking code that loads the tracker.js file into your site's <head> tag, if it isn't already there;
  • Load the file synchronously instead of asynchronously, by:
  • Removing the two lines that contain:
var d=document, g=d.createElement('script'), s=d.getElementsByTagName('script')[0];
g.async=true; g.src=u+'js/tracker.js'; s.parentNode.insertBefore(g,s);
  • Adding the following line after the closing </script> element:
<script type="text/javascript" src="//url-do-mvmcloud-analytics/js/tracker.js"></script>
  • Your tracking code will look like this:
<!-- MVMCloud Analytics2 -->
<script>
   var _paq = window._paq = window._paq || [];
   /* tracker methods like "setCustomDimension" should be called before "trackPageView" */
   _paq.push(['trackPageView']);
   _paq.push(['enableLinkTracking']);
   (function() {
     var u="//url-do-mvmcloud-analytics/";
     _paq.push(['setTrackerUrl', u+'js/tracker.php']);
      _paq.push(['setSiteId', 'your-site-ID']);
   })();
</script>
<script type="text/javascript" src="//url-do-mvmcloud-analytics/js/tracker.js"></script>
<!-- End MVMCloud Analytics2 Code -->

Incorporating an experiment

When you create an experiment in MVMCloud Analytics, the JavaScript code that will run your experiment is generated for you. You need to embed this code on the pages of your website that are part of the experiment. The code usually looks like this:

<!-- MVMCloud Analytics2 A/B Test -->
<script type="text/javascript">
     var _paq = _paq || [];
     _paq.push(['AbTesting::create', {
         name: 'Landing-page-test', // you can also use '2' (experiment ID) to hide the name
         percentage: 100,
         includedTargets: [{"attribute":"url","inverted":"0","type":"any","value":""}],
         excludedTargets: [],
         variations: [
             {
                 name: 'original',
                 activate: function (event) {
                     // normally nothing needs to be done here
                 }
             },
             {
                 name: 'Variation1', // you can also use '2' (variation ID) to hide the name
                 activate: function(event) {
                     // eg $('#btn').attr('style', 'color: ' + this.name + ';');
                 }
             }
         ],
         trigger: function () {
             return true; // here you can further customize which of your visitors will participate in this experiment
         }
     }]);
</script>
<!-- End MVMCloud Analytics2 A/B Test -->

In this example, an experiment is created using the _paq.push method, and several properties of the experiment are defined. This experiment code is generated for you in MVMCloud Analytics. For better understanding, here is an explanation of what these properties mean:

  • name- The name of the experiment as configured in MVMCloud Analytics. If you prefer not to expose your experiment name to your users in the DOM, you can also use the experiment ID. You can find the ID of an experiment in the list of all experiments in MVMCloud Analytics;
  • includedTargets - Specifies which landing pages the experiment should be activated on. For an experiment to be activated, all rules must match (logical AND) and none of the excluded targets can match;
  • excludedTargets - Specifies which landing pages the experiment should not be activated on. If any of the given rules match (logical OR), the experiment will not be activated even if all included targets match;
  • variations - The list of different variations you want to compare. Experiments are not restricted to just two variations (A/B); you can compare several;

Note that you cannot simply add more variations to this JavaScript code. When MVMCloud Analytics receives a tracking request for an experiment, it only accepts preconfigured variations. To define more experiment variations or change an existing variation, edit your experiment in MVMCloud Analytics.

Optional experiment properties

There are more properties that can be configured when you create an experiment in MVMCloud Analytics. These properties are optional:

var _paq = window._paq = window._paq || [];
_paq.push(['AbTesting::create', {
   // [...]
   percentage: 100,
   startDateTime: '2017/08/25 00:00:00 UTC',
   endDateTime: '2020/05/21 23:59:59 UTC',
   trigger: function () {
       if (isLoggedIn && userAge < 50) {
           return true;
       }
       return false;
   },
   variations: [
         // [...]
         {
             name: 'VariationA',
             percentage: 40,
             activate: function(event) {}
         }
     ]
}]);
  • percentage- A percentage of how many users should participate in this experiment. By default, 100% of your users will participate in your experiment and see the original version or any of its variations;
  • startDateTime- If configured, the experiment will not be activated until the specified start time;
  • endDateTime- If configured, the experiment will no longer be activated after the specified end time;
  • trigger - The trigger function allows you to further restrict which visitors will participate in your experiment. For example, if you want to run the experiment only for visitors from a specific country, or enable it only on a certain type of page, you can use this method to customize who participates (see the sketch after this list);
  • variation.percentage - By default, each variation gets the same amount of traffic, but you can allocate more or less traffic to individual variations. You don't need to set a percentage on every variation. If a percentage is specified for only some variations, all other variations will share the remaining percentage equally. For example, if you specify that VariationA should get 40%, the original version and VariationB will share the remaining 60% and will be seen by 30% of your traffic each. We recommend not assigning more than 100% across all of your variations.
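
Here is a minimal sketch of such a custom trigger; the restriction to a '/checkout' path is a hypothetical example, not part of the generated code:

var _paq = window._paq = window._paq || [];
_paq.push(['AbTesting::create', {
   // [...] the rest of the experiment configuration generated by MVMCloud Analytics
   trigger: function () {
       // hypothetical rule: only visitors on checkout pages participate
       return window.location.pathname.indexOf('/checkout') === 0;
   }
}]);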

Implementing an experiment

To summarize what we've learned so far:

  • To avoid content flickering, the tracker.js file should be loaded synchronously as early as possible;
  • The experiment code generated by MVMCloud Analytics must be copied and pasted into the website;

Now you need to actually implement what should happen when a variation of your experiment goes live. All you need to do is implement the activate method for each of your variations.

Example: testing different button colors

For example, if you want to compare buttons of different colors, you can implement the activate method as follows:

variations: [{
   name: 'blue',
   activate: function(event) {
       document.getElementById('btn').style.color = '#0000ff';
   }
},
{
   name: 'red',
   activate: function(event) {
       document.getElementById('btn').style.color = '#ff0000';
   }
}]

About the activate method

Inside the activate method, the this context variable is bound to your variation. This means you can access your variation's name via this.name.

An event is passed to the activate method which allows, for example:

  • Access your experiment instance via event.experiment;
  • Redirect users via event.redirect(url);
  • Define a function that should be executed as soon as the DOM is ready via event.onReady(callback);
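
For example, a variation can redirect its participants to an alternative page via event.redirect. A minimal sketch (the variation name and target URL are hypothetical):

variations: [{
   name: 'new-landing-page',
   activate: function (event) {
       // send everyone in this variation to the alternative landing page
       event.redirect('https://www.example.org/landing-v2');
   }
}]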

If you access the DOM using jQuery or another library, make sure that library is already loaded when the experiment is launched.

Test Variations

Testing variations can be tricky because a variation is chosen at random and you will then always see that same variation. To test a specific variation, you can append a ?pk_ab_test=$variationName URL parameter. This will force the activation of the given variation, even if the experiment would not normally be triggered due to a defined filter. It also won't track any experiment activations in your MVMCloud Analytics, so your data is kept clean.

If you are running multiple tests on the same page, you can enable multiple variations by specifying the variation names separated by commas: ?pk_ab_test=$variationName1,$variationName2.
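
For example, to force the blue variation from the button-color example above, you would open a URL such as (the domain is hypothetical):

https://www.example.org/?pk_ab_test=blue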

Track a goal manually

When comparing different variations, it is often necessary to track goals to decide which variation is the most successful. When setting up your experiment, you can assign multiple goals as a "success metric". These goals are usually tracked automatically without you having to do anything, but you can also track a goal conversion manually like this:

variations: [{
   name: 'blue',
   activate: function(event) {
       var button = document.getElementById('btn');
       button.style.color = '#0000ff';
       button.addEventListener('click', function () {
           var idGoal = 5;
           event.experiment.trackGoal(idGoal);
       });
   }
}]

Impact of ITP (Intelligent Tracking Protection)

MVMCloud Analytics stores the selected variation in local storage to remember which variation was activated for a specific visitor. Since Safari 13.1, Safari deletes all locally stored data after seven days. This means that if a visitor does not visit your website for seven days, the activated variation will no longer be remembered, and on their next visit a new variation will be selected at random. If the visitor returns to your site within seven days, we try to extend the lifetime by another seven days. We do not expect this behavior to bias or invalidate the results.

If you want to exclude Safari when running A/B tests, you can add the following code to your tracking code:

var _paq = window._paq = window._paq || [];
_paq.push(['AbTesting::disableWhenItp']);

Be sure to call this method before defining any experiments.

Avoiding Content Oscillations

When comparing, for example, different button colors to see which color converts best, you may run into a problem where the original color is shown for a few milliseconds before it changes to the variation's color. This is known as flickering or flashing of content.

To avoid this flickering, it's important to place the experiment tracking code in the correct position in your site's source code.

Single Page Applications

Single-page websites and Progressive Web Apps have become a standard in recent years. Tracking these websites and applications correctly is crucial, as you need to ensure the data you measure is meaningful and correct.

Track new page views and ensure A/B testing works

The challenge starts when you need to track a new page view. A single-page application differs from a regular website in that there are no regular full page loads, so MVMCloud Analytics cannot automatically detect when a new page is viewed. This means you need to tell MVMCloud Analytics whenever the URL and page title change and embed the A/B test code again. You can do this using the setCustomUrl, setDocumentTitle and AbTesting::create methods like this:

window.addEventListener('pathchange', function() {
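     // Note: 'pathchange' is not a built-in browser event; this sketch assumes
     // your application dispatches such a custom event on every route change.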
     var _paq = window._paq = window._paq || [];
     _paq.push(['setCustomUrl', window.location.pathname]);
     _paq.push(['setDocumentTitle', document.title]);
     _paq.push(['AbTesting::create', {
         name: 'theExperimentName',
         includedTargets: [{"attribute":"url","type":"starts_with","value":"http:\/\/www.example.org","inverted":"0"}],
         excludedTargets: [],
         variations: [
             {
                 name: 'original',
                 activate: function (event) {
                     // normally nothing needs to be done here
                 }
             },
             {
                 name: 'blue',
                 activate: function(event) {
                     // eg $('#btn').attr('style', 'color: ' + this.name + ';');
                 }
             }
         ]
     }]);
     _paq.push(['trackPageView']);
});

Load tracking code synchronously

As mentioned earlier in this guide, it is highly recommended to load the tracker.js file synchronously in the HTML <head>.

Decide where to place the experiment code

We recommend pasting the experiment's JavaScript code right after the HTML element you want to change.
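
A minimal sketch of this placement (the button and its id are hypothetical):

<button id="btn">Buy now</button>
<!-- experiment code placed immediately after the element it changes -->
<script type="text/javascript">
   var _paq = window._paq = window._paq || [];
   _paq.push(['AbTesting::create', {
       // [...] the experiment configuration generated by MVMCloud Analytics
   }]);
</script>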

If you cannot place JavaScript code in the middle of your HTML, you can also place the code directly in the <head> HTML element and perform the actual change once the DOM is ready:

{
   name: 'blue',
   activate: function(event) {
       var variation = this; // capture the variation; `this` may not point to it inside the callback
       event.onReady(function () {
           document.getElementById('btn').style.color = variation.name;
       });
   }
}

Using the onReady DOM event can be problematic, as there may be other registered DOM-ready handlers that run before your experiment is activated. If you use this event, be sure to place the experiment code as high as possible in the <head> HTML element so that your callback is registered first.

Consolidate your CSS changes

When you are changing many CSS styles, it is recommended to apply them all at once, like this:

{
   name: 'blue',
   activate: function(event) {
     document.getElementById('btn').style.cssText = 'color: blue; font-size: 15px;';
   }
}

Alternatively, you can use CSS classes to change multiple CSS styles at once:

{
   name: 'blue',
   activate: function(event) {
      document.getElementById('btn').className += ' myClass'; // note the leading space before the class name
   }
}
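
In modern browsers you could alternatively use document.getElementById('btn').classList.add('myClass'), which appends the class without you having to manage the separating space yourself.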

Use Vanilla JavaScript instead of jQuery or other libraries

If possible, try to use native JavaScript code in your variations. Using document.getElementById('btn') will be faster than jQuery('#btn'). If you need to support older browsers, we recommend testing that variations are enabled correctly in those browsers.
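
For illustration, here are the two equivalent calls side by side (assuming an element with id btn, and that jQuery is already loaded for the second one):

// vanilla JavaScript: no library required
document.getElementById('btn').style.color = '#0000ff';

// jQuery equivalent: only works once jQuery itself has been loaded
jQuery('#btn').css('color', '#0000ff');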

Remove your experiment code from a tag manager

If you use a tag manager, we recommend moving the experiment code out of it and pasting the code directly into your site. If you can't move it out of the tag manager, make sure your experiment is configured to load synchronously there.

Arrange the order of your experiments

If you run multiple experiments on your site, it can be useful to generate the JavaScript code first for experiments that affect elements above the fold (the first part of the page a user sees without scrolling). This ensures that experiments affecting content above the fold run first and that the code for experiments not visible above the fold runs later.

Hide the <body> element until the variations are executed

If you experience persistent content flickering that you cannot resolve any other way, consider hiding the entire <body> until your experiment runs. This is usually not necessary, but it can be a solution if the other options don't help.
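
A minimal sketch of this approach (assuming the experiment code runs in the <head>; the CSS rule and the button id are hypothetical):

<style type="text/css">body { visibility: hidden; }</style>
<script type="text/javascript">
   var _paq = window._paq = window._paq || [];
   _paq.push(['AbTesting::create', {
       // [...] the experiment configuration generated by MVMCloud Analytics
       variations: [{
           name: 'blue',
           activate: function (event) {
               event.onReady(function () {
                   document.getElementById('btn').style.color = '#0000ff';
                   // reveal the page only after the variation has been applied
                   document.body.style.visibility = 'visible';
               });
           }
       }]
   }]);
</script>

Remember to unhide the <body> in every variation, including the original, or the page will remain hidden for those visitors.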

Finishing an experiment

When an experiment is completed:

  • Remove the experiment code from your site to ensure that your visitors are no longer entered into the experiment. This is recommended even if you have scheduled an end date.
  • If your experiment proved that one of your variations performed significantly better than the original version, you probably want to change your website or app and permanently implement the winning variation.

Happy experimenting!