Step-by-Step Guide to A/B Testing with Google Analytics Experiments

Written by Himanshu Suri

A/B testing, a very familiar term, isn't it? Today, I'll take you through a step-by-step guide to A/B testing with Google Analytics, which is absolutely free to use. You heard me right, it's FREE!

Marketers keep searching for the best A/B testing tools in the industry, but I can confidently bet on Google Analytics Experiments as one of the best free tools available for A/B testing landing pages. *Mind it, I said A/B testing, not multivariate testing.

Step-by-Step Guide to Split Testing

Most newbies struggle to find their way to set up Content Experiments in Analytics. You can easily jump to Behaviour > Experiments > Create New Experiment in the Google Analytics interface.

Google has combined the two terms, A/B testing and split testing, into one feature known as Content Experiments, so don't get yourself confused here.

So let's start by typing in the "Name for this Experiment". Add a descriptive name to the experiment so that you can identify it later. Your experiment can test a sign-up form, a product-selling button, or maybe a completely new landing page as the second version.

Now comes a very important move: you need to select the "Objective for this experiment". Here you define the metric that you will use to evaluate your test results, i.e., to choose a winner. Your metric can be an e-commerce metric (revenue, transactions, etc.), a site-usage metric (sessions, bounce rate, etc.), or one of your goals.

After setting up the objective, you need to click on Advanced Options.

The "Distribute traffic evenly across all variants" option splits the traffic evenly across all variants. You can toggle this to "ON" to distribute even traffic across all versions. If the option is set to "OFF", Google Analytics dynamically adjusts traffic according to variation performance.
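To build some intuition for what this toggle does, here's a rough JavaScript sketch. This is an illustration only, not Google's actual algorithm; when the option is OFF, Analytics uses a more sophisticated multi-armed bandit approach, which the simple epsilon-greedy function below only approximates:

```javascript
// Even distribution ON: every variation has an equal chance of being shown.
function pickEven(numVariations) {
  return Math.floor(Math.random() * numVariations);
}

// Even distribution OFF (hypothetical epsilon-greedy sketch): mostly show
// the best-performing variation so far, but keep exploring the others a
// fraction (epsilon) of the time.
function pickAdaptive(conversions, visits, epsilon) {
  if (Math.random() < epsilon) {
    // Explore: pick any variation at random.
    return Math.floor(Math.random() * visits.length);
  }
  // Exploit: pick the variation with the best conversion rate so far.
  var best = 0;
  for (var i = 1; i < visits.length; i++) {
    var rate = visits[i] ? conversions[i] / visits[i] : 0;
    var bestRate = visits[best] ? conversions[best] / visits[best] : 0;
    if (rate > bestRate) best = i;
  }
  return best;
}
```

The practical takeaway: even distribution gives you cleaner, easier-to-interpret results, while the adaptive mode sends fewer visitors to losing variations.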

Setting the minimum time option lets you set the time period for which the experiment runs, i.e., the life of an experiment. This can be set to 3 days, a week, or 2 weeks at maximum.

Followed by this, you can set the confidence threshold. This option lets you choose a minimum confidence level that must be reached before a winner is declared. It's better to set the threshold to the maximum available value so that you can be confident the original and the new version competed well against each other.
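Analytics does all of this statistics for you, but if you're curious what a confidence threshold means under the hood, here's a simplified JavaScript sketch using a two-proportion z-test. This is an illustration of the general idea, not the exact method Analytics uses:

```javascript
// Simplified sketch: how far apart are two conversion rates, in standard
// errors? A larger |z| means more confidence the difference is real.
function zScore(convA, visitsA, convB, visitsB) {
  var pA = convA / visitsA;
  var pB = convB / visitsB;
  // Pooled conversion rate across both variations.
  var pPool = (convA + convB) / (visitsA + visitsB);
  var se = Math.sqrt(pPool * (1 - pPool) * (1 / visitsA + 1 / visitsB));
  return (pB - pA) / se;
}

// |z| above ~1.96 corresponds to roughly 95% confidence that the
// difference between the two variations is not due to chance.
```

For example, 100 conversions out of 1,000 visits versus 150 out of 1,000 gives a z-score well above 1.96, so a 95% threshold would be cleared; a higher threshold demands an even larger gap (or more data) before a winner is called.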

There you go, you're doing great! Now comes adding the original page and the new version of the page. Enter the URL of the original page and of the new page (or pages), and give each a name.

Once you are done with all the changes, hit the Next Step option at the bottom. You will now see a screen that says "Manually Insert the Code" or "Send the code to the webmaster".

If you opt for manually inserting the code, Analytics provides you with an experiment code that needs to go into the head section of the original page, immediately after the opening head tag. The variation pages don't need the experiment code, but every page in the experiment, original and variations alike, must carry your regular Analytics tracking code.
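As a rough sketch of where everything lands on the original page (placeholder comments stand in for the generated snippets, which Analytics produces for you):

```html
<head>
  <!-- Experiment code from Analytics: paste it here, immediately after
       the opening <head> tag of the ORIGINAL page only. -->
  <!-- ...experiment snippet generated by Analytics... -->

  <!-- Regular Analytics tracking code: required on the original page
       AND on every variation page. -->
  <!-- ...tracking snippet generated by Analytics... -->
</head>
```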

At the end, a validation takes place to check whether the codes were placed correctly. Once validated, you get the green signal to run the experiment! Cheers! :)

About the author

Himanshu Suri

Himanshu Suri is a digital marketing consultant with expertise in online marketing and web analytics. He has worked as a consultant for a lot of agencies, private companies, and a few public figures.