What is an AB Test
The AB Test is one of the tools traditionally associated with CRO work, along with qualitative data such as Heat Maps, Session Recordings and other data generated by CRO software such as HotJar, MouseFlow, Yandex Metrica or Lucky Orange.
Although they are often equated with CRO itself, AB tests are just one part of conversion optimization work: specifically, they belong to a testing phase that, within the scientific process behind CRO, comes after other phases such as:
- The definition of the objective (e.g. increase the conversion rate);
- the collection of quantitative data (traditionally through Google Analytics) and qualitative data (with the aforementioned tools);
- the formulation of hypotheses about what to test, chosen on the basis of a prioritization framework.
Here is a diagram summarizing the steps of a CRO process, of which AB Testing is one part:
Once these steps are complete, we reach the testing phase. AB tests are the best-known method, but not the only one: they allow us to compare the performance of a changed element on a Landing Page or ecommerce product page against the original, so-called control version, by sending the same amount of traffic to both versions.
In this complete guide to AB Tests we will look at this tool in detail, starting with the distinction between the various types, such as the AB Test and the Multivariate Test.
AB Test, Split Test, Multivariate Test (MVT)
An AB test allows you to test, with equal traffic, 2 alternative versions of a page, where A is the control version and B is the version with a variation. The outcome of an A/B Test can be one of the following:
- the variant generates a statistically significant uplift
- the variant generates a statistically significant worsening
- the test ends with no significant differences between variants
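To make the "equal traffic" idea concrete, here is a minimal sketch, in Python, of how a testing tool might assign each visitor to the control (A) or the variant (B) so that roughly half of the traffic sees each version. The function and experiment names are hypothetical; real tools implement assignment in their own way, so treat this as an illustration only.

```python
import hashlib

def assign_variant(visitor_id: str, experiment_id: str = "landing-headline") -> str:
    """Deterministically assign a visitor to 'A' (control) or 'B' (variant).

    Hashing visitor + experiment IDs gives a stable, roughly uniform 50/50 split,
    so the same visitor always sees the same version (illustrative sketch).
    """
    digest = hashlib.md5(f"{experiment_id}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100      # number between 0 and 99
    return "A" if bucket < 50 else "B"  # ~50% of traffic to each version

# Quick check: simulate 10,000 visitors and verify the split is close to 50/50
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"visitor-{i}")] += 1
print(counts)
```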
Split Test
Although AB tests are the most popular, there are other forms of testing:
- A/B/n to test more than two variants against each other.
- Split URL test: this is similar to the A/B test. The terms are often used interchangeably, but they are not the same: a split URL test compares the performance of distinct URLs, so it is an A/B test whose variants are entire pages rather than elements on the same page. It is useful for testing a completely new design.
Multivariate Testing (MVT)
In an A/B test a single, specific change is tested, whereas in multivariate testing (MVT) a group of elements on a page is tested: two or more elements, each with two or more variants.
Multivariate testing allows you to test multiple elements at once, for example a new headline and a new CTA:
- Variant A: Headline control + CTA control
- Variant B: Headline test + CTA test
- Variant C: Headline test + CTA control
- Variant D: Headline control + CTA test
Unlike an AB test, when you want to test different elements on the same page you can use a multivariate test.
With a multivariate test you don't test 2 different versions of a page, as in an AB test, but different versions of elements within a single page.
A multivariate test is then used to understand which elements of a page play the biggest role in helping you achieve your goals.
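To see how quickly a multivariate test grows, here is a small sketch, with hypothetical element names, that enumerates every combination of element variants, as in the Headline/CTA example above: 2 elements with 2 variants each already produce 4 page versions, and every extra element multiplies that number.

```python
from itertools import product

# Hypothetical elements under test, each with a control and a test variant
elements = {
    "headline": ["Headline control", "Headline test"],
    "cta":      ["CTA control", "CTA test"],
}

# Every combination of element variants becomes one page version to test:
# 2 elements x 2 variants = 4 combinations; a third element would double that.
for i, combo in enumerate(product(*elements.values()), start=1):
    print(f"Combination {i}: " + " + ".join(combo))
```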
What to test with AB Test in a Landing Page
A frequently asked question is which element to test with an AB test: the answer should really come only after quantitative and qualitative data have been collected, but in general the elements traditionally tested with an AB test (or similar methods) are the following.
Let's review them with some AB Tests.
Headline
Numerous statistics confirm the importance of the Headline on a Landing Page or homepage, not only in encouraging the visitor to keep reading but also (and perhaps precisely for this reason) in making them understand within 5 seconds (a timing often used in usability tests) what we do and what our strength is.
The headline is the key element of a landing page. To understand how important it is to test it, consider that statistics show that:
- 8 out of 10 people will read your headline, but only 2 out of 10 will read the rest of the article;
- an effective headline can increase the effectiveness of your content by up to 500%;
- a headline can increase a site's conversion rate by as much as 40%.
You can find more information on how to effectively write a headline in this article: Headline: 12 Techniques to Increase Clicks, Shares and Conversions by 500%
The Splitbase Case: +27% Conversion Rate
In this AB Test, based on the high bounce rate of the product page, user behavioral analysis results showing that users do not pass the Above The Fold, and survey results suggesting trust issues, SplitBase hypothesized that working on the Above The Fold by adding a headline that clarified the benefits and addressed the trust concerns of visitors could reduce the Bounce Rate, thereby increasing product sales.
The AB Tests that followed generated a +27% increase in the conversion rate and a +13% increase in add-to-carts.
The Highrise case: +30% conversion rate
A well-known case history of an AB test applied to the headline comes from the Highrise software, which tested 2 different versions of its headline and found that "30-day free trial for all accounts" generated a 30% increase in signups compared to the original "Open a Highrise Account":
The CareLogger case: +31% conversion rate
Another example of applying an AB Test to a headline comes to us from CareLogger, who increased their conversion rate by 31% simply by changing their headline:
The reason? Probably, as we will see below with another example, the shift in focus from disease ("Diabetes") to "Optimal Health".
Subtitle
The subtitle, in a landing page, takes the problem exposed in the headline and anticipates the solution.
An AB test on the subtitle of our landing page can allow us to understand which one performs better. The proof comes once again from Highrise: the same headline, with a different subtitle, allowed an increase in the rate of registrations from 27% to 30%.
They simply changed the subtitle from "Pay as you go. No long term contracts. No hidden fees. No surprises" to "Signup takes less than 60 seconds. Pick a plan to get started". How can we explain this improvement?
Probably by switching from a subtitle that expresses negatives to a subtitle that expresses benefits, with a clear call to action.
Placing the user's attention on potential problems (long-term contracts, hidden fees, unwelcome surprises) in the subtitle, even to deny them, may have had a worse effect than shifting the focus to a clear benefit (you sign up in less than 60 seconds) and an equally clear call-to-action (pick a plan and get started).
CTA (Call To Action)
The call to action is an element whose modification can have a significant impact on conversions, not only because of its location Above The Fold, i.e. in the area most visible to the user, but also because of its role in prompting the user to convert.
What is typically tested in a CTA? The answer, of course, depends on what emerges from the data and hypotheses, but traditionally tests can concern:
- the text;
- the color;
- the shape;
- the size;
- the position on the page.
From the point of view of shape and color, remember the importance of applying a well-known neuromarketing principle, the deliberate inconsistency (contrast) of shape and color, which allows the CTA to stand out on the page.
BannerStack: +25% conversion rate
BannerStack, based on data from Heat Maps, noticed that people were interacting with secondary page elements more than with the main CTA: based on this information, a larger, higher-contrast CTA button was implemented, increasing signups by 25%.
Soocial: from 14.5% to 18.6% conversion rate
Can adding a single word increase the conversion rate of a call to action by 28%? It would seem so, at least according to the AB Test performed by Soocial: simply adding "It's free" allowed them to go from a 14.5% to an 18.6% conversion rate (a relative increase of about 28%).
Carelogger: +34% conversion rate
You don't have to test the text of your call to action to get benefits: you can even just change the color of the button, as shown by CareLogger, which improved its conversion rate by 34% simply by testing a green button against a red one:
Performable: +21% CTR
A similar result comes from Performable, which, testing 2 versions of the call to action button, saw a 21% increase in CTR with the red button:
What does this latest AB Test teach us? That red is always good?
No! It teaches us not to rely on assumptions: the color red is traditionally associated with danger and error. Only by testing these 2 versions without prejudice, with a split test, was Performable able to find the best performing version.
Length of a Landing Page
Does a short or long landing page convert more? It depends. One of the ways to understand this is with an AB Test.
It is usually recommended to use a landing page whose length is proportional to the complexity of the offer we propose:
Despite this suggestion, a split test can help us understand the optimal length for our page.
Moz managed to increase subscriptions to its software, worth up to $1 million more per year, by working on changes that involved, among other things, the length of the sales page:
Hero Shot or Hero Images
The hero shot is the representative image of the product or service: if this element shows a person, for example a face, numerous AB tests have highlighted the importance of gaze direction.
Basically, orienting the gaze in the hero image toward an important page element, such as the Headline or the CTA, helps direct the user's eyes toward that element, as you can see from this AB test:
Elements of Trust
Adding Social Proof or credibility elements to the page can reduce user FUDs (fears, uncertainties, doubts), increasing the chances of conversion.
If the data led me to hypothesize a trust problem, I could then test some Trust elements, such as adding badges (secure site) in ecommerce, reviews, and logos of reputable client companies.
Form
The form, whether the contact form of a Landing Page or the registration form of an ecommerce, is one of the elements that determine macro or micro conversions: an AB test can, for example, validate the hypothesis that too many fields are correlated with a lower conversion rate, or that the user simply doesn't understand what to write. In this regard I refer you to the guide on Form Analytics and the 10 Strategies (+ 3 Tools) to Improve Your Contact Form (and Get + Customers).
What can you test in a contact form? Here are a few ideas.
Test the number of fields you require
Statistics confirm that the more information you ask people to fill out, the fewer people will fill it out.
You could test for example:
- 2 different versions of your contact form, one with more and one with fewer fields to fill out;
- a split contact form, i.e. divided into 2 steps: according to R. Cialdini's persuasive principle of Commitment and Consistency, when people start a task, they tend to complete it. In this regard, I recommend reading my article: Cialdini's 6 Weapons of Persuasion (and how to use them in Web Marketing).
Neil Patel, testing 2 versions of his contact form, one with 3 and one with 4 fields, managed to increase the conversion rate of his contact form by 27% simply by removing the "online revenue" field:
Test the position of your contact form
Where do we put the contact form on a landing page?
Top right, strictly above the fold, right?
An AB Test performed by Content Verve brought an increase in conversions of 304% by placing the contact form at the bottom of the page:
Other types of AB Test
AB test in Email marketing
Although technically AB tests refer to tests performed on website pages, an AB test in email marketing is a strategy that allows you to test 2 different versions of subject lines, pre-headers or email copy to understand which one performs better.
These tests can be performed, for example, in software such as SendinBlue. Like classic AB tests, they require statistical significance, in this case given by the sample of emails tested, but in any case they can provide data-driven suggestions to optimize email marketing KPIs.
What is the element that most affects whether the recipient of your email opens it and reads it or trashes it directly?
The subject line and the sender (along with the preheader), which are the 2 main fields that the recipient sees when they receive an email.
Okay, but what is the subject line that works best, i.e. that gets the newsletters you send opened the most? There are various techniques for improving the subject line of your emails and increasing open rates (I recommend reading my article on this subject: 11 Strategies for Your Newsletters: A Guide to Email Marketing), but what's the best way to find the subject line that gets the most opens? Test it with an AB test.
How to do a split test with Mail Chimp
Yes, you can do an AB Test with any email marketing software, such as SendinBlue. Let's see how to do it with Mail Chimp in 4 steps.
Step 1
First, you decide which element to test 2 or more versions of:
- the email subject line;
- the sender;
- the content;
- the send time.
Step 2
Once you've chosen what you want to test, you decide how large the test group that will receive versions A and B is: by default Mail Chimp recommends sending to 50% of your list, which means 25% of your contacts receive version A and another 25% receive version B.
Step 3
Mail Chimp will determine the winning version after a number of hours, 4 by default, but you can decide how long to wait: Zapier recommends waiting up to 4-5 days before determining the winner of your AB Test.
Step 4
Once the winning version is determined, e.g. the subject line that generated the most opens, Mail Chimp will send it to the remaining group (the other 50%, in this example).
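The four steps above boil down to a simple split of the contact list. The sketch below is not Mail Chimp's actual code or API, just an illustration of the default 50% test size: a quarter of the contacts receive version A, a quarter receive version B, and once a winner is picked it goes to the remaining half.

```python
import random

def split_for_ab_test(contacts: list[str], test_share: float = 0.5):
    """Split a contact list for an email AB test (illustrative sketch).

    With test_share=0.5, half of the list is used for the test: one quarter
    receives version A, one quarter receives version B, and the remaining
    half waits for the winning version.
    """
    shuffled = random.sample(contacts, k=len(contacts))  # random order, original list untouched
    test_size = int(len(shuffled) * test_share)
    group_a = shuffled[: test_size // 2]
    group_b = shuffled[test_size // 2 : test_size]
    holdout = shuffled[test_size:]
    return group_a, group_b, holdout

contacts = [f"user{i}@example.com" for i in range(1_000)]
group_a, group_b, holdout = split_for_ab_test(contacts)
print(len(group_a), len(group_b), len(holdout))  # 250, 250, 500

# After the waiting period, the winning version is sent to the holdout group
winner = "B"  # e.g. the subject line with the higher open rate
print(f"Send version {winner} to the remaining {len(holdout)} contacts")
```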
AB Testing in Popups
OptinMonster is the software I've been using for a long time to increase newsletter subscribers.
OptinMonster allows you to test every feature with an AB Test.
For example, you can test:
- the color of the button;
- the text of the pop-up;
- the number of fields.
AB Test in Facebook Ads
Of course you can also test 2 or more ads, for example on Facebook Ads or Google Ads, to see which one performs better in terms of CTR.
AdEspresso, thanks to an AB test with Facebook Ads, was able to increase its conversion rate and CTR and decrease its CPL (cost per lead):
You can run an AB test with Facebook Ads with tools like:
- Wishpond;
- Adespresso itself.
AB Test in Google Ads
You can do a split test of your Google Ads:
- on the Search Network, by rotating 2 ads evenly and then comparing them with a tool like Splittester by Perry Marshall;
- on the Display Network with Perfect Banner.
AB Test in WordPress: the Nelio Software plugin
You can also do an AB test in WordPress thanks to plugins like Nelio A/B Testing (https://neliosoftware.com/testing/).
Tools for AB Test
How can you run an AB Test or split test on a Landing Page? Basically in 2 ways:
1) With software specialized in AB Testing, such as:
- VWO
- Convertize
- Google Optimize
- Optimizely
- AB Tasty
- Convert
You can find a full list of tools for performing AB Tests here.
2) With the AB testing tools provided by Landing Page builders such as Unbounce, Instapage and Landingi.
Where to find examples of AB Test? Guess The Test
Although the decision to test a particular element must be made on the basis of quantitative and qualitative data and the resulting hypotheses, the AB tests published online can provide interesting ideas, whether to apply best practices that have performed well on Landing Pages or Ecommerce sites or simply to get some hints.
A detailed source of AB Tests is Guess The Test (www.guessthetest.com), a site created by Deborah O'Malley, a CRO professional, which collects hundreds of AB tests organized by goal and scope of the test.
Is your AB Test statistically significant?
Let's assume you tested 2 different versions of your landing page with a split test or a multivariate test.
When are the results obtained reliable? In other words, how can I ascertain the validity and statistical significance of my test?
2 tools can help us: let's see them together.
Validity and statistical significance of a Split Test
VWO A/B Split Test Significance Calculator
Starting from the number of visitors and conversions, it allows you to understand if the result obtained is statistically reliable or not.
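For the curious, here is a minimal sketch of the kind of calculation behind such a calculator: a two-proportion z-test written with Python's standard library only. The visitor and conversion numbers are made up; treat it as an illustration, not as a replacement for the VWO tool.

```python
from math import sqrt
from statistics import NormalDist

def ab_significance(visitors_a, conversions_a, visitors_b, conversions_b, alpha=0.05):
    """Two-proportion z-test: is the difference between A and B statistically significant?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis of no real difference
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return p_a, p_b, p_value, p_value < alpha

# Hypothetical data: 5,000 visitors per variant
p_a, p_b, p_value, significant = ab_significance(5000, 150, 5000, 190)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  p-value: {p_value:.4f}  significant: {significant}")
```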
A/B Split and Multivariate Test Duration Calculator
How long does our AB Test need to run to be meaningful?
This tool gives us the answer once we enter some data, such as the estimated conversion rate and the number and percentage of visitors included in the test.
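As a rough illustration of what a duration calculator does under the hood, the sketch below estimates the required sample size per variant with the standard two-proportion formula (95% confidence, 80% power) and converts it into days of testing given the daily traffic. All inputs are hypothetical, and real calculators may use slightly different assumptions.

```python
from math import ceil, sqrt
from statistics import NormalDist

def estimate_test_duration(baseline_rate, relative_uplift, daily_visitors_per_variant,
                           alpha=0.05, power=0.80):
    """Estimate sample size per variant and test duration in days (rough sketch)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_uplift)  # smallest uplift we want to detect
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    n_per_variant = ceil(numerator / (p2 - p1) ** 2)
    days = ceil(n_per_variant / daily_visitors_per_variant)
    return n_per_variant, days

# Hypothetical inputs: 3% baseline conversion rate, 20% relative uplift to detect,
# about 500 visitors per variant per day
n, days = estimate_test_duration(0.03, 0.20, 500)
print(f"~{n} visitors per variant, roughly {days} days of testing")
```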
Want to learn how to do CRO with a training course or create an AB test for your business? Contact me: https://emanuelechiericato.com/