You have probably done a science experiment at some point in your academic career. You came up with a hypothesis and tested it by measuring some degree of change or difference between a control group and an experimental group. The experimental group received a particular treatment that was withheld from the control group. By comparing measurements between the two groups, you were able to see whether your hypothesis was correct. The scientific method is not only used by students and scientists; marketers use it frequently in the process of split testing, also known as A/B testing.
A/B testing is a way for businesses and individuals to identify and correct issues with key performance indicators (usually web conversion rates) by changing individual elements within an advertising medium and measuring the effectiveness of each version (the original and the changed one) against live traffic. By running both versions simultaneously and comparing the results, marketers are able to optimize websites and ad copy to be more effective and appealing to users.
A/B Testing Helping out a Hypothetical Business in the Real World
John Dang owns Peninsula Grill, a restaurant which prides itself on its gourmet burgers and cupcakes. John wants to run a Google Adwords campaign to bring traffic to his website and encourage new customers to patronize Peninsula Grill. John is familiar with A/B testing, so he comes up with two versions of a Google Adwords campaign and tests them simultaneously, using click-through rate as his measured key performance indicator. John splits the traffic between the two campaigns 50/50, so that half of the traffic is exposed to the first version of the campaign and half to the second. John runs this campaign for two days.
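A common way to implement the kind of 50/50 split John uses is deterministic bucketing: hash each visitor's ID and use the result to pick a variant, so a returning visitor always sees the same version of the ad or page. Here is a minimal sketch in Python; the function name, experiment name, and visitor IDs are all illustrative, not part of any particular ad platform's API:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "adwords-test") -> str:
    """Deterministically bucket a visitor into variant A or B.

    Hashing the visitor ID (rather than choosing randomly on each visit)
    keeps a returning visitor in the same variant. The experiment name is
    mixed into the hash so different tests split traffic independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Use the parity of the hash value to split traffic roughly 50/50.
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Over a large pool of visitors, the buckets land close to an even split.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"visitor-{i}")] += 1
print(counts)
```

Because the assignment depends only on the visitor ID and experiment name, no state needs to be stored to keep the experience consistent across visits.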
After the two-day campaign, John sees that version two of his Adwords campaign produced a higher click-through rate than version one. John suspects this is due to the inclusion of the “Click Here!” text in the ad, as well as the photo of his renowned cupcakes, which showcases the product, rather than the photo of the Grill’s facade. John picks version two as the ad copy to run in his Google Adwords campaign.
While John was smart to A/B test his Adwords campaign before committing to it, he made some crucial mistakes that show A/B testing is not without its pitfalls. You have to know how to properly execute an A/B test and how to interpret the results in order to make it an effective tool. There are two major problems with John’s A/B test.
1. Multiple Changes Between Versions
In the two Adwords campaigns above, you can see that there are two changes between the versions: the photo and the “Click Here!” text included in the second version. In A/B testing, it is important to change only one element between variations. Why is this? It is possible that the cupcake photo actually led to fewer click-throughs than the photo of Peninsula Grill, but the inclusion of the “Click Here!” text increased click-throughs so much that the negative effect of the cupcake photo was offset and version two still came out as the winner. By pairing the “Click Here!” text with the photo of Peninsula Grill, John would have a more effective campaign than either of the two he tested. By testing one element at a time, you can be sure that any measurable difference between the two variations is directly attributable to the changed element.
2. Length Of Test
John ran his A/B test for a mere two days. A/B testing is only reliable if the measurements it produces are statistically significant. To be statistically significant, an A/B test needs a sufficiently large sample size, which means a sufficient amount of traffic. It is highly unlikely that John was able to gather enough traffic in two days for the results of his A/B test to be significant enough to justify a decision about which campaign to run.
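One common way to check whether a measured difference in click-through rates is more than noise is a two-proportion z-test. The sketch below uses only the Python standard library; the traffic and click numbers are invented for illustration, not taken from John's campaign:

```python
from math import sqrt, erfc

def two_proportion_z_test(clicks_a, views_a, clicks_b, views_b):
    """Two-sided z-test for a difference between two click-through rates.

    Returns (z, p_value). A p-value below 0.05 is the conventional
    threshold for calling the difference statistically significant.
    """
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pool the two samples to estimate the standard error under the
    # null hypothesis that both variants share one true rate.
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided tail probability
    return z, p_value

# Two days of traffic: a 6% vs 10% gap looks big, but the sample is tiny,
# so the test cannot rule out chance.
z, p = two_proportion_z_test(clicks_a=9, views_a=150, clicks_b=15, views_b=150)
print(f"small sample: z={z:.2f}, p={p:.3f}")  # p well above 0.05

# The same rates with 20x the traffic clear the 0.05 bar easily.
z, p = two_proportion_z_test(clicks_a=180, views_a=3000, clicks_b=300, views_b=3000)
print(f"large sample: z={z:.2f}, p={p:.3f}")
```

The point of the comparison is that identical click-through rates can be inconclusive or decisive depending entirely on how much traffic backs them up, which is exactly why a two-day test is risky.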
A/B Testing Guidelines
- Test everything. Headlines, sign-up forms, layouts, design styles, pricing, promotional offers, images, landing pages, text, buttons, colors – if you can change it, you should test it.
- Always test both variations simultaneously.
- Run a ton of A/B tests; the individual positive results add up to a huge boost in your sales, conversions, click-throughs, etc.
- Do not design around what is appealing to marketers, design around what is appealing to customers.
- Know what you are going to measure and why.
- Know how long you need to test for your results to be significant.
Surprising Results of A/B Testing
A/B testing is so effective because it helps marketers look past what they think will be most effective and reveals what actually is. It lets customers drive decisions about website design and ad placement, rather than a group of executives who most likely do not share the perspective of the customer or end user. Sometimes, A/B testing reveals some pretty surprising results. Here are a few of them, courtesy of Smashing Magazine:
- Along with its other A/B tests, CareLogger increased its conversion rate by 34% simply by changing the color of the sign-up button from green to red!
- The words “It’s free” increased the clicks on a sign-up button by 28%, illustrating the importance of testing call-to-action buttons and how minor changes can have surprisingly major results.
- Putting photos of people on a website can as much as double conversion rates. Scientific research backs this up, suggesting that we are subconsciously drawn to images of people.