A/B Testing: A Proven Conversion Rate Optimization Tool

August 8, 2018

Imagine building a marketing strategy based solely on your instincts — no tests, no research, no customer profile, nothing. Imagine simply intuiting that yes, this would work. It sounds like a recipe for disaster.


Or is it? We acknowledge that, sometimes, it’s healthy to follow one’s hunches. But in crucial areas such as marketing, it’s wise to crunch the numbers and anchor decisions in solid research. After all, data tilts your business choices toward success.


At Aspire Digital Marketing, research is the cornerstone of a great marketing strategy. As a leading digital marketing agency in Neptune, New Jersey, we employ buyer personas, keyword research, and advanced analytics to create and evaluate SEO strategies. And we want to share how important these tests are to our clients’ success.


One method that many businesses aren’t familiar with is A/B testing. This experiment compares two versions of your website to discover which converts more visitors into customers. Let’s discuss it further.


The Rationale: Why A/B Testing?


One of your website’s goals is to encourage people to convert: to purchase a product, request a quote, or subscribe to a newsletter. After all, not all of your visitors arrive ready to transact. They need a little push in the right direction. The elements that make up your website — the text, graphics, layout — each play a part in swaying a customer’s decision.


Digital marketers, therefore, set their sights on conversion rate optimization (CRO): increasing the percentage of visitors who convert. Marketers run various tests to determine which tactics lift your conversion rate. And at the helm is A/B testing.


How Does A/B Testing Work?


Also known as split testing, A/B testing compares two versions of a campaign to see which produces the higher conversion rate. You can run A/B tests on websites, ads, and emails.


Let’s say you have two versions of your website, A and B. Half of your visitors are directed to version A, while the other half goes to version B. You then adopt whichever registers the higher conversion rate.
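
To make that split concrete, here’s a minimal Python sketch of one common approach (ours, not a prescription): hashing a visitor ID so each returning visitor lands in the same bucket. The helper name and ID format are hypothetical.

```python
import hashlib

def assign_version(visitor_id: str) -> str:
    """Split visitors 50/50 between versions A and B.

    Hashing the visitor ID (rather than picking at random on every
    request) keeps a returning visitor in the same bucket for the
    whole test.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_version("visitor-1042"))  # "A" or "B", stable per visitor
```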


Don’t be fooled by its simplicity, though. Small changes to a website can affect the conversion rate dramatically. In fact, a HubSpot study found that a red CTA button earned 21% more clicks than a green one.


Preparing an A/B Test


You can test various elements of your website, including the headline, subheadline, body text, CTA, images, testimonials, and more. You can also do the same with emails and ads. The subject line, sender details, email text, and CTAs of your emails are points of interest. Meanwhile, you can test your ad’s headline, body text, links, and keywords.


A lot, therefore, goes into these seemingly simple changes. Here’s the A/B testing process, step by step.


  1. Choose the variable


The first step is to study your website and visitors. Use analytics tools to spot webpage problems. Pair these findings with user behavior reports from heatmaps, visitor data, and form analyses. These help you choose the right variable to test.

Let’s say that, from analytics tools, you found that the homepage has a high bounce rate. And from user behavior reports, you discover that visitors rarely scroll to the bottom of the page, where the CTA button is located.


  2. Make a hypothesis


From the data you have, construct a hypothesis that could improve conversions. For our example, the hypothesis is “Placing the CTA at the top of the homepage would increase conversions.”


  3. Set the control and challenger


Create a variation of the homepage based on your hypothesis. In this case, the variation is a web page with a CTA on top. You now have two versions of your website — the original (A), which would be the control, and the variation (B), which would be the challenger.


  4. Set parameters


When testing two versions, split your visitors evenly: fifty percent would see homepage A, while the rest would be redirected to homepage B. Determine the ideal duration of your testing, too. Reliable analytics can help you estimate how long it takes to gather a statistically meaningful number of visitors.
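
If you’re curious about the math behind that estimate, here’s a rough Python sketch based on the standard two-proportion sample-size formula at 95% confidence and 80% power. The helper and the example figures are ours, for illustration only.

```python
from math import ceil

def visitors_needed(baseline_rate: float, lift: float) -> int:
    """Rough per-variant sample size for a two-proportion test,
    using z = 1.96 (95% confidence) and z = 0.84 (80% power)."""
    p1, p2 = baseline_rate, baseline_rate + lift
    p_bar = (p1 + p2) / 2
    term = (1.96 * (2 * p_bar * (1 - p_bar)) ** 0.5
            + 0.84 * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5)
    return ceil(term ** 2 / (p2 - p1) ** 2)

# Example: 3% baseline conversion, hoping to detect a 1-point lift.
n = visitors_needed(0.03, 0.01)
print(n)  # roughly 5,000+ visitors per variant; divide by daily
          # traffic to estimate how long the test should run
```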


Lastly, be prepared to compute the statistical significance of the result. This checks whether the change in conversion rates stems from the change in the CTA button’s position and not from chance alone. A 95% confidence level is standard for a large experiment; for smaller ones, you can set it lower.


  5. Test the hypothesis


Conduct only one A/B test at a time. If you also want to experiment with two versions of a headline, save that for another day. This way, you can attribute the results solely to the change in the CTA button’s location.


Timing can skew the results, so run the variations simultaneously. Don’t run version A this week and version B the next. They should be live at the same time, each receiving half the traffic. Stick to the ideal duration so the test yields meaningful results.


  6. Analyze the data


Compute A and B’s conversion rates and check if the difference between them is statistically significant. If it is, then your test has a winner. If not, then you can conclude that version B doesn’t affect the conversion rate significantly. You can either stick with version A or run another test.
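
For illustration, here’s a minimal Python sketch of a two-proportion z-test, one standard way to run that check. The function and sample figures are hypothetical.

```python
from math import erf, sqrt

def z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: returns the z statistic and the
    two-sided p-value for the difference between A and B."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Example: A converts 150 of 5,000 visitors, B converts 210 of 5,000.
z, p = z_test(150, 5000, 210, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 -> significant at 95%
```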


  7. Report the results


Let’s say version B produced a significantly higher conversion rate than version A. Report this data to the relevant teams, such as marketing and web development. They can create an action plan based on the results.


Multivariate Testing


In our example, we only tested one variable — the location of the CTA button. In some cases, however, you might want to test two or more variables at the same time (as with the headline example in step five). A/B testing can’t handle this on its own. You would need multivariate testing, which determines which combination of variations leads to the highest conversion rate.


For instance, say you want to test (1) the location of the CTA button on the homepage and (2) the homepage’s headline. You would test four versions of the homepage:


  • Version A: CTA at the bottom + old headline


  • Version B: CTA at the top + new headline


  • Version C: CTA at the bottom + new headline


  • Version D: CTA at the top + old headline


The test is possible but naturally more complicated and time-intensive. The good news is that it can produce data that could significantly change your conversion rates.
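
As a quick illustration, the Python sketch below enumerates the combinations; the takeaway is that the number of versions grows multiplicatively with every variable you add.

```python
from itertools import product

cta_positions = ["bottom", "top"]
headlines = ["old", "new"]

# Every combination of the two variables becomes one version to test;
# the count grows multiplicatively (here 2 x 2 = 4).
for i, (cta, headline) in enumerate(product(cta_positions, headlines), 1):
    print(f"Version {i}: CTA at the {cta} + {headline} headline")
```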


Keeping SEO Afloat During A/B Testing


Google gives A/B testing a green light, provided that companies follow certain guidelines:


First, no cloaking. This refers to showing one set of web content to visitors and another to the search engine. Show the original content, aka the control or version A, to Google.


Second, use a 302 redirect, not a 301. A 302 redirect tells the search engine that the redirect is temporary, and users would no longer need it once the experiment is over. A 301, on the other hand, tells the search engine that the redirect is permanent.
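
As an illustration only, here’s a minimal sketch of a temporary redirect using Flask (one example framework among many; the /variant-b route is hypothetical). A real test would also keep each visitor in the same bucket, for instance with a cookie, rather than re-rolling on every visit.

```python
from flask import Flask, redirect
import random

app = Flask(__name__)

@app.route("/")
def homepage():
    # Roughly half of visitors get a temporary (302) redirect to the
    # variant; the rest see the original page, which stays canonical.
    if random.random() < 0.5:
        return redirect("/variant-b", code=302)
    return "Original homepage (version A)"

@app.route("/variant-b")
def variant_b():
    return "Variant homepage (version B)"

if __name__ == "__main__":
    app.run()
```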


Third, don’t run the test longer than necessary. The Aspire Digital Marketing team, for instance, uses reliable tools to determine the ideal duration of the experiment.


A/B testing yields data that can make or break your website’s performance. Aspire Digital Marketing makes the most of the latest analytics tools to test different variables and help you achieve your business goals. Interested in A/B testing? Talk to our team today.