HP Tech@Work

Today's trends for tomorrow's business

What is A/B Testing and Does it Work in SMB Marketing?

Tom Gerencer | Reading time: 8 minutes
If you don’t A/B test, you may be leaving your marketing strategy to chance. Seasonal shifts, data-collection flukes, and one-time spikes can steer your campaigns into a morass of weak conversions. But a massive return on investment (ROI) could be just around the corner, and A/B testing can help you find it fast.
A/B testing is like a GPS app for your marketing campaigns. It shows the terrain, the shortest routes, what’s working, and what’s not. A/B tests can chart your course to success, and the best part is that it’s not rocket science.
Follow our simple steps below, and you’ll get data on your side and quickly start to ratchet up your ROI. You’ll see how to A/B test with websites, emails, tweets, and more in our quick how-to guide. But first, a few basics.

What is A/B testing?

A/B testing is a way to test two different versions of a web page, form, email, or tweet. You then gather data, analyze it, and choose the best version to roll out to the masses. Also known as split testing, it’s a data-driven approach that eliminates guesswork from your marketing campaign.
Building your conversion rate and click-through rate can take time, and blundering in the dark is all too common for many businesses. But A/B testing can shorten that timeline dramatically by letting you run several small experiments at once, creating a much faster path to success.

How A/B testing works

A/B testing sounds complex, but it’s easy once you know the steps and have the right tools. It works in five steps:
  1. Learn your current traffic data and stats (research).
  2. Form some theories about what the data means (hypothesis).
  3. Create the A and B versions to try (variations).
  4. Test them (run the test).
  5. Analyze the results and deploy the version that works (analysis and deployment).
A key part of the last phase is understanding statistical significance. That simply means asking, “What’s the chance the difference is just luck?” It sounds simple, but too many testers misunderstand it, at their peril.
If you take the time to master these simple steps, you’ll gain valuable insights to ramp up your conversions.

Why A/B testing is important

In the business book Driven, Joel Litman, head of Valens Research, describes the route to business success as a bolt of lightning. Before the main bolt strikes, several feelers hit the ground, searching for the quickest path. Only once a feeler finds that path does the main bolt strike.
In a similar way, any business effort should try small and fail small, hunting for the successful path, then invest in the path that works. A/B testing is often the fastest way to understand user behavior so you can increase conversions, improve user experience, and turn more website visitors into buyers.
Beyond the headline goal of more conversions, A/B testing brings several other benefits that make it a valuable tool:
  • Solve pain points: A/B testing is a data-driven way to uncover customer pain points and improve user experience as visitors navigate your website and other marketing materials.
  • Lower bounce rate: Conducting A/B tests is a clean way to cut your bounce rate to stop losing potential clients.
  • Increase ROI from existing traffic: Find out where and why your calls to action (CTAs) and other marketing techniques are pushing visitors out of your sales funnel before they can convert.
  • Make low-risk changes: Betting the farm on a new redesign is a huge risk. A/B testing can foretell success before a new effort is set in stone.
  • Faster traffic gains: Find the shortest distance from point A to B faster, without the guesswork.

How to run an A/B test

Most of us understand the principle of A/B testing. But applying what we understand can lead to a plethora of questions. Thankfully, it’s not difficult to test your websites, blog posts, calls to action, button colors, emails, and other marketing elements, if you follow the six steps below.
Let’s take a look at each of these steps in detail:
  • Collect data on current performance (where are we now?)
  • Form a hypothesis (why is our performance like that?)
  • Create a variation (what may work better?)
  • Run the test (let’s try it)
  • Analyze the results, minding statistical significance (did it work, and does the result matter?)
  • Deploy the winner (make the change)

1. Collect data

You can’t test what you don’t understand, so you need to start by collecting data. Gather stats on users, open rates, click-through rates, and conversion rates. Note which pages, emails, or tweets get the most attention and which get the least. This is the time to take note of every metric you can get your digits on.
Pro tip: Use heatmaps, session recordings, and the A/B testing tools listed at the bottom of this article.
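If you want to see what this step looks like in practice, here’s a minimal Python sketch that turns raw counts into the baseline rates you’ll compare against later. The visit, click, and conversion numbers are hypothetical placeholders; swap in your own analytics figures.

def rate(events, total):
    """Return events as a fraction of total, guarding against division by zero."""
    return events / total if total else 0.0

visits = 12_000       # page visits last month (hypothetical)
clicks = 540          # CTA clicks (hypothetical)
conversions = 96      # completed sign-ups (hypothetical)

print(f"Click-through rate: {rate(clicks, visits):.2%}")       # 4.50%
print(f"Conversion rate:    {rate(conversions, visits):.2%}")  # 0.80%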

2. Create a hypothesis

Data without analysis is just a lot of numbers. In this step, it’s time to look at the data, step back from it, and draw conclusions. You may be right or you may not, but your insights shape the starting point for A/B testing. In the next step, you’ll start to see how right (or wrong) you are, and reap the benefits.

3. Create a variation

Create a different version of your asset (website, CTA, email, text, and so on) based on your hypothesis. For example, if you’re testing a CTA button, your hypothesis may be that it’s too small. Your variation would be to try a larger size.

4. Run the test

When it’s time to run the test, choose a sample size and, if you like, an A/B testing tool. A tool will help you send your test to a preselected number of users. If you’re testing a new version of a web page, the tool will display your test page to a certain number of users over a period of time, collecting the results.
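Under the hood, most testing tools assign each visitor to a version by hashing a stable user ID, so the same person always sees the same variant. Here’s a minimal Python sketch of that idea; the experiment name and user ID are hypothetical, and a real tool handles this for you.

import hashlib

def assign_variant(user_id, experiment, split=0.5):
    """Deterministically assign a user to version 'A' or 'B'.

    Hashing the user ID together with the experiment name keeps each
    user's assignment stable across visits and independent across tests.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform value in [0, 1]
    return "A" if bucket < split else "B"

# The same user always lands in the same bucket on every visit.
print(assign_variant("user-1042", "cta-button-size"))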

5. Analyze the results

Once you’ve run your test, look at the results. Did method A (your original) or B (your test) work better? Pay close attention to statistical significance. That’s where you make sure your results aren’t a fluke created by statistical noise.
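If you’d rather check significance yourself than trust a dashboard, one standard approach is a two-proportion z-test. The sketch below uses only Python’s standard library, and the conversion counts are hypothetical; a p-value under your chosen threshold (commonly 0.05) suggests the difference probably isn’t noise.

from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * NormalDist().cdf(-abs(z))

# Hypothetical results: A converted 96 of 2,400 visitors, B converted 134 of 2,400.
p = two_proportion_p_value(96, 2400, 134, 2400)
print(f"p-value: {p:.4f}")  # about 0.01 here, so the lift is probably real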

6. Deploy

Once you’ve gathered data, created a hypothesis, created a variation, run the test, and analyzed the results, it’s time to deploy the winner. If your original worked best, you’re done. No changes are necessary. If your variation won the contest, you’ll switch to it.
But simply A/B testing one Twitter image or one email subject line at a time can be a time-consuming process. That’s why most marketing efforts use multivariate testing and/or multipage testing.

Types of A/B testing

There are a few different types of A/B testing: split testing, multivariate testing, and multipage testing.

1. Split testing

This is the standard form of A/B testing, where you start with an existing web page, email, button, or other feature, then try a different version. The terms “A/B split testing” and “A/B testing” are used interchangeably.

2. Multivariate testing

This is just like split testing, but it tests more than one variation, and often more than one element, at a time. With split testing, you may start with a green CTA button and try a red version. A multivariate test may also try orange, yellow, and blue, or vary the button color and the headline together to test every combination. You can think of multivariate testing as A/B/C/D/E testing, or A/B/n testing.
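One thing to watch: the number of combinations grows quickly when you vary several elements at once. This short Python sketch, with hypothetical button colors and headlines, shows how three colors and two headlines already produce six variants to test.

from itertools import product

# Hypothetical elements to vary together in one multivariate test.
button_colors = ["green", "red", "orange"]
headlines = ["Save 20% today", "Free shipping on us"]

variants = list(product(button_colors, headlines))
for i, (color, headline) in enumerate(variants, 1):
    print(f"Variant {i}: {color} button + '{headline}'")

print(f"{len(variants)} combinations to test")  # 3 colors x 2 headlines = 6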

3. Multipage testing

This is multivariate testing with a website-specific thrust. With multipage testing, you may run A/B tests on different page text, images, CTAs, buttons, and other features on several different pages, all at once. It’s complex, and most often requires the use of a website-specific A/B testing tool.

Features you can A/B test

Marketing pros use A/B testing to ferret out the best versions of almost any feature and facet of campaigns on multiple platforms, including web pages, apps, social media posts, and emails. You can A/B test:
  • Landing pages
  • Article headlines
  • Subheadings
  • Content
  • Writing styles
  • Paragraph length
  • Images
  • CTAs
  • Buttons
  • Email subject lines
  • Design
  • Layout
  • Navigation
  • Forms
  • Testimonials
  • Content length
In fact, it’s smart to A/B test any and all elements of your campaigns.

A/B testing tools

Meticulously A/B testing all the minutiae of a marketing campaign can be laborious. Most pros use an A/B testing tool to smooth the process. Some of the best are:
  • HubSpot A/B Testing Kit: Download an A/B testing kit from HubSpot and Kissmetrics.
  • Optimizely: Run experiments like a seasoned pro with an extensive platform.
  • VWO: Used by thousands of brands worldwide for apps, websites, and products.
  • Crazy Egg: Easy-to-use A/B testing tool with a 30-day free trial.
  • Omniconvert: An advanced segmentation engine with over 40 test parameters.
Also check out the excellent A/B testing book, Trustworthy Online Controlled Experiments, by Ron Kohavi, Diane Tang, and Ya Xu.

Pitfalls of A/B testing

An important part of knowing how to A/B test is knowing where the pitfalls are. Be prepared with knowledge of a few key problems.
  • Ignoring statistical significance: Is your result really significant? Avoid false positives by demanding a high confidence level (a strict significance threshold, such as 0.05 or lower) and by using larger sample sizes.
  • Choosing too-small sample sizes: A small sample size lowers your chance of detecting a real effect and makes any “winner” you do find more likely to be a fluke. (See the sketch after this list for a quick way to estimate the sample size you need.)
  • Peeking at tests in progress: Watching results as they come in can skew your conclusions, because too many marketers stop the test as soon as they see a result they like. Instead, choose a pre-set sample size and duration, then stick to it.
  • Changing the rules during a test: It’s tempting to change traffic allocation, test duration, sample size, or other factors while a test is in progress. Unfortunately, this can introduce bias and scuttle your attempts at gaining insight.
  • Ignoring differences between tests: An ecommerce test run right before Black Friday will have very different results than one run in mid-July. To compare apples to apples, run tests during similar time periods or adjust for the differences.
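As promised above, here’s a rough way to estimate how many visitors each variant needs before you start. This Python sketch uses the standard two-proportion power formula with a two-sided test; the baseline rate and target lift are hypothetical, and a dedicated calculator or your testing tool should give similar answers.

from math import ceil
from statistics import NormalDist

def sample_size_per_variant(baseline, lift, alpha=0.05, power=0.80):
    """Rough visitors needed per variant to detect a relative lift.

    Standard two-proportion formula, two-sided test at the given
    significance level (alpha) and statistical power.
    """
    p1 = baseline
    p2 = baseline * (1 + lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # 1.96 for alpha = 0.05
    z_power = NormalDist().inv_cdf(power)          # 0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil(variance * (z_alpha + z_power) ** 2 / (p2 - p1) ** 2)

# Hypothetical: 4% baseline conversion, hoping to detect a 20% relative lift.
print(sample_size_per_variant(0.04, 0.20))  # a little over 10,000 per variant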

Summary

A/B testing is a data-driven way to learn whether one feature of a tweet, email, or web page works better than another. It can seem complex, but following the steps above takes the guesswork away. Using an A/B testing tool is often the quickest way to make this marketing technique work for you, and it’s a great way to turn a ho-hum marketing campaign into a real winner.
About the Author: Tom Gerencer is a contributing writer for HP Tech@Work. Tom is an ASJA journalist, career expert at Zety.com, and a regular contributor to Boys' Life and Scouting magazines. His work is featured in Costco Connection, FastCompany, and many more.

Disclosure: Our site may get a share of revenue from the sale of the products featured on this page.