Facebook offers many ways to test the performance of your ads before they go live, and one of the most popular is A/B testing.
A/B testing, or split testing, describes the process of running marketing experiments to see which version of an asset connects better with your audience. The versions are usually tested simultaneously, and the variable can be anything from layout to copy to multimedia.
A/B testing is a popular marketing method because it gives marketers an idea of which types of ads or UX visuals earn the highest conversion rates. Essentially, running an A/B test lets you gauge how a piece of content will perform before fully publishing it.
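To make "which version connects better" concrete, here's an illustrative sketch of how a split-test winner can be judged with a standard two-proportion z-test. This is a generic statistical method, not Facebook's internal algorithm, and the numbers are hypothetical.

```python
from statistics import NormalDist

def ab_test_result(conv_a, total_a, conv_b, total_b):
    """Compare two ad variants with a two-proportion z-test.

    Returns each variant's conversion rate and a two-sided p-value;
    a small p-value suggests the difference is unlikely to be noise.
    """
    rate_a = conv_a / total_a
    rate_b = conv_b / total_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (conv_a + conv_b) / (total_a + total_b)
    se = (pooled * (1 - pooled) * (1 / total_a + 1 / total_b)) ** 0.5
    z = (rate_a - rate_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return rate_a, rate_b, p_value

# Hypothetical data: variant A got 120 conversions from 2,000 viewers,
# variant B got 90 conversions from 2,000 viewers
rate_a, rate_b, p = ab_test_result(120, 2000, 90, 2000)
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  p-value: {p:.3f}")
```

In this hypothetical, variant A converts at 6% versus B's 4.5%, and the small p-value suggests A is the likely winner rather than a fluke. Facebook runs this kind of comparison for you, but seeing the math helps explain why larger audiences produce more conclusive tests.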
Here, let’s dive into how you can A/B test your marketing ads on Facebook.
On Facebook, you can create A/B tests in multiple ways, depending on the variable you want to test.
Fortunately, you can create an A/B test within the Ads Manager Toolbar. The Toolbar lets you use an ad campaign you've already created as a jumping-off point for your new test.
Toolbar isn’t your only option, though — in the next section, we’ll cover all the ways you can A/B test your ads.
When A/B testing on Facebook, you can access the Toolbar from your Ads Manager, duplicate a campaign or ad set, or use the Experiments tool.
First, let’s go over the Toolbar method.
Toolbar lets you quickly perform a test using a dropdown box located in Ads Manager. Here’s how:
When you access Ads Manager, go to the “Campaigns” tab. Under that tab, you’ll see an option for “A/B Test.” Keep in mind that you’ll need to have an existing ad campaign or campaign draft in order to complete the test.
When you choose that option, this is what you’ll see if you don’t have an existing campaign:
Select your desired campaign, and then you can choose which variable you want to test.
There are several different options for variable testing, and they’re categorized based on the goals of your campaign.
The variables are as follows:
All of these variables can be managed from the “Ad Set” tab in Ads Manager, which is right next to the “Campaigns” tab.
Once you've chosen your test type, you'll be ready to perform your test. You can check its status in Ads Manager and choose how long you want it to run. To check the progress of your ads, open your "Account Overview" tab and look for the icon that resembles a beaker:
If you'd rather go a different route for your A/B test, there are other ways to set one up. For instance, let's talk about duplication next.
When you choose this option, you can easily create a test by changing one variable in a nearly identical campaign or ad set. This is for ads or campaigns that have already been created.
When you go to Ads Manager, go to your “Campaigns” tab. Here, you’ll see a list of your campaigns that are currently running. You’ll also see your drafted campaigns. You can choose either for duplication.
After you’ve decided which campaign or ad set you want to test, highlight the section under the title and you’ll see a “Duplicate” option. When you click it, this is what you’ll see:
Select the option for creating a duplicate specifically for an A/B test. Remember, this option lets you choose one variable to change so you can analyze its effect on performance, so pick a campaign that fits those criteria.
If you're duplicating an ad set, Facebook will suggest variables to change, and you can pick from there. I chose traffic, but you can choose based on your ad type or audience.
After choosing your variable, you’ll see your tests next to each other in a preview. After making any necessary changes in this stage, you’ll be ready to publish. To do this, click the green button underneath the audience you’ve selected:
When you publish your test, audiences will be able to interact with your ads, so make sure you've ironed out all the details before finalizing. You'll still be able to check back on your test in Ads Manager to access the most current insights.
Next, we’ll cover my favorite option: Experiments.
The Experiments tool lets you create or duplicate ad campaigns to test. The difference between using Experiments to test instead of Ads Manager is the ability to fine-tune and learn more about the impact of your test while it’s running.
This test won't run in Ads Manager. Instead, go to the top of your Business account and select "Experiments" under "Measure & Report." You can also search "Experiments" in the search bar. This is what you'll see when you access the Experiments page:
Click “Get Started” underneath the “A/B Test” option. When you do this, you’ll be taken to a menu that lets you fill in the ad details. For example, you’ll have to choose the campaign you want to test:
Here, you can schedule the run time of your test, fill in the test name, and even decide how you want Facebook to choose the winning campaign. You can either choose cost per result or cost per conversion lift:
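To see what a cost-per-result criterion means in practice, here's a minimal sketch with hypothetical spend and result numbers. The calculation itself is simple: total spend divided by the number of results, with the cheaper variant winning.

```python
def cost_per_result(spend, results):
    """Cost per result: total ad spend divided by the number of results
    (e.g. clicks, leads, or purchases, depending on the campaign goal)."""
    return spend / results

# Hypothetical numbers: each variant spent $250 over the test window
variant_a = cost_per_result(250.00, 125)  # $2.00 per result
variant_b = cost_per_result(250.00, 100)  # $2.50 per result

# Under a cost-per-result criterion, the cheaper variant wins
winner = "A" if variant_a < variant_b else "B"
print(winner)  # prints "A"
```

Cost per conversion lift works differently: instead of dividing spend by raw results, it weighs spend against the *additional* conversions the ad caused compared to not showing it at all, which requires the holdout measurement Facebook performs behind the scenes.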
When you're finished filling in the details, Facebook will show you an estimate of your test's statistical power. Essentially, this check makes sure your draft fits the criteria of a valid A/B test before you publish. After filling out this menu, you'll be ready to push your experiment live.
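Facebook's power estimate boils down to whether your audience is big enough to detect the effect you're looking for. As a rough illustration (using a standard normal-approximation formula, not Facebook's internal model), here's how required sample size grows with the lift you want to detect:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(base_rate, lift, alpha=0.05, power=0.8):
    """Rough sample size per variant needed to detect a relative `lift`
    over `base_rate` with a two-sided test (normal approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_beta = NormalDist().inv_cdf(power)           # desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2
    return ceil(n)

# E.g. detecting a 20% relative lift over a 5% baseline conversion rate
# takes roughly 8,000+ people per variant
print(sample_size_per_variant(0.05, 0.20))
```

The takeaway is the same one Facebook's power check enforces: small expected differences demand large audiences, so a low-powered test is a sign to broaden your audience or test a bolder change.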
Facebook uses the same underlying technology to run all of these A/B tests; the different setups simply help you make the best choices to optimize ad performance. No matter which test you run, you can see all of your results in "Test and Learn" when they're finished.
Next, let’s go over some best practices for running your A/B test on Facebook.
Keep these best practices in mind before you begin your split test — they’ll help you run tests that are valuable and applicable to your next campaign.
When you A/B test on Facebook, make sure you’re only choosing one variable to test. There’s a separate multivariate test that you can run, but for A/B, one variable is key. Your test results will be more conclusive with only one variable.
Choose a new audience for your test. It should be large enough to provide measurable results, but it shouldn't be the exact same audience as a campaign you're already running. Overlapping with a drafted campaign is fine, since drafts aren't published and don't serve ads.
However, if you choose the exact same audience as a campaign you’re already running, Facebook’s system might mix up your ads and provide contaminated results.
In order to analyze your test results so they’re the most valuable to you, make sure your hypothesis is measurable. To put it another way: Make sure your hypothesis is clear, easy to understand, and able to be determined with an A/B test.
Your hypothesis can be as simple as, "Which method of delivery do my audience members respond to best?" This question can be answered by using the Delivery Optimization A/B test on Facebook.
Recall that when you set up your A/B test, you can choose a time frame. You can choose to run your test for up to 30 days. Facebook’s Business Center suggests at least four days, which is enough time for the technology to produce accurate results.
Facebook can provide an ideal budget for you based on your test details, or you can set one yourself when filling in those details. Setting an ideal budget helps you determine a winning strategy because it factors ad spend into the success of your test.
According to one of HubSpot’s Paid Ads specialists, Nicole Ondracek, “A big value of split testing is being able to prevent audience overlap so you know that the same audience is not seeing multiple variants which could affect the results. That way, you can confidently say which one is the clear winner.”
A/B testing gives you a better understanding of audience behavior. Running tests on Facebook streamlines the process and builds your familiarity with Facebook's ad system.
Additionally, Ondracek mentions that depending on split testing results, advertisers can begin to shape what type of creative they need to use for the future.
How will you use split testing on Facebook to help your creative advertising efforts?
Originally published May 13, 2020 4:15:00 PM, updated May 13 2020