As marketers, we’re always struggling to figure out exactly how effective our actions are. Whether you’ve just launched a new landing page or an unconventional marketing campaign, you want to make sure that it has a positive impact on your revenue, right? (Or at the very least, that you’re not losing money!)
No matter how well you know your target audience, basing your marketing decisions on your gut instincts isn’t a good idea—you need real data. You need A/B testing.
In this article, we’ll share 7 easy-to-implement testing ideas that will help to grow your revenue.
Let’s get started!
Why do you need A/B testing?
A/B testing (also called split testing) is a way to get beyond your biases (yes, we all have them) and understand your marketing efforts in a scientific way. Often, it’s very hard to guess which version of a headline, value proposition, or product description will perform best. That’s why you need to test your ideas before committing to them.
Let’s use an example to help us understand why A/B testing is so important.
Imagine you’re working on a welcome popup for new website visitors with the goal of encouraging them to subscribe to your email marketing newsletter. To incentivize sign-ups, you offer a 10% discount for new subscribers.
A popup like this will have a solid conversion rate, which you’ll also see reflected in the campaign’s assisted revenue.
This is a good result that can positively impact your revenue, but the question you should be asking is whether the campaign is optimized to deliver maximum results.
The campaign might have the most success as an exit-intent popup with a teaser, or running it as an embedded form might increase conversions. You simply won’t know which is better until you run tests.
The lesson here is that you don’t know what changing elements of your campaigns will do to your conversion rates until you make the changes. That’s why A/B testing is a cornerstone of conversion optimization for any online store.
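To make the mechanics concrete, here is a minimal sketch of how a split test typically assigns visitors to variants behind the scenes. The function name and details are illustrative assumptions (not how any particular tool, OptiMonk included, implements it); the key idea is that hashing a visitor ID keeps the assignment both random across visitors and stable for any one visitor, so a returning shopper always sees the same version.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a visitor into one variant.

    Hashing the visitor ID together with the experiment name spreads
    visitors roughly evenly across variants, while guaranteeing that
    the same visitor always lands in the same bucket for a given test.
    (Illustrative sketch, not any specific vendor's implementation.)
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# A returning visitor sees the same variant on every page load:
print(assign_variant("visitor-42", "welcome-popup"))
```

Because the assignment is deterministic, you can log each visitor's bucket once and later compare conversion rates between the two groups with confidence that nobody switched sides mid-test.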
7 ideas on what to A/B test (with real-life examples)
Ready to see the results that ecommerce sites have achieved with A/B testing and get some inspiration that you can take to the drawing board?
Check out these real-life examples of different things you can A/B test in your campaigns…
1. A/B test your campaign’s design
One of the most commonly tested aspects of marketing campaigns is the design. This includes things like variations in layout, colors, images, etc.
DTC brand Obvi ran a split test on two different variants of a discount popup: one with a countdown timer and one without. They wanted to see whether using a countdown timer would boost conversions by increasing the sense of urgency associated with their offer, or if it would lead to customers feeling pressured and rejecting the offer.
They found that the version with the countdown timer increased conversion rates by 7.97%. All those extra sales were a result of A/B testing different versions of the campaign!
When you’re testing ideas for your campaigns, keep in mind that you should only make one change at a time. Otherwise, you won’t know which change is responsible for any increases or decreases in your conversion rate.
As you can see above, the only difference between the two campaigns in Obvi’s split test was the countdown timer, which meant they could be confident that it was the element making a difference.
Testing several elements at once is possible with multivariate testing, which compares combinations of multiple changes simultaneously rather than just two variants. However, many ecommerce sites simply test a new element every few weeks, making conversion optimization a continuous process.
You can easily carry out both A/B testing and multivariate testing with OptiMonk’s Variant A/B Testing feature. Here’s a step-by-step guide on how to set it up.
2. A/B test your call-to-action button
Your call-to-action (CTA) is the core of any campaign or landing page, since it’s the element that actually encourages users to complete your desired action. In order to ensure you’re achieving the best possible conversion rate, it’s essential to split test multiple call-to-action variants.
It’s worth testing out many different aspects of your CTA button in order to boost conversions. For instance, you can experiment with:
- CTA copy
- Size of the button
- Color of the button
- Location of the button
- Button vs hyperlink
- Using emojis on your CTA button
Each of these changes to your call-to-action button can lead to more conversions.
It’s also a good idea to test your CTA button specifically for mobile users.
3. A/B test your offer
If you’re struggling with low landing page conversions or a disappointing popup campaign, it might take more than a redesign to fix the problem. In fact, your offer itself might not be convincing users to move through to your checkout page.
You can try A/B testing different offers to discover which ones perform best for your target customers. There are many different testing ideas you could use here, including investigating whether different discount types (e.g. 10% off, $10 off, offering free shipping, or a mystery discount) will boost conversions.
If you’re trying to generate more leads, it might be a good idea to offer a lead magnet like a free ebook or a free trial period. Although these are non-monetary offers, they tend to boost conversions for early-funnel visitors who aren’t yet ready to make a purchase.
4. A/B test the effectiveness of teasers
Teasers are a way of “previewing” your full campaign without interrupting your visitors’ browsing experience. Instead of showing the full popup, you show a teaser that only reveals the popup when it’s clicked (and if abandoning visitors haven’t clicked on the teaser, the full campaign will display on exit intent).
Obvi tested out whether their Black Friday popup generated more conversions as a teaser campaign or as a traditional popup.
After running a one-week test, they saw that the teaser variant’s conversion rate was 36% higher.
Once again, Obvi used OptiMonk’s Variant A/B Testing feature to split test these two versions of the marketing campaign.
While teasers were really effective for this campaign on this page, they won’t necessarily work the same way for every campaign and for every target audience. That’s why we suggest testing them for your own website and landing pages.
5. A/B test different types of campaigns
You have a lot of different options for how you deliver your messages and offers to visitors. For example, you might love your lucky wheel lead magnet popup but wonder whether it’s working better than a simple email list-building popup would. Once again, our only advice is to test!
In fact, you can run split tests on any campaign type against any other. Here are some examples of campaign types that are worth comparing for almost any ecommerce store:
- Popup vs. embedded message
- Popup vs. sidemessage
- Popup vs. fullscreen
- Gamification template vs. standard list-builder
- Conversational popup vs. welcome popup
Christopher Cloos, an online store specializing in designer sunglasses, used split testing to find out whether a classic welcome popup or a personalized, conversational multi-step form would result in a better conversion rate.
After a split testing period that lasted one month, they found that the conversational popup converted at a 15.38% higher rate than the classic welcome popup.
To compare the results of two different types of campaigns, you’ll need to use OptiMonk’s Experiments feature.
If you’d like to learn more about Experiments, check out this explainer video:
6. A/B test different triggers
When your campaigns appear can be just as important as what they say. If you’re not sure whether to use an exit-intent trigger, a scroll-based trigger, or a time-based trigger for a campaign, you can easily find out with OptiMonk’s Experiments feature.
For example, you could test whether your email newsletter popup performs better by:
- Showing the popup 5 seconds after website visitors land on your site, or
- Showing a teaser after 5 seconds and showing the popup on exit intent
You might find that you generate more leads with a campaign that takes longer to show up, or maybe the attention-grabbing power of an unexpected popup works better.
7. A/B test the headlines of your landing pages
Finally, you can test headlines on your landing pages using the Embedded Content and Variant A/B Testing features.
A landing page needs to quickly and clearly communicate your unique selling propositions to your visitors. A/B testing your headline copy can help you be confident about what type of headline resonates best with your target audience.
You can also test out different product descriptions or social proof by changing up your embedded content.
Here’s a step-by-step guide that will help you get started with testing out different headlines on your landing pages.
Does A/B testing actually work?
A/B testing is a proven strategy for improving your website performance; there’s no question about it. However, you have to be careful about implementing changes based on split tests. According to AppSumo research, only 25% of A/B tests actually produce statistically significant results, so make sure you’re running yours for long enough to collect meaningful data.
What is a good sample size for A/B testing?
There isn’t one answer that fits every single company and every experiment you’ll ever run. It all depends on how much traffic you’re getting. But if your website has fewer than 100 visitors a day, A/B testing might not be the best strategy for you.
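If you want a rough number rather than a rule of thumb about daily traffic, a widely used back-of-the-envelope formula estimates the visitors needed per variant as roughly 16 × p(1 − p) / δ², where p is your baseline conversion rate and δ is the absolute lift you want to detect (this approximates a two-sided test at ~95% confidence and ~80% power). The helper below is a hypothetical sketch of that calculation, not a feature of any testing tool:

```python
def sample_size_per_variant(baseline_rate: float, lift: float) -> int:
    """Rough visitors needed per variant to detect an absolute lift.

    Uses the common 16 * p * (1 - p) / delta^2 approximation for a
    two-sided test at ~95% confidence and ~80% power. `lift` is the
    absolute change you want to detect (0.01 = one percentage point).
    Illustrative estimate only, not a substitute for a proper power
    calculation.
    """
    p = baseline_rate
    return round(16 * p * (1 - p) / lift ** 2)

# Detecting a lift from a 5% to a 6% conversion rate:
print(sample_size_per_variant(0.05, 0.01))  # about 7,600 visitors per variant
```

Notice how quickly the requirement grows for small lifts: halving the detectable lift quadruples the traffic you need, which is why low-traffic sites struggle to get conclusive results.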
Why conduct an A/B test?
Even small changes can have a huge impact on your conversion rates. An A/B test can help you decide — based on data rather than gut feelings — which variant performs better. Thus, it can be hugely helpful for growing your sales.
What’s the difference between A/B testing, multivariate testing, and split testing?
When you A/B test (or split test), you compare just two versions of a campaign at a time. In contrast, a multivariate test changes several elements at once and compares the resulting combinations, so you can see which mix of changes performs best.
A/B testing is essential to guide your marketing investments. Whether you’re investing your time, money, or both, you need to make sure that they’re worth it.
Luckily, the right software makes A/B testing a breeze. Before you know it, you’ll be in the habit of always finding out what’s working and what you should improve. Your customers will notice the difference and you’ll definitely notice more sales.
OptiMonk’s Experiments feature can be a vital tool for optimizing your campaigns and maximizing your marketing ROI. Try Experiments Now!