In the realm of online growth, an A/B testing program is your golden ticket to sustainable success. By experimenting with landing page variations, you tap into a goldmine of insights from your existing website traffic.
But here’s the catch: Many ecommerce store owners struggle with where to start in optimizing their online shops. If that sounds like you, read on—we’re about to explore six real-life A/B test success stories from top ecommerce brands.
Let’s jump right in!
A/B testing example #1: Bukvybag
Split testing homepage headlines led to a 45% increase in orders
Bukvybag was struggling with a low conversion rate on their homepage. They weren’t sure whether their headline was resonating with their target audience.
To address this problem, they used OptiMonk’s Dynamic Content feature on their headline. This helped them uncover the headlines that lured in the most customers.
Bukvybag’s original headline was “Versatile bags & accessories.” They wanted to test variations to try to boost their conversion rates. These were the headlines they experimented with, each of which focused on a different value proposition:
- Variant A: Stand out from the crowd with our fashion-forward and unique bags
- Variant B: Discover the ultimate travel companion that combines style and functionality
- Variant C: Premium quality bags designed for exploration and adventure
This split testing strategy helped Bukvybag reach statistically significant results, giving them confidence that the lift in conversions wasn’t due to random chance.
In fact, the 45% upswing in orders they achieved thanks to A/B testing proved to be sustainable in the long run.
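Checking for statistical significance like Bukvybag did usually comes down to a two-proportion z-test: compare the conversion rates of the control and the variant and ask how likely that gap would be under pure chance. Here’s a minimal sketch; the visitor and order counts are purely illustrative, not Bukvybag’s actual data:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.
    conv_* = number of conversions, n_* = number of visitors."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative numbers: control converts 200 of 10,000 visitors,
# the challenger headline converts 290 of 10,000.
z, p = two_proportion_z_test(200, 10_000, 290, 10_000)
print(f"z = {z:.2f}, p = {p:.5f}")  # p well below 0.05 → significant
```

If the p-value is below your chosen threshold (0.05 is the usual convention), you can roll out the winning variant with reasonable confidence that the lift is real.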
A/B testing example #2: Vegetology
Adding an above-the-fold testimonial to product pages achieved a 10.3% increase in unique purchases
Social proof is a key factor in driving sales, helping first-time website visitors feel confident in the quality of products that an unfamiliar store is offering. Research has found that 75% of consumers actively seek out reviews and testimonials before making a purchase.
Vegetology’s products have generated lots of amazing reviews, but they weren’t showing them off effectively. Their testimonials were buried at the bottom of their product pages, where nobody could see them.
It made sense for Vegetology to consider adding social proof to the above-the-fold section of the product page, but that’s a really big change that could also backfire. In a situation like this, the best thing to do is use A/B testing to collect data and find out for sure whether it’s the right move.
They used an OptiMonk Embedded Content campaign to create multiple versions of their product pages. They found that the challenger variant that included the customer testimonials at the top of the page led to a 6% increase in the conversion rate and a 10.3% increase in unique purchases.
Once Vegetology saw those results, they were able to roll out the change permanently and increase revenue over the long term.
A/B testing example #3: Varnish & Vine
Increased their revenue by 43% after multivariate testing their product pages
Varnish & Vine wanted to optimize their product pages in order to convert more traffic into paying customers. Unlike Vegetology’s single-element testimonial test, though, they wanted to test several elements on their pages at once.
OptiMonk’s Smart Product Page Optimizer let Varnish & Vine run a multivariate test across elements like their headlines, benefit lists, and calls-to-action, comparing combinations of changes rather than a single variable the way a traditional A/B test would.
The AI-powered tool analyzed Varnish & Vine’s product pages and automatically crafted compelling headlines, subheadlines, and benefit lists for each product page, all designed to resonate with the target audience. This multivariate approach is what drove the 43% revenue increase.
A/B testing example #4: Crown & Paw
Increased their email popup conversion rate by 2.5x
Crown & Paw was using a simple Klaviyo email popup to generate leads, but their marketing campaign had a poor conversion rate. They decided to try a multi-step popup in an attempt to collect more customer data and emails from their website visitors.
First, they tempted visitors with a good discount, plus the promise of personalized product recommendations.
Then, they asked a few simple questions to learn about their interests and preferences.
Finally, based on the answers provided by the ~95% of visitors who answered the questions, they recommended relevant products in the third step of the popup. They also displayed the discount code right there.
After A/B testing, they found that the multi-step popup had a conversion rate of 4.03%, which was a 2.5X increase compared to the simple Klaviyo email popup they had been using before.
A/B testing example #5: Obvi
Adding a countdown timer increased popup conversion rates by 7.97%
Obvi wanted to run tests to discover whether adding a countdown timer to their discount popup would increase conversion rates. They hypothesized that the countdown timer would increase the sense of urgency and result in higher coupon redemption rates, boosting revenues.
They created two versions of the discount popup, one with a timer and the other without. The variant with a countdown timer converted 7.97% better than the one without, indicating that the timer was effective at increasing urgency and conversions.
A/B testing example #6: Christopher Cloos
Testing a classic welcome popup against a conversational popup resulted in a 15.38% increase in conversion rates
A/B testing different types of campaigns is a way to discover which type resonates better with your target audience. This goes beyond testing two versions of the same web page or campaign, and involves trying out completely different approaches.
In this case, the Christopher Cloos team tested a classic welcome popup against a more personalized conversational popup. After reaching statistical significance by showing the different versions to a large enough sample size, they found that the conversational popup converted at a higher rate (15.38% higher, to be exact).
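Before a test like Christopher Cloos’s can reach statistical significance, each variant needs enough traffic. A standard power calculation tells you roughly how many visitors per variant you need to reliably detect a given lift. This is a generic sketch using the normal approximation with textbook z-values (95% confidence, 80% power); the baseline rate and lift below are illustrative assumptions, not Christopher Cloos’s numbers:

```python
from math import sqrt, ceil

def sample_size_per_variant(p_base, relative_lift):
    """Approximate visitors needed per variant to detect a relative lift
    in conversion rate (two-sided test, alpha=0.05, 80% power)."""
    z_alpha = 1.96  # critical z for a two-sided test at alpha = 0.05
    z_beta = 0.84   # z for 80% statistical power
    p_var = p_base * (1 + relative_lift)
    p_avg = (p_base + p_var) / 2
    numerator = (z_alpha * sqrt(2 * p_avg * (1 - p_avg))
                 + z_beta * sqrt(p_base * (1 - p_base) + p_var * (1 - p_var))) ** 2
    return ceil(numerator / (p_base - p_var) ** 2)

# e.g. a 3% baseline popup conversion rate, hoping to detect a 15% relative lift
print(sample_size_per_variant(0.03, 0.15))
```

The takeaway: the smaller the lift you want to detect, the more traffic you need, which is why low-traffic stores should test bold changes (like a whole different popup format) rather than tiny tweaks.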
Each of these A/B testing examples has shown how controlled experiments can help you improve your homepage, landing pages, and marketing campaigns. The more you run tests, the more confident you can be that your website is performing optimally and making you the most money possible.
Once you have the right A/B testing tool, it’ll be easy to optimize each and every web page on your site.
Looking to make A/B testing effective and effortless? Try Smart A/B testing and automate 95% of your CRO efforts!