A/B Testing Examples: A Deep Dive on Hypotheses and Results

It’s easy to get caught in the trap of defaulting to best practices to improve your conversion rates. But think again—best practices don’t always get the best results.

This is why A/B testing is an essential step for maximizing customer experience and conversion rate optimization (CRO). We’d even say that it’s one of the three essential elements of a successful ecommerce business, right alongside strategic ad alignment and understanding user behavior.

Ready to test your assumptions but not sure where to start? We’ll dive into the specifics of why A/B testing is so essential and share some A/B testing examples from successful ecommerce brands. 

Why A/B testing is crucial for your ecommerce success

It’s easy to become so invested in your own website and CRO strategy that you’re no longer able to view problems through an objective lens. Take this navigation bar, for example:

If this were your site, your first instinct might be to crown the original version as the winner. After all, it features a prominent call-to-action button linked to the Bundles page. 

But A/B testing proved this assumption wrong. The variant version increased clicks to the Bundles page by 130%.

Shocking, right? It turns out that the giant button was a little too visible and triggered banner blindness, where an element is so prominent that it reads like an ad and users tune it out. There would be no way to know this (and prove it) without A/B testing the two versions.

This is why A/B testing is so critical. It removes the guesswork from your digital marketing strategy so you can implement data-driven solutions that increase conversions, improve search engine optimization (SEO) performance, and grow the number of website visitors you see each week.

The importance of baseline quantitative and qualitative research

Without quantitative and qualitative research, your A/B tests are akin to throwing spaghetti at a wall to see what sticks. Successful A/B tests rely on data. Companies that incorporated analytics into their testing framework outperformed those that didn’t by 32% per test, while companies that used analytics and heatmapping outperformed their counterparts by 48%.

Think of it this way: 

  • Quantitative data tells you what to test.
  • Qualitative data tells you why you should test it.

Using both types of data to analyze your results is essential. 

Analytics can tell you that your customers spend more time on your variant product landing pages after a redesign, but they can’t tell you why. You need qualitative data, like heatmaps and usability tests, to know whether the new version is truly a winner and to avoid validity errors or other A/B testing mistakes.

SplitBase's Testing Trifecta

This is why we developed our Testing Trifecta, where analytics (quantitative data) and human input (qualitative data) form the foundation for A/B testing and growth. This larger framework helps us develop hypotheses to test so that even if a particular A/B test doesn’t win, it still offers valuable insights.

How to develop a hypothesis

We use our Testing Trifecta to develop a hypothesis, starting with quantitative data to pinpoint where a problem might occur.

1. Gather quantitative data to determine what the problem is

Dig into your sales funnel to spot potential problems. Check Google Analytics for data on bounce rate, click-through rate, conversions, and time on page. Pull results from heatmaps, click maps, and scroll maps to pinpoint where customers encounter problems.

For example, if your scroll maps show that most users never scroll past the fold on your product landing page or reach your Add to Cart button, your hero section may be discouraging them from scrolling further.

Other types of problems may show up in different metrics, depending on where in your funnel they occur.
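
If you prefer to sanity-check these numbers outside of an analytics dashboard, here’s a minimal sketch of that first pass in Python, assuming a hypothetical page-level export with sessions, bounces, clicks, and conversions columns (the file name and column names are illustrative, not tied to any specific tool):

```python
import pandas as pd

# Hypothetical export: one row per page with raw counts.
# The file and column names are assumptions for this sketch, not a specific tool's schema.
df = pd.read_csv("analytics_export.csv")

df["bounce_rate"] = df["bounces"] / df["sessions"]
df["click_through_rate"] = df["clicks"] / df["sessions"]
df["conversion_rate"] = df["conversions"] / df["sessions"]

# Surface likely problem pages: lots of traffic, weak conversion.
candidates = df.sort_values(["sessions", "conversion_rate"], ascending=[False, True])
print(candidates[["page", "sessions", "bounce_rate", "conversion_rate"]].head(10))
```

Pages that rank high here are good candidates to investigate further with heatmaps, click maps, and scroll maps.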

2. Gather qualitative data to determine why the problem occurs

Once quantitative data tells you what the problem is, you need qualitative data to tell you why it’s happening.

Direct conversations with your customers are always the best source of qualitative data. Common methods include customer interviews and surveys, as well as reviewing customer service chat logs. We like to send out post-purchase surveys or trigger a Hotjar survey after a customer has spent a certain amount of time on a product page.

3. Form your data-based hypothesis

Now that you’ve gathered both qualitative and quantitative data, you can form hypotheses to test potential solutions for the problems and pain points you identified.

For example, quantitative data told beauty brand Laura Geller that its current paid customer acquisition strategy wasn’t converting at a rate high enough to meet business goals. Using our Testing Trifecta approach, we added context to this problem by sending out customer surveys and polls.

This combination of data told us that the brand’s audience was older and had a lot of unanswered questions. So we hypothesized that a landing page with more detailed information and answers to common questions would outperform a landing page with less information. After testing different versions, the winning landing page increased conversion rates by over 43%.

5 phenomenal A/B testing examples from DTC ecommerce brands

These five case studies showcase the important role that data plays in A/B testing. A word of warning: these tests may not work for your brand. Don’t copy and run them as-is, since that would go against the entire Testing Trifecta approach. Use them as inspiration to understand the importance and power of testing, but develop your own test ideas using the method described earlier to get the best results.

1. MoonPod boosted revenue by approximately $3.3M annually

MoonPod’s flagship product is its anti-anxiety bean bag chair, and part of its product promotion strategy involves paid ads that lead to custom landing pages. The company sought to improve its landing page performance and conversion rate, so it partnered with SplitBase. 

After digging through Google Analytics, surveying existing customers and on-site visitors, analyzing heatmaps, scrollmaps, and usability tests, and checking paid ad performance, SplitBase landed on three testing priorities for MoonPod:

  1. Increase average order value (AOV) by surfacing upsells, cross-sells, and bundles during checkout: We tested variations of bundles, upsells, and cross-sells with MoonPod’s primary product and accessories.
  2. Address customer questions on product pages: Surveys identified customers’ biggest questions about MoonPod’s products, and we aimed to improve the user experience by answering those questions on product pages.
  3. Customize landing page headlines based on customer pain points: Understanding what motivates customers to buy helped us test multiple types of messaging.

This resulted in higher conversion rates and AOV while reducing MoonPod’s customer acquisition costs. When combined, these results added an estimated $3.3M in additional annual revenue.

2. Dressipi boosted CTA clicks by 124%

UK clothes-shopping service Dressipi used customer data to test a new headline and CTA. (Source)

Sometimes you need to make more than one change to see results. That’s what happened when UK fashion service Dressipi A/B tested variations of its landing page’s hero section, which included a headline and a CTA button.

Dressipi spiced up its standard headline with one that employed the casual language its customers used. But conversion numbers didn’t budge. 

So the team went back to step one and hypothesized another variation to test: a version with a new CTA button that better matched the “risky,” colloquial tone found in the new headline. This new CTA, combined with the reworked headline, saw 124% more clicks.

3. SiO Beauty increased subscription revenue by 16.5% with an upgrade-to-subscription feature

Beauty brand SiO increased subscriptions by adding an “upgrade to subscription & save 10%” button in their side-cart. (Source)

For many DTC brands, increasing subscriptions is an ever-important goal, and SiO Beauty was no exception. Using ReBuy, the brand tested adding a new button to its side cart that allowed customers to upgrade single products into a recurring order. 

This led to a 16.5% increase in subscription revenue and is a good reminder that side-cart features such as product recommendations, upsells, and cross-sells are great elements to test.

4. SmartWool increased revenue by 17.1% with a collection page test

Ecommerce brands and their marketing and design teams love to innovate and try new designs that are perceived as “better.” But without testing, how do we know whether they truly are?

In this test, SmartWool used a nontraditional collection page layout that featured four products next to one featured product. Whether this layout actually outperformed a traditional collection page format was an open question, so it was tested.

Blue Acorn tested different collection page layouts for SmartWool. (Source)

After testing with over 25,000 visitors and reaching 95% statistical confidence, the traditional collection page (Variation B) won, increasing revenue per user by 17.1% over the control, according to the CXL blog.

The lesson? Just because a design is perceived as better, more modern, or more innovative doesn’t mean it will generate more revenue. 
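
As an aside, if you’re wondering what “reaching 95% statistical confidence” looks like in practice, here’s a minimal sketch of a two-proportion z-test using statsmodels. The visitor and conversion counts are made up for illustration; SmartWool’s raw numbers aren’t public:

```python
from statsmodels.stats.proportion import proportions_ztest

# Illustrative counts only; the real SmartWool numbers aren't public.
conversions = [410, 480]     # control, Variation B
visitors = [12_500, 12_500]  # roughly 25k visitors split evenly

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A p-value below 0.05 corresponds to the 95% confidence threshold cited above.
```

Note that a revenue-per-user comparison like the 17.1% lift here is usually evaluated with a t-test (or a nonparametric test) on per-visitor revenue rather than a proportion test, but the confidence-threshold logic is the same.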

5. Hyperice increased conversion rate by 6.2%

This test was part of a full ecommerce Conversion Optimization Program conducted by SplitBase for Hyperice, which led to over $934,000 in additional revenue per month through experimentation.

Hyperice, the leader in high-performance wellness and recovery technology, was looking to increase conversion rates through conversion optimization. Its goal was to do so by making its website more useful to customers and removing any potential sources of confusion in the funnel.

Following the Testing Trifecta approach, qualitative research (mainly website polls) revealed that users had difficulty understanding the differences between the Hypervolt models.

Session recordings and analytics showed that visitors interested in the Hypervolt would bounce from product page to product page, comparing the different models before making a decision.

In order to reduce friction and make it easier for users to find the model that’s best for their needs, we hypothesized that adding a product comparison feature on collection pages would improve conversions.

After three full weeks of testing and reaching the required sample sizes, the variant with the product comparison feature increased the conversion rate by 6.2%.
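
“Reaching the required sample sizes” is worth unpacking: before launching a test like this, you would typically run a power calculation to estimate how many visitors each variant needs. Here’s a minimal sketch using statsmodels; the baseline conversion rate and target lift are hypothetical, not Hyperice’s actual figures:

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Hypothetical numbers for illustration: a 3% baseline conversion rate
# and a hoped-for lift to roughly 3.2% (about a 6% relative improvement).
baseline = 0.030
target = 0.032

effect_size = proportion_effectsize(target, baseline)  # Cohen's h
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,   # 5% false-positive rate, i.e. 95% confidence
    power=0.8,    # 80% chance of detecting the lift if it's real
    alternative="two-sided",
)
print(f"Visitors needed per variant: {int(round(n_per_variant))}")
```

Running the test in full-week increments, as Hyperice did, also helps smooth out day-of-week effects while those sample sizes accumulate.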

Leveraging A/B test results for long-term gains

A/B testing isn’t a one-and-done strategy. Successful companies aim for continuous testing while scaling their resources to match. You can see the impact of this relentless approach when you look at the top 10,000 websites by traffic: 33% of them run A/B tests.

On the other hand, less than 0.6% of all active websites run A/B tests. Why would you pass up this prime opportunity to optimize your Shopify site or ecommerce store, gain more conversions, and delight your target audience all at once?

Optimizely found that the top 5% of tests are responsible for 50% of the impact on metrics that positively affect business growth. This includes improving your customer experience, which directly correlates with increases in metrics like customer lifetime value and retention rate.

Start your A/B testing journey with SplitBase

If your ecommerce brand relies on landing pages, A/B testing is a must. Even the best-performing landing page can improve—so you should view each version of your landing page as an experiment.

If you’re not sure where to start or lack the resources to A/B test on your own, SplitBase can help. We specialize in full-site A/B testing for ecommerce brands, so we can not only help you decide what to test but can also interpret your results and implement winning ideas.

Ready to see how A/B testing can drive revenue for your business? Request your free proposal today.