Shopify A/B Testing: A Step-by-Step Guide & Best Practices

A/B testing is an essential component of a conversion rate optimization (CRO) strategy. These experiments provide insights into user behavior and help validate website changes, enabling direct-to-consumer brands to keep up with evolving customer preferences and demands.

On average, ecommerce brands perform between 24 and 60 A/B tests every year. But how do you determine which are the best tests to run? And how can you make sure they’re effective?

This guide will explore how to implement A/B testing and what to test for your Shopify store. We’ll also provide five tips to help you get the most out of your A/B testing strategy. 

What is A/B testing? 

A/B testing—also known as split testing—compares two or more versions of a web page or landing page. At minimum, you’d create a control version (Version A) and a variant version (Version B). You’d then show these variants to two randomized groups of customers simultaneously to determine which one works best and make website optimizations accordingly. 

When it’s part of a comprehensive CRO strategy, A/B testing helps identify and validate site changes that have the following benefits: 

  • Increased conversions: CRO aims to get more of your ideal customers to follow through with making a purchase when they visit your online store.
  • Boosted average order value (AOV): You can test different versions of product bundles, promotions, and pricing strategies to get shoppers to spend more per order.
  • Reduced bounce rates: Testing different elements is one step in ensuring that your website, landing pages included, grabs shoppers’ attention and aligns with their needs.
  • Enhanced customer experience: A/B testing can help you identify aspects of your current user experience that cause frustration or confusion, or that could simply be more enjoyable or memorable. You can then use your test results to deliver better experiences. 

A/B testing is not the only method for determining the impact of website changes, though. 

A/B testing vs. multivariate testing 

Though less common, and practical only for bigger brands with substantial traffic, multivariate testing (MVT) is an alternative to split testing. 

With A/B testing, you can test two or more versions of a web page or a specific element on a page to understand user behaviors. By extension, you can develop hypotheses about how to influence those behaviors to, for example, increase average order value or boost conversion rate. These tests can get you insights faster than their multivariate counterparts. And, because they typically test significant website changes, they often produce more impactful results. 

Multivariate testing, by contrast, compares combinations of changes to multiple page elements to see which combination performs best. For example, you might want to test combinations of different headlines, subheadlines, hero images, and buttons in the hero section of your site. 

These two types of testing complement each other. A/B testing helps brands gather intel on their customers and develop overarching hypotheses, while MVT can help you to further optimize and find the right combination of elements. 

It must be said, however, that MVT isn’t necessary for most brands doing minimal A/B testing. And for statistical reasons, it requires a high volume of traffic, so it’s not worthwhile for lower-traffic ecommerce sites. 

For instance, while a simple A/B test might only have two variants, an MVT test needs a variant for every possible combination of changes. Say you have two versions each of a headline, a subheadline, and a button: that’s 2 × 2 × 2, or the equivalent of eight A/B test variants. You’d need a high volume of traffic to get enough eyes on each variant to produce accurate results. 

How to A/B test your Shopify store

With A/B testing, brands can test nearly anything—header copy, layout, graphics, buttons, offers, shipping price, or entirely new elements. With so many options, it can be challenging to know where to start. Here’s a step-by-step guide to conducting A/B tests.

1. Conduct quantitative research

You may already have an idea of what you want to test. For example, you might have a landing page with a high bounce rate and think tweaking the design could reduce it. While this may be true, relying on intuition can waste time and resources. By starting with research, you can create a data-driven strategy for testing. 

At SplitBase, we use our Testing Trifecta framework to identify opportunities to improve conversion rates and other metrics. Our approach starts by examining your current sales conversion funnels. This includes digging into Google Analytics, clickmaps, heatmaps, and scrollmaps to understand the customer journey. Where do they fall off? 

Image description: SplitBase’s Testing Trifecta methodology

This quantitative research reveals what the problem is. 

2. Conduct qualitative research

Qualitative research reveals the root cause of problem areas on your site. Here, brands conduct audience research to identify the desires and pain points affecting conversion. 

There are a variety of methods to gather information about your customers, such as customer interviews, surveys, usability testing, and session recordings. Reviewing customer service chat logs is also helpful to understand customer obstacles, and Shopify stores can capture great qualitative feedback through post-purchase surveys, which can be automated using tools like KnoCommerce or Fairing.

3. Form a hypothesis

By conducting research, brands identify areas for improvement and potential solutions. Next, it’s time to put those solutions to the test.

Start by using your research to inform your hypothesis. For example, we partnered with Dr. Squatch, a personal care and organic soap brand, to conduct a site-wide audit. During our research, we noticed many customers purchased several soaps at one time. However, the product page did not have a quantity field. Instead, shoppers had to click “add-to-cart” multiple times. We hypothesized that adding a quantity field would increase quantities purchased and, therefore, boost average order value.

Image description: Adding quantities to Dr. Squatch’s product pages. 

4. Design your test

A/B testing requires careful research and planning to ensure that the results are statistically significant. (A result is statistically significant when it is unlikely to be due to random chance alone.) Watch for confounding factors, too: a press mention or unexpected news event during your testing window may skew your results. 

A/B testing is similar to a science experiment. Brands must form a hypothesis, create control and variant samples, and set testing parameters. Here are some elements to focus on when designing your test: 

  • The sample size: Start by identifying the right sample size to ensure your results are statistically significant. Many A/B testing tools will do this for you. Optimizely’s Sample Size Calculator or CXL’s Pre-Test Calculator are useful resources. 
  • Number of variations: Your sample size will affect the number of variations you can test. For example, you’d need a larger sample size to test five variations of a landing page than you would if you planned to test only two variants. 
  • The duration of the test: The length of tests will vary based on your sample size and number of variations. Remember to always conduct tests in full-week increments and test over at least two business cycles. 

Without the appropriate sample size, the right number of variations, or a long-enough test duration, your results will be unreliable. Needless to say, making optimization decisions based on misleading data is more likely to hurt your conversion rate than help it. 
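The calculators mentioned above do this work for you, but for a sense of the underlying math, here is a rough sketch of a standard two-proportion power calculation. The function name and example rates are illustrative, not taken from any specific tool:

```python
from math import sqrt, ceil
from statistics import NormalDist

def sample_size_per_variant(baseline_rate, min_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a relative
    lift over the baseline conversion rate (two-sided z-test)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # e.g., 1.96 for alpha=0.05
    z_beta = NormalDist().inv_cdf(power)           # e.g., 0.84 for 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# A 3% baseline conversion rate and a hoped-for 10% relative lift
# require tens of thousands of visitors per variant:
n = sample_size_per_variant(0.03, 0.10)
```

Note how sensitive the number is to the effect you want to detect: halving the expected lift roughly quadruples the required sample, which is why smaller stores should focus on fewer, bolder variations.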

5. Choose an A/B testing tool 

Numerous A/B testing tools are available, including Convert, Optimizely, and VWO. These tools simplify creating, running, and analyzing A/B tests without advanced coding skills. Shopify stores looking to test things like shipping thresholds or subscription offers will need specialized apps such as Shipscout and Rebuy.

For tests that involve more than changing text, avoid the visual editors provided by A/B testing tools: they auto-generate code that often causes significant browser compatibility and code issues. Always have developers code and validate your tests. This ensures your tests work across devices and browsers, producing accurate results. 

6. Run your A/B test

When you’re ready to run your test, split your audience into two or more randomized groups (A/B testing software automates this process). Group A will see the original version, and Group B will see the modified version. Make sure all key metrics are tracked, and collect data for each group.
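Most A/B testing tools handle this randomization for you, but under the hood the split is often deterministic, so a returning visitor keeps seeing the same variant. A minimal sketch of that idea, with a hypothetical function name and IDs:

```python
import hashlib

def assign_variant(visitor_id, experiment_id, variants=("A", "B")):
    """Deterministically bucket a visitor so they always see the
    same variant for a given experiment (hash-based assignment)."""
    key = f"{experiment_id}:{visitor_id}".encode()
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % len(variants)
    return variants[bucket]

# The same visitor always lands in the same group for this experiment,
# while the population as a whole splits roughly evenly:
group = assign_variant("visitor-123", "free-shipping-test")
```

Keying the hash on both the visitor and the experiment matters: it keeps each visitor’s experience consistent within a test while ensuring different experiments split the audience independently.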

7. Analyze the results

Once your test is complete, study the results. Look at the metrics that you pre-determined were key to your experiment. These could include conversion rates, bounce rates, click-through rates, or average order value to name a handful. Was your hypothesis correct? If not, what else did you learn? Most A/B testing tools will have analytics and reporting features that will help you answer these questions.
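If you want to double-check a reported result, a pooled two-proportion z-test is one standard way to estimate whether the difference between control and variant is statistically significant. A sketch with made-up example numbers:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value comparing the conversion rates of control (A)
    and variant (B) using a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical results: 300/10,000 conversions for A vs. 360/10,000 for B
p = two_proportion_p_value(300, 10_000, 360, 10_000)
significant = p < 0.05  # conventional 5% threshold
```

A small p-value only tells you the difference is unlikely to be noise; whether the lift is big enough to act on is a separate, business-level judgment.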

8. Iterate and test again

A/B testing is an ongoing process. One experiment can lead to new questions and tests, so use your learnings to tweak and test different elements and maximize the effectiveness of your Shopify store. 

What should you A/B test?

With A/B testing, brands can evaluate a variety of elements, such as header copy, website layout, graphics, buttons, colors, or the checkout process. Here, we’ll explore how to perform specific A/B tests for your Shopify store.

How to test shipping thresholds

Shipping thresholds are the minimum purchase amounts that customers must meet to qualify for free or discounted shipping. According to a study by Inmar Intelligence, 78% of shoppers prefer to buy more to receive free shipping. Adjusting shipping thresholds can reduce cart abandonment rates and increase AOV. 

Your shipping threshold will vary by your industry, product type, margins, and shipping costs. Start by looking at your average order value and setting a minimum purchase amount just above it. 

Use A/B testing to evaluate how your shipping thresholds will perform. For example, half your visitors could receive free shipping on all orders while the other half receive free shipping only on orders over $75. 

How to test upsells and cross-sells

Upselling and cross-selling are also valuable tools to increase AOV. Offering product bundling or subscriptions can motivate customers to buy more. For example, Curlsmith gives customers the option of a one-time purchase or a subscription at a discounted rate.

Image source: Curlsmith

Offering different quantities can also increase AOV. In the case of Dr. Squatch, we hypothesized that if the product page defaulted to two soaps, it would increase the revenue per user. Our A/B tests showed an increase of up to 54% in revenue per user with the proposed change.  

Image description: Dr. Squatch’s product pages default to two bars of soap. 

You can also A/B test where you cross-sell and upsell. Some places to consider adding cross-selling or upselling opportunities include: 

  • Product pages: Include “Frequently Purchased Together” or “Customer Also Bought” sections on your product pages. 
  • Cart page: For example, a fashion brand could use a “Complete the Look” section on its cart page to recommend related products. 

Image description: Everlane provides a carousel of product suggestions in their cart pages.

  • Checkout pages: Provide opportunities for customers to add relevant products to their order before finalizing the purchase. 
  • Thank you pages: Use order confirmation pages to provide special offers and suggest complementary products. 

Many brands use a mix of the above, which can provide some great inspiration for your own upsell and cross-sell flows. But, of course, testing is essential to understand what your target customers respond best to. 

How to test product recommendations

Customers can sometimes be overwhelmed by the number of options on a website. Personalized product recommendations combat this decision fatigue, motivating customers to buy. 

A/B testing helps brands determine the best way to suggest products to visitors, whether through messaging, product placement, product recommendation quizzes, or AI-powered chat. 

Image description: CurlSmith uses a pop-up to guide users toward a product quiz.

Image description: Haircare brand Amika includes an AI-powered chat on its homepage to suggest relevant product recommendations. 

How to test product detail pages (PDPs) 

Product detail pages include a product’s description, specifications, color options, sizing options, images, price, customer reviews, and so on. These pages should help customers understand a product’s value and guide them to the next stage of the shopping process, providing clear calls to action (CTAs) that motivate customers to purchase. 

A/B testing PDP layout, copy, or other elements is key for helping you optimize these pages to drive more conversions. 

For example, SplitBase leveraged PDP testing to help hair extension brand INH increase conversions by 26%. Using our Testing Trifecta process—the foundation of which is deep customer research—we found that shoppers were confused about how to use the brand’s products. They weren’t reading the product descriptions on INH’s product pages. 

After testing various approaches to product demonstrations, including videos and images, we found that a combination of three GIFs worked best to boost conversion and return on ad spend (ROAS). 

Image description: Including a product demonstration in INH’s product detail pages. 

But this example is not to say that those three GIFs on the product page are a universal solution for brands who have the same problem INH did. 

Ultimately, the brands that have the greatest success with testing and optimization develop test ideas by researching what does and doesn’t work for their specific brands and customers. Our Testing Trifecta, in particular, combines both quantitative and qualitative research to reveal what’s hindering conversions and why. Armed with that info, you can then develop strong hypotheses to address those elements.

A/B testing tips and best practices

A/B testing is deceptively simple. But if A/B tests yield unreliable data, you could make changes to your Shopify store that don’t align with customer preferences—threatening loyalty and revenue. Here are five best practices to follow when conducting A/B testing.

  1. Maintain a balanced approach: It can be tempting to just focus on tests that you think will generate dramatic results (e.g., redesigning your home page). But smaller tests, such as testing how you display your shipping fees, can be just as important and inform your overall messaging strategy. When prioritizing A/B tests, include a mix of large and small experiments.
  2. Get the timing right: At SplitBase, we recommend running tests for at least three to four weeks, even if you reach the suggested sample size. Brands should see at least 100 conversions per variation, and test full weeks at a time, as performance can vary depending on the day of the week.
  3. Keep your A/B tests focused: Stick to changes that directly relate to your hypothesis or a single problem you’re looking to solve. To illustrate, say you find that customers don't buy because of a lack of trust. If you were to change your entire page layout, you wouldn’t be able to pinpoint what tweaks increased or decreased that trust. On the flip side, only changing elements that could help build trust would allow you to see more clearly what works and what doesn’t.
  4. Segment your audience: When analyzing data, segment your audience based on different criteria, such as demographics, location, or behavior (e.g., new or returning visitors). This will enable you to understand how different audiences respond to your changes.
  5. Track your A/B tests: Record each test, including the hypothesis, control group, variation group, results, and insights. Tracking your A/B tests ensures you don’t unknowingly retest the same thing and helps you refine your approach for future tests.

All of the above can make or break your conversion rate optimization strategy. 

Optimize your Shopify store

A/B testing is a powerful tool that helps ecommerce brands keep up with evolving customer preferences, improve the customer experience, increase ROI, and drive revenue. But creating a thoughtful A/B testing strategy is critical. That’s where SplitBase comes in. 

SplitBase is the leading conversion optimization and landing page agency for ecommerce brands. We provide full-site A/B testing for your Shopify store, boosting customer acquisition, AOV, and conversions. Get a free ecommerce CRO proposal today to see how SplitBase can help optimize your Shopify store.