Checkout A/B Testing

A/B testing (also known as split testing) lets you compare two versions of a checkout widget to determine which performs better. Each version is shown to a portion of your checkout traffic, and performance is tracked separately so you can identify the higher-performing version. By testing variations of your offers, you can improve conversion rates, increase revenue, and make data-driven decisions about your upsell strategy. A/B testing is currently available for standalone widgets only; it is not yet supported for pages.
This page covers A/B testing for checkout widgets only. If you are looking to test your post-purchase offers, see A/B Testing for Post-Purchase Offers.
Checkout widgets are only available to Shopify Plus merchants due to Shopify’s Checkout Extensibility API restrictions.

How checkout A/B testing works

Standalone checkout widgets support two versions within a single block. When both versions are active, traffic is split 50/50 between them, labeled Version A and Version B in the analytics view. Each version is shown to a separate portion of customers, and their performance is tracked independently so you can compare results and identify a winner. Each version has its own configuration and settings, making it possible to test meaningful differences between offers.
A/B testing is currently only available for standalone widgets. Pages do not support A/B testing yet.

What you can test

A/B testing is supported on both upsell widgets and content widgets.
Upsell widgets - You can test:
  • Products: Different products, product types, or recommendation strategies (specific product vs. AI recommendations vs. most expensive product)
  • Discounts: Different discount amounts or types (percentage vs. fixed dollar)
  • Widget mode: Single product upsell, multi-product upsell, or checkmark upsell
  • Triggers: Since standalone widget triggers are configured at the version level, you can test different targeting conditions against each other
  • Blank (Skip): Use an empty version as a control to measure whether showing an upsell helps or hurts checkout conversion
Content widgets (trust badges, testimonials, cart controls, notes, text, images) - You can test:
  • Different badge or testimonial content
  • Different messaging or copy
  • Showing vs. not showing a content widget (using Blank/Skip as a control)

Test types

When adding a new version to a standalone widget, you choose one of three test types:

Duplicate content

Duplicate content creates a new version that is a copy of an existing version. Use this when you want to test a small change - such as a different discount amount or a different product - while keeping everything else the same. Because the starting point is identical, any performance difference is more likely to be caused by the single change you made. This is the most common starting point for A/B tests because it minimizes variables and makes results easier to interpret.

New content

New content creates a blank version that you configure from scratch. Use this when you want to test a fundamentally different offer - such as a different widget mode, a completely different product category, or a different trigger strategy. New content versions give you full flexibility but require more setup time. This type is best suited for broader tests where you want to compare two distinct approaches rather than a single incremental change.

Blank (Skip)

Blank (Skip) creates an empty version that shows nothing to the customers it is assigned to. Use this as a control group to measure the baseline - what happens when no widget is shown at all. Comparing a Blank version against an active offer helps you understand whether your widget is helping or hurting checkout conversion. This is especially useful if you are concerned that showing a widget might increase checkout abandonment for certain customer segments.

Setting up a checkout A/B test

Step 1: Open or create a standalone widget

  1. Open the Aftersell app from your Shopify Admin
  2. Navigate to Checkout in the left sidebar
  3. Scroll to the Standalone widgets section
  4. Either open an existing standalone widget by clicking the pencil (edit) icon, or click Create widget to create a new one

Step 2: Add a version

  1. Inside the widget editor, add a new version
  2. Choose one of the three test types: Duplicate content, New content, or Blank (Skip)
  3. Configure the new version - set the product, discount, widget mode, and any other settings you want to test

Step 3: Configure triggers for each version

  1. Open each version and set its triggers
  2. Triggers are configured at the version level for standalone widgets, so each version can target the same or different customer conditions
  3. For a fair A/B test, set both versions to the same trigger conditions so they compete for the same audience
For a full list of available trigger options, see Checkout Triggers.

Step 4: Enable both versions

Ensure both versions are toggled to active. Inactive versions do not receive traffic - both must be active for the 50/50 split to take effect.

Step 5: Verify placement in the Shopify Checkout editor

  1. Open the Shopify Checkout editor from your Shopify Admin
  2. Confirm the app block for this standalone widget is placed in your desired location
The widget will not display to customers until its app block has been placed.
Your A/B test is now live.

How traffic is split

Checkout A/B tests use a 50/50 split between two active versions (Version A and Version B). Traffic is evenly distributed between the versions automatically and cannot be manually adjusted. If one version is deactivated, all traffic goes to the remaining active version. To run a valid A/B test, both versions must be active at the same time.
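
To build intuition for how a stable 50/50 split can work, here is a minimal sketch of hash-based bucketing. This is illustrative only: Aftersell does not document its assignment mechanism, and `visitor_id`, `widget_id`, and `assign_version` are hypothetical names, not part of the product.

```python
import hashlib

def assign_version(visitor_id: str, widget_id: str) -> str:
    """Deterministically assign a visitor to Version A or Version B.

    Illustrative sketch only (not Aftersell's actual mechanism). Hashing a
    stable visitor identifier gives a roughly even 50/50 split while keeping
    each visitor's assigned version consistent across checkout sessions.
    """
    digest = hashlib.sha256(f"{widget_id}:{visitor_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"
```

Under this scheme the same visitor always sees the same version, and across many visitors the assignments converge toward an even split.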

Managing your test

Once your A/B test is running, you can manage it at any time from within the widget editor.

Pausing a version

Toggle a version to inactive to pause it. When a version is inactive, it no longer receives traffic. The remaining active version will receive all traffic. Use this if you need to temporarily stop showing a specific version without deleting it.

Editing a version

Open the version and make your changes directly. For accurate results, avoid making significant changes to a version while the test is running, as this can make it harder to interpret what drove any performance differences.

Deleting a version

Delete a version to permanently remove it from the widget. Once deleted, the version and its data cannot be recovered. All traffic will go to the remaining active version.

Selecting a winner

When you are ready to end the test and commit to one version:
  1. Identify the higher-performing version based on its analytics
  2. Deactivate or delete the losing version
  3. The winning version will now receive 100% of traffic
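
"Higher-performing" depends on sample size as well as raw rates: a small gap on few visitors may be noise. If you export each version's visitor and conversion counts, a standard two-proportion z-test can help you judge whether the difference is real. This is a hedged sketch using textbook statistics; the function name and inputs are assumptions, not an Aftersell feature.

```python
from math import erf, sqrt

def compare_versions(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided two-proportion z-test on conversion counts.

    Illustrative helper, not part of Aftersell: pass each version's
    conversions and visitor counts. Returns (z, p_value); a small p_value
    (e.g. < 0.05) suggests the difference is unlikely to be chance.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

For example, 50 conversions from 1,000 visitors on Version A versus 80 from 1,000 on Version B yields a p-value well under 0.05, so deactivating the loser would be statistically defensible.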

Understanding your A/B test results

When an A/B test is running, you can view Version A and Version B performance by clicking the bar chart icon next to the widget on the Checkout dashboard. The metrics shown depend on the widget type - upsell widgets and content widgets show different metrics. For a complete breakdown of all widget-level metrics, per-version analytics, and the Analytics tab Checkout graphs, see Checkout Analytics.

Limitations

A/B testing is limited to the same widget type

A/B testing in Aftersell compares two versions of the same widget type. For example, you can test two different trust badge configurations against each other, or test an upsell offer against Blank (Skip). You cannot A/B test across different widget types - for example, you cannot test a trust badge widget against an image widget in the same A/B test.

A/B testing is not available for pages

A/B testing is currently only available for standalone widgets. Pages do not support A/B testing yet.

Placement testing is not supported

You cannot A/B test different widget placements (for example, testing whether an upsell performs better in the order summary vs. above the payment section). A/B testing compares the content within a single widget in a fixed placement.

Best practices

Test one element at a time

When you change multiple elements at once, it becomes difficult to know which change drove the result. Isolate one variable per test for clear, actionable insights.
  • Good: Test 10% discount vs. 20% discount (one variable changed)
  • Avoid: Test 10% discount + Product A vs. 20% discount + Product B (two variables changed at once)

Use Blank (Skip) as a control

Running a Blank (Skip) version against an active widget is one of the most valuable tests you can run. It tells you whether showing the widget at all is helping or hurting your checkout conversion rate. If the Blank version performs better on checkout completion, it may be worth reconsidering the offer entirely or adjusting your targeting.

Monitor checkout abandonment

A higher upsell conversion rate is good, but make sure it is not coming at the cost of increased checkout abandonment. Check the Checkout abandonment rate in Analytics > Checkout tab regularly while running A/B tests. For content widgets, the per-version analytics view also includes a Checkout abandonment rate metric.
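
As a rough guardrail, you can compute each version's abandonment rate from your exported session counts and flag the test when the gap between versions gets large. A minimal sketch, assuming you track abandoned and total checkout sessions per version; the names and the 2-percentage-point default threshold are arbitrary assumptions, not Aftersell behavior.

```python
def abandonment_delta(abandoned_a: int, sessions_a: int,
                      abandoned_b: int, sessions_b: int,
                      threshold: float = 0.02) -> dict:
    """Compare checkout abandonment rates between two versions.

    Illustrative guardrail only; tune `threshold` to your own tolerance.
    """
    rate_a = abandoned_a / sessions_a
    rate_b = abandoned_b / sessions_b
    delta = rate_b - rate_a
    return {
        "rate_a": rate_a,
        "rate_b": rate_b,
        "delta": delta,
        # A large gap is worth investigating before declaring a winner
        "flag": abs(delta) > threshold,
    }
```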

Common test ideas

Upsell widget tests
  • Specific product vs. AI recommendations vs. most expensive product
  • Complementary product vs. same product (replenishment)
  • 10% off vs. 20% off
  • Percentage discount vs. fixed dollar amount
  • Discount vs. no discount
  • Single product upsell vs. checkmark upsell
  • Multi-product upsell vs. single product upsell
  • Active offer vs. Blank (Skip) to measure the net impact of showing an upsell
Content widget tests
  • Trust badge with vs. without specific badges
  • Different testimonial content
  • Different note or text messaging
  • Active content vs. Blank (Skip) to measure the impact of showing the content widget

Need help?

If you have questions about setting up or analyzing your checkout A/B tests, chat with our support team using the live chat at the bottom right of the app.