Post-Purchase A/B Testing
A/B testing (also known as split testing) allows you to compare up to five versions of a post-purchase offer within a single funnel to determine which performs better. Each version is called a layout (Layout A, Layout B, Layout C, Layout D, Layout E). Traffic is automatically split evenly across all active layouts, and performance is tracked separately so you can identify the higher-performing version. By testing variations of your offers, you can improve conversion rates, increase revenue, and make data-driven decisions about your upsell strategy.

A/B testing lets you manually configure each layout as a complete offer and test them against each other. If you want to define individual variables (like discount type, timer duration, or product type) and have Aftersell combine them into all possible layout combinations for testing, use Multivariate Testing instead.
A/B testing is only available for post-purchase (one-click) upsells. A/B testing for Thank You page offers is not currently available. For checkout widget A/B testing, see Checkout A/B Testing.
What you can test
You can test nearly any element of your post-purchase offer, including:
- Products: Different products, single product vs. multi-product offers, or AI recommendations vs. specific products
- Discounts: Different discount amounts or types (percentage vs. fixed dollar)
- Copy: Headlines, descriptions, and call-to-action text
- Images: Product images or lifestyle photos
- Urgency elements: Countdown timers with different durations
- Quantities: Default quantity of 1 vs. 2 (or more)
- Offer design: Different layouts or visual styles
How to set up an A/B test
Step 1: Create or open a funnel
- Open the Aftersell app from your Shopify Admin
- Navigate to Post-Purchase Funnels in the left sidebar
- Either create a new funnel or open an existing funnel you want to test
Step 2: Create your test
- In your funnel, click Create Test
- Select A/B Test from the options
Step 3: Set up your layouts
Configure each layout as a complete offer. You can create up to 5 layouts (Layout A through Layout E). For each layout, you can customize:
- Product selection
- Discount amount and type
- Offer copy and messaging
- Images and visual elements
- Countdown timer settings
- Default quantity
- Any other offer settings
Step 4: Start the test
- Review all layouts to ensure they’re configured correctly
- Click Start Test to begin splitting traffic across layouts
- Ensure your funnel is enabled and published
How traffic is split
Traffic is automatically distributed evenly across all layouts in the test. There is no way to manually adjust the traffic percentage for each layout. For example:
- If you have 2 layouts, traffic will split 50% / 50%
- If you have 3 layouts, traffic will split approximately 33% / 33% / 33%
- If you have 4 layouts, traffic will split 25% each
- If you have 5 layouts, traffic will split 20% each
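Aftersell's internal assignment logic isn't public, but the even split described above behaves like uniform random assignment. A minimal Python sketch, purely illustrative:

```python
import random

# Illustrative only: each visitor is assigned one active layout
# uniformly at random, so each layout receives ~1/n of traffic.
def assign_layout(active_layouts):
    """Pick one layout uniformly at random."""
    return random.choice(active_layouts)

layouts = ["Layout A", "Layout B", "Layout C"]
counts = {name: 0 for name in layouts}
for _ in range(30000):
    counts[assign_layout(layouts)] += 1

# With 3 layouts, each share lands near 33%.
for name, n in counts.items():
    print(name, round(100 * n / 30000), "%")
```

With three layouts, each count converges toward one third of total impressions, matching the approximate 33% / 33% / 33% split above.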
Managing your test
Once your A/B test is live, its status will show as In progress. From here, you can pause, edit, reset, delete, or select a winner.
Test statuses
- Not started: The test has been created but is not active. Traffic is not being split. If you see 100% of traffic going to one layout, click Start test.
- In progress: Traffic is being split evenly across all layouts.
- Paused: Traffic splitting has stopped.
- Finished: The test has completed. Results are available in the Analytics > Tests tab.
- Not Running: The test has been created and configured but is not currently active. This may appear if the test was paused or stopped.
Pause the test
Click Pause to temporarily stop traffic splitting. When a test is paused:
- Traffic is no longer split between layouts
- Only the first layout created, Layout A, will be shown
- You cannot choose which layout displays while paused
Edit the test
Use the three-dot menu to select Edit test if you need to adjust products, pricing, layout, or messaging. For accurate results, avoid making major changes while a test is running.
Reset analytics
Select Reset analytics to clear current test data and restart tracking from zero. This resets only the A/B test data. Historical lifetime offer data is not deleted.
Delete the test
Select Delete test to permanently remove the A/B test. If deleted:
- Traffic will no longer be split
- The offer returns to normal behavior
- Lifetime offer analytics become visible again
Select a winner
When you are ready to end the test:
- Click Select winner
- Choose the layout to keep
- Confirm
How A/B test analytics work
Once your test is running, you can track performance in the Analytics section of your Aftersell dashboard under the Tests tab. For a complete breakdown of all Tests tab metrics, including group-level analytics, Performance by Variable, the All Groups table, and how to select a test, see Post-Purchase Analytics.
Limitations
A/B testing is within a single funnel only
A/B testing compares layouts within the same funnel. You cannot A/B test one funnel against another funnel.
A/B testing is for post-purchase (one-click) upsells only
A/B testing is not currently available for Thank You page offers. It is only supported for post-purchase one-click upsell funnels.
Custom traffic percentages are not supported
Traffic is always split evenly across all active layouts. There is no option to assign custom percentages to individual layouts.
Best practices for A/B testing
Run tests long enough
- Minimum duration: 2-4 weeks
- Minimum impressions: At least 100-200 impressions per layout (more is better)
- Shorter tests may not produce statistically reliable results, especially with low traffic.
Test one element at a time
When you change multiple elements simultaneously, you won’t know which change drove the results. Test one variable at a time for clear insights:
- Good: Test 10% discount vs. 20% discount (one variable)
- Avoid: Test 10% discount + Product A vs. 20% discount + Product B (two variables)
Look for consistent performance
Don’t make decisions based on a single metric. A winning layout should perform well across multiple metrics:
- Higher conversion rate
- Higher revenue per visit
- Comparable or better average upsell value
Consider statistical significance
Before declaring a winner, ensure your results are statistically significant. Look for:
- Clear performance differences (not just 1-2% variations)
- Consistent trends over time
- Sufficient sample size (impressions)
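Aftersell does not expose a built-in significance calculator, but you can sanity-check conversion differences yourself with a standard two-proportion z-test. A minimal sketch using only the Python standard library (the impression and conversion numbers are hypothetical):

```python
import math

# Illustrative, not an Aftersell feature: two-proportion z-test comparing
# conversion rates of two layouts from their impressions and conversions.
def two_proportion_z(conv_a, imp_a, conv_b, imp_b):
    p_a, p_b = conv_a / imp_a, conv_b / imp_b
    pooled = (conv_a + conv_b) / (imp_a + imp_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imp_a + 1 / imp_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: Layout A converts 30 of 1,000 impressions,
# Layout B converts 55 of 1,000 impressions.
z, p = two_proportion_z(30, 1000, 55, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below 0.05 is the conventional threshold for treating the difference as real rather than noise; with only 100-200 impressions per layout, only very large differences will clear it.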
Apply the winning layout
Once you’ve identified a clear winner:
- Stop the test
- Apply the winning layout to your funnel
- Monitor performance to ensure results remain consistent
- Consider running a new test to further optimize
A/B test vs. Multivariate test
Not sure which testing method to use?

| | A/B Test | Multivariate Test |
|---|---|---|
| How it works | You manually configure each layout as a complete offer and test them against each other | You define individual variables and their options, and Aftersell combines them into all possible layout combinations for testing |
| Layouts | Up to 5 layouts (Layout A through E), each manually configured | Automatically generated based on variable combinations (Layout A, B, C, etc.) |
| Best for | Comparing complete offer designs against each other | Finding the best combination of individual variables (discount, timer, product, etc.) |
| Traffic split | Evenly across all layouts | Evenly across all combinations |
| Statistical significance | Faster to reach with fewer layouts | Requires more traffic due to more combinations |
| Results | Easier to interpret - one layout wins | More complex - identifies best combination of variables |
Choose an A/B test when:
- You want to compare complete offer configurations
- You have moderate traffic levels
- You want clear, straightforward results

Choose a Multivariate test when:
- You want to test multiple variables simultaneously and find the best combination
- You have high traffic levels
- You want to define variables and have Aftersell combine them into layout combinations for testing
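The "requires more traffic" trade-off in the table comes from combinatorics: a multivariate test generates one layout per combination of variable options. A short sketch with hypothetical variables (Aftersell generates the combinations for you):

```python
from itertools import product

# Hypothetical variables and options for illustration only.
variables = {
    "discount": ["10%", "20%"],
    "timer": ["5 min", "10 min"],
    "product": ["AI pick", "Best seller"],
}

# Every combination of options becomes its own layout.
combos = list(product(*variables.values()))
print(len(combos), "layouts")  # 2 x 2 x 2 = 8 combinations
```

Three variables with two options each already produce 8 layouts, so each one receives only 1/8 of traffic, versus 1/2 per layout in a two-layout A/B test on the same funnel. That is why multivariate tests need substantially more traffic to reach significance.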
Common A/B test ideas
Need inspiration? Here are proven A/B tests to try:
Product-based tests
- Same product vs. complementary product: Test whether customers prefer to buy more of what they just purchased or try something new
- Single product vs. multi-product offer: Compare a single item offer against a multi-product offer with complementary products
- AI-powered vs. static product: Test dynamic AI recommendations against manually selected products
Discount tests
- Discount amount: Test 10% off vs. 20% off vs. 30% off
- Discount type: Compare percentage discounts (20% off) vs. fixed dollar amounts ($5 off)
- No discount vs. discount: Test whether a discount is necessary for your audience
Quantity tests
- Default quantity: Test quantity of 1 vs. quantity of 2 (especially effective for consumables)
Copy and messaging tests
- Headline variations: Test different value propositions or messaging angles
- Urgency messaging: Test with vs. without urgency language
- CTA button text: Test different call-to-action phrases
Design tests
- Long-form vs. short-form: Test detailed product descriptions against concise offers
- Image variations: Test different product images or lifestyle photos
- Timer duration: Test 5-minute vs. 10-minute vs. 15-minute countdown timers