Running the Same A/B Testing in a Different Order Boosts Your Revenue
Here we go yo, so what, so what, so what's the (A/B testing) scenario?
When Thorin McGee referenced Wu-Tang Clan in the pages of Target Marketing magazine to discuss MarTech, he inspired me to drop in a line from A Tribe Called Quest to discuss an overlooked concept in conversion rate optimization and A/B testing for direct mail, landing pages, email, and advertising – optimal testing sequence.
Marketing isn’t manufacturing. You can’t simply repeat a rote process every day and get the optimal result every time. There are endless ways you can fit different variables together to achieve a result. A scenario analysis can help find the plan with the most ROI.
Take A/B testing for example.
It’s not just the tests you choose to run, but the order in which you run them, that can have a significant impact on your ROI.
Test Planning Scenario Tool
To help you run a scenario analysis on your brand’s A/B testing, we’ve created a free Test Planning Scenario Tool. It provides a clear, easy visual way to see the factors that go into an A/B testing program. You can move the variables around into different scenarios to discover the impact on ROI.
Once you discover the optimal testing sequence, you can then display the colorful charts along with the ROI calculations to get your dev, design and marketing teams, your clients, or your business leaders bought into your approach.
And while the order might not seem to matter much – after all, we’re talking about running the exact same tests just in a different order – the results of finding the optimal testing sequence might surprise you.
There are many scenarios you could consider. For example, does running the high-effort, high-return test before the low-effort, low-return test deliver a higher or lower ROI than the opposite order?
And how does access to IT development resources affect ROI? Let me show you an example of that.
Scarce Development Resources Example
What if you have a series of three tests to run, and the tests cannot be developed simultaneously? Also, the next test can't be developed until the previous test has been implemented.
Perhaps this is because of your organization's development methodology (Agile vs. Waterfall, etc.), or simply a limit on your development resources (they likely have many other projects to work on in addition to developing your tests).
Let’s look at this scenario if you had three tests you were planning to run.
Test 1 (Low level of effort, low level of impact)
- Business impact – 10% more revenue than the control
- Build Time – 2 weeks
Test 2 (High level of effort, high level of impact)
- Business impact – 360% more revenue than the control
- Build Time – 6 weeks
Test 3 (Medium level of effort, medium level of impact)
- Business impact – 70% more revenue than the control
- Build Time – 3 weeks
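To see why the sequence matters, you can sketch the scenario in a few lines of code. This is a deliberately simplified model, not the calculation behind the Test Planning Scenario Tool: it assumes a hypothetical baseline weekly revenue (`BASELINE_WEEKLY`) and program window (`HORIZON_WEEKS`), builds the tests one at a time, and credits each test's lift from the week it ships until the end of the window. The exact dollar figures it prints are illustrative only and won't match the article's numbers.

```python
from itertools import permutations

# The three tests from the example: (name, revenue lift vs. control, build weeks)
TESTS = [("Test 1", 0.10, 2), ("Test 2", 3.60, 6), ("Test 3", 0.70, 3)]

BASELINE_WEEKLY = 10_000  # assumed baseline weekly revenue (hypothetical)
HORIZON_WEEKS = 26        # assumed length of the measurement window (hypothetical)

def incremental_revenue(order):
    """Total extra revenue from running the tests in this order.

    Simplified model: tests are built sequentially (scarce dev
    resources), and each test's lift applies to baseline revenue
    from the week it ships through the end of the horizon.
    """
    week = 0
    total = 0.0
    for name, lift, build_weeks in order:
        week += build_weeks                      # next build can't start earlier
        weeks_live = max(0, HORIZON_WEEKS - week)
        total += BASELINE_WEEKLY * lift * weeks_live
    return total

# Compare every possible sequence of the same three tests
for order in sorted(permutations(TESTS), key=incremental_revenue, reverse=True):
    names = " -> ".join(t[0] for t in order)
    print(f"{names}: ${incremental_revenue(order):,.0f}")
```

Even in this toy model, sequences that ship the high-impact test early come out on top, because its lift has the most weeks to accrue; the spread between the best and worst ordering of the same three tests runs to six figures.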
In this scenario, the two highest-performing sequences were Test 2, then Test 1, then Test 3; and Test 2, then Test 3, then Test 1.
The lowest-performing scenario was Test 3, Test 1, Test 2.
Now here’s the real kicker. The difference was $894,000 more revenue from using one of the highest-performing test sequences versus the lowest-performing test sequence.
Again, that's from running the same exact tests you were going to run anyway. The only difference is running them in the optimal sequence.
Bring Home the Bacon With the Optimal A/B Testing Sequence
You won’t get the same exact revenue difference for your organization. Perhaps you don’t have as extreme an IT development shortfall. Perhaps you generate less revenue overall, or your tests have less impact.
Regardless of the size of the revenue impact, it’s there. And it’s real. Whichever test you run first has a much longer opportunity to affect the final financial numbers; conversely, the longer it takes you to ship an improvement, the less time you have to benefit from it. These are important variables that need to be considered when planning your A/B tests.
And when you do find your optimal testing sequence, you might end up singing another verse from A Tribe Called Quest: “Gots to get the (A/B-testing-driven) loot so I can bring home the bacon.”
Doing some A/B testing for your direct mail, email marketing, website, or advertising? Get an instant download of your free Test Planning Scenario Tool.
Daniel Burstein is the Senior Director, Content and Marketing at MECLABS Institute. Daniel oversees all content and marketing coming from the MarketingExperiments and MarketingSherpa brands while helping to shape the marketing direction for MECLABS — digging for actionable discoveries while serving as an advocate for the audience.