Programmatic platforms such as Google, Meta, TikTok and X offer many recommendations for optimising your campaigns and customer engagements. But while these machine learning-generated best practices are powerful, they are no substitute for human insight and a test-and-learn culture in unlocking the highest return on investment (ROI) from your digital marketing budget.
Image supplied. Madelene Cronje, digital campaign lead, +OneX, says machine learning is powerful, but it is not a substitute for human insight.
That’s because the recommendations from the platforms are based on the general performance of all their advertisers. They are not tailored to the nuances of your brand and your customers.
For example, we have found that Google often suggests using a broad-match keyword approach combined with smart bidding on search campaigns.
In general, this approach is effective for low-cost conversions, but for a niche brand, it can be sub-optimal because it wastes money by attracting irrelevant traffic to the website.
This is the sort of lesson that a brand can only learn through testing different tactics and approaches to see what delivers the best results.
Although this experimentation can hinder short-term campaign performance, it allows you to get better ROI in the longer term. It helps you to refine your strategies and tactics to get the best results for your brand as well as for different target audiences, products and campaigns.
Most of the platforms offer a range of basic tools to help you test different elements of your campaign, from target audiences to creative executions. More advanced solutions are available when you partner with a Google- or Meta-certified agency.
Using these tools for testing and experimentation will enable you to optimise your campaigns to reach more customers and achieve more conversions.
Where to start: Classic A/B testing
A/B testing, available on most platforms, is a good starting point. A/B testing allows you to compare two different approaches — such as audience targets, landing pages, creative executions, or keyword match types — to see which is most effective.
You can, for example, compare which strategy or execution delivers the best cost per conversion. You can set up A/B tests to compare results across a campaign, a set of ads, or a single ad.
Here are some guidelines for getting the most from A/B testing:
- Define a clear hypothesis: Understand what you want to test and link it to a business goal. For example: “Using bidding strategy ‘A’ will drive a higher conversion value on my campaign.”
- Test only one variable at a time: If you test multiple variables — such as two different audiences and two landing pages in one experiment — you won’t understand which variables contributed to the results.
- Focus on the most relevant metric for your goal: If you want to maximise sales, focus on the cost per result without getting distracted by cost per click or impressions.
- Don’t change your campaign during testing: This will skew the results of your test.
- Allocate enough budget: Use a budget large enough to deliver at least 100 events to ensure statistical relevance.
- Ensure your test audience is unique: Use an audience that is different to the other audiences targeted by your campaigns to avoid bias contaminating the results. Ensure the audience is large enough to deliver sufficient results when split into two segments.
- Run the test for long enough to get meaningful results: A/B tests should run for a minimum of seven days to ensure the algorithm has sufficient time to find a clear winner. Don’t pause before seven days have passed, even when it seems like there is a winning strategy. But also limit campaigns to about 30 days to avoid wasting budget on ineffective strategies.
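The guidelines above can be sanity-checked with a simple calculation. The sketch below uses a standard two-proportion z-test to judge whether one variant's conversion rate is significantly better than the other's; the function name and the conversion numbers are hypothetical, and the platforms apply their own, more sophisticated methodology:

```python
from math import sqrt
from statistics import NormalDist

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-tailed p-value
    lift = (p_b - p_a) / p_a                          # relative gain of B over A
    return z, p_value, lift

# Hypothetical results: 120 vs 160 conversions from 2,000 impressions each
z, p, lift = ab_test_significance(120, 2000, 160, 2000)
print(f"lift: {lift:.0%}, p-value: {p:.3f}")
```

With these made-up numbers, variant B shows a roughly 33% lift at a p-value of about 0.013, which is why the budget guideline matters: with far fewer than 100 events per variant, even a large apparent gap would not clear conventional significance thresholds.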
When you’ve concluded your A/B test, analyse the data to determine the best strategy. If there is a clear winner, the platform will provide an overview of the best strategy on a cost-per-result basis and outline the margin by which it won.
If there is no clear victor, test more variables to determine if other changes will improve marketing performance. The goal of A/B testing should be to discover at least a 20% gain in performance.
Once-off A/B testing is not sufficient to ensure sustained results. Digital marketing algorithms are continuously changing, so today’s best-performing strategy may not work as well six months down the line.
This highlights the importance of establishing a testing culture to ensure you are always getting the best returns for your marketing rand.
Advanced testing: Deeper insight into campaign performance
Brands with large digital marketing budgets and dedicated platform account managers can unlock a range of advanced testing solutions on platforms such as Google or Meta.
Most companies, however, partner with certified agencies to tap into a range of testing and experimentation offerings that go beyond A/B testing. One example is a brand lift study, which allows you to measure brand perception and awareness. Brand lift tests are available on Meta as well as YouTube through Google Ads.
These tests don’t just focus on traditional metrics such as clicks, impressions or views. They provide insights into how campaigns influence people’s perceptions of a brand by measuring brand metrics such as ad recall, awareness, consideration, favourability or purchase intent.
In these studies, surveys are served to audiences that have been exposed to your ads, and to a control group who were eligible to see your ads but didn't. The difference in responses determines the influence your ads have on key brand metrics.
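The core arithmetic of a brand lift study can be sketched in a few lines. This is an illustrative calculation with made-up survey numbers, not the platforms' actual methodology:

```python
def brand_lift(exposed_yes, exposed_n, control_yes, control_n):
    """Absolute and relative lift in a brand metric (e.g. ad recall)
    between people who saw the ads and a held-out control group."""
    exposed_rate = exposed_yes / exposed_n
    control_rate = control_yes / control_n
    absolute_lift = exposed_rate - control_rate   # percentage-point gain
    relative_lift = absolute_lift / control_rate  # gain vs baseline
    return absolute_lift, relative_lift

# Hypothetical survey: 450 of 1,500 exposed respondents recalled the ad,
# versus 360 of 1,500 in the control group
abs_lift, rel_lift = brand_lift(450, 1500, 360, 1500)
print(f"absolute lift: {abs_lift:.1%}, relative lift: {rel_lift:.0%}")
```

In this invented example, exposure raises ad recall from 24% to 30%, a 6-percentage-point absolute lift and a 25% relative lift over the baseline.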
Brand perception and awareness are among the key reasons consumers may choose one brand over another. Brand lift studies help you to gauge whether your digital communication is driving brand awareness and interest. The results give you actionable insights you can use to refine your marketing strategies and campaigns to ensure they resonate with your target audiences.
Testing and learning is your competitive advantage
When it comes to digital marketing, following the same approach as everyone else will not enable you to get breakthrough results. Marketers can’t rely on machine learning or platform best practices alone to realise the full potential of their budgets and campaigns.
It is only by combining testing, learning and human insight with platform tools that you can tailor your tactics and strategies to your brand and audience so that you can, in turn, optimise your ROI.