
Rules for mobile ad creative testing in 2021

25 October 2021

“A good ad creative” sounds simple, but what exactly does it look like? There is no set answer: “good” is a highly subjective label. So how do you make sure your idea of good matches your potential audience’s? One way is ad creative testing. In this article, experts from the performance marketing agency AdQuantum show you how to prepare for this kind of testing (not to be confused with A/B testing), how to conduct it efficiently, and how to use the results correctly.

Why tests are conducted

Mobile marketing teams often neglect testing because it requires extra time and investment. It is certainly easier and faster to start running ads immediately, without any extra steps. But don’t skip testing: in the long run, these tests will save you nerves, money, and time.

What to test and how to conduct the testing

It is not always possible to determine what exactly catches a user’s attention in a particular ad, so you can test literally anything in a creative. In addition to the creative concept itself, you can put multiple details to the test: composition, contrast, the number of key elements, the on-creative text, colors, the look of the Call to Action (CTA) button, the timing of the video. And this is not a complete list.

For example, in the creatives for the mobile game Idle Space Farmer, we tested the rendering style; the color, text, and shape of the CTA button; the additional text on the creative; the girl’s appearance; her location; and the background. The collage below demonstrates the initial, intermediate, and final versions of the creative, all built from the same hypothesis. It was testing that helped us determine what engages the audience best.

On average, testing lasts 1-2 days. This is enough time to collect the first metrics that show you how to proceed. The number of testing iterations depends on how many creatives are being tested simultaneously.

In terms of money, testing usually takes up to 15% of the total ad budget, so even if the tests show poor results, overall campaign performance doesn’t suffer much. The testing budget is never wasted: for the money spent, you always gain invaluable experience. After the first test iteration alone, you already understand which ads actually interest your target audience and which don’t.

How to prepare for the testing

To avoid flying blind and get the most out of creative tests, you need to prepare for them properly. Sticking to a few key points helps you get objective results and design the most efficient ad creatives.

The proper order for testing creatives

There are three options for testing creatives: simultaneously, sequentially, and via A/B testing.

Testing all your ad creatives at once saves precious time. But when you have a large number of creatives ready to launch, it is better to test them sequentially.

How effective it is to run a large number of creatives simultaneously is debatable. Say you are running a Facebook ad campaign. Out of all your creatives, Facebook’s algorithms will pick only a few top performers for further delivery; those winners will consume all the available traffic. And there cannot be a large number of creatives in the auction at the same time. One way or another, it is always a rotation: old creatives die off and give way to new, fresh approaches.

That said, a large number of ads in one campaign (up to 50) sometimes works even better than running 5-7 creatives. You cannot predict in which case it will work, so either way, you should test. There are no clear patterns here, but this option mostly suits gaming projects.

Automatic or manual testing?

Let’s say Facebook is your main traffic source. If you upload several ad creatives to one ad set, the social network will direct most of the traffic to those that, according to Facebook’s algorithms, should bring the most conversions. This is convenient and lightens the workload of UA managers, but with this test format there is a risk of missing out on potentially good creatives simply because they didn’t get enough traffic: a creative can lose out just because a “stronger” one (by Facebook’s internal reckoning) sat next to it in the same ad set.

That is why it is important to test manually, directing traffic to each creative separately to estimate key metrics. Only then, after collecting several successful hypotheses, should you combine them into full-scale ad campaigns.

At the stage of testing and searching for a working hypothesis, only manual campaign management is used. Once effective creatives yield additional insights, Facebook’s built-in dynamic creative functionality comes into play. This tool automatically assembles new setups (image/video + text + headline + CTA), optimizes the ads on its own, and shows only the best performers. As a result, the statistics reveal which element of which creative worked better than the others.
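The combinatorics behind a dynamic-creative setup can be sketched in a few lines. The asset names below are invented for illustration; the point is only that every image/video is crossed with every text, headline, and CTA:

```python
# Sketch of what a dynamic-creative setup enumerates: every combination of the
# uploaded assets becomes a candidate ad, and the best combinations win traffic.
# Asset names below are hypothetical examples, not real campaign assets.
from itertools import product

videos = ["gameplay_v1", "gameplay_v2"]
texts = ["Build your own space farm!", "Idle profits, in space"]
headlines = ["Idle Space Farmer"]
ctas = ["Install Now", "Play Game"]

setups = list(product(videos, texts, headlines, ctas))
print(len(setups))  # 2 videos * 2 texts * 1 headline * 2 CTAs = 8 candidate ads
```

This is why dynamic creatives scale well: adding one more text or CTA multiplies, rather than adds to, the number of combinations the platform can try.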

Throughout an ad campaign, the two approaches are combined. Sometimes you cannot work without manual testing; otherwise you will not understand what actually performs. But sometimes it is better to use automation, since there is simply no point in wasting the media buyer’s time.

How to evaluate test results

We analyze the test results using the same metrics we use when setting benchmarks.
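As an illustration of metric-based evaluation, here is a minimal sketch comparing creatives by CTR (clicks/impressions) and CPI (spend/installs), two metrics commonly used in mobile UA. The data and thresholds are hypothetical, not AdQuantum benchmarks:

```python
# Comparing per-creative test metrics. All figures are invented for illustration.

creatives = {
    "creative_a": {"impressions": 10_000, "clicks": 250, "installs": 40, "spend": 120.0},
    "creative_b": {"impressions": 10_000, "clicks": 180, "installs": 22, "spend": 115.0},
}

def ctr(stats: dict) -> float:
    """Click-through rate: clicks per impression."""
    return stats["clicks"] / stats["impressions"]

def cpi(stats: dict) -> float:
    """Cost per install: spend divided by installs."""
    return stats["spend"] / stats["installs"]

# Lower CPI wins in this toy comparison.
best = min(creatives, key=lambda name: cpi(creatives[name]))
print(best)  # creative_a: CPI 3.00 vs ~5.23
```

A real evaluation would also weigh deeper metrics (retention, ROAS) against the benchmarks set before the test, but the mechanics are the same: compute the metric per creative and compare against the target.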

How has testing changed since iOS 14.5+ launched?

Too little time has passed to draw objective conclusions: not all users have switched to iOS 14.5+ yet. But the performance of iOS UA campaigns has already changed significantly: all data is now available only at the campaign level, not at the creative level.

If each of the ad groups targets a different audience, we cannot objectively assess the effectiveness of a particular ad, because we receive events for the whole ad campaign.

But there is a solution: test creatives on Android and transfer successful ones to iOS.

Better safe than sorry, and we completely agree with that. Do you still have questions? Let’s talk.
