Paid social advertising presents an opportunity to test different ad creatives and learn what your prospects care about. Any paid social agency worth its retainer offers creative testing. But is that testing strategy providing accurate, meaningful insights into your content strategy? I’d argue it delivers less than you think.
During a large audit for a client, we discovered why the test-and-learn process is often flawed. Agencies are deriving misleading insights because their testing strategies fail to account for a single hard-to-measure variable. Let’s look at why.
Firstly, testing strategies can be nuanced. Experts will debate:
- How much to spend per test
- How many creatives to include
- The size of your target audience
- Which metrics to use to evaluate a winner
Regardless of the approach, we’re still missing a key variable.
More sophisticated testing strategies will also provide a framework for how to derive insights. This is, in some senses, better. By structuring tests into phases, marketers can build on learnings from the previous phase.
Phase 1: Core benefit / headline testing
Phase 2: Imagery (lifestyle vs product)
Phase 3: Ad format (still image / video / carousel / placement)
The concept of phased testing is very appealing to performance marketers. So much to learn. So much to optimize!
Unfortunately, it all comes off the rails when you don’t account for the missing variable. And it won’t show up in any of your Facebook dashboards. Curious yet?
The missing variable is “Ad Sentiment.”
Ad sentiment refers to the feedback and comments an ad receives from users. One or two positive comments on an ad can create positive sentiment. Facebook trolls are far less likely to add negative comments to an ad with positive sentiment, and previous customers are more likely to share positive thoughts once others have done so. Conversely, a couple of early negative comments can create negative sentiment. We’ve seen that once an ad starts to go negative, trolls pile on and destroy any chance of the ad winning a creative test.
You may be asking: isn’t sentiment part of what you are testing to begin with? Perhaps certain ads are simply more likely to garner negative feedback than others. This is true, but the ad platforms’ optimization algorithms don’t appear to account for it accurately during creative testing. Here’s why we believe this.
We’ve been running creative tests for a client spending six figures per month, over many months. In that time we’ve performed hundreds of tests. These tests have been excellent at identifying winning ads and improving the account’s performance. But they have struggled to consistently identify WHY certain creatives win over others. Winning ad performance correlated more closely with ad sentiment than with any slight variation in ad creative.
Think of it this way. You could spin up 20 versions of essentially the same ad and target them all at the same audience. Given enough time and budget, you would likely find that some ads attract positive sentiment while others do not. How quickly this happens depends on your industry and product; if you are in a polarizing sector, it will happen sooner. Ads with positive sentiment will start to win against the ads without it. You’ll confidently pick a winner, even though there was no visible difference in the ad creative.
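To see how this plays out, here is a minimal toy simulation of the thought experiment above. It is a sketch, not a model of any platform’s actual algorithm: all parameters (`base_ctr`, `comment_rate`, `sentiment_lift`) are illustrative assumptions, and the reinforcement rule (each new comment’s valence is biased by the comments already present) is a simple Pólya-urn-style stand-in for the pile-on effect described earlier.

```python
import random

random.seed(7)

def simulate_ad(impressions=10_000, base_ctr=0.01,
                comment_rate=0.002, sentiment_lift=0.3):
    """Simulate one ad. The creative is identical for every ad;
    only the random early comments differ."""
    pos, neg, clicks = 0, 0, 0
    for _ in range(impressions):
        # Occasionally an impression produces a comment. Its valence
        # is biased toward the existing sentiment (Laplace-smoothed),
        # so early comments compound over time.
        if random.random() < comment_rate:
            p_positive = (pos + 1) / (pos + neg + 2)
            if random.random() < p_positive:
                pos += 1
            else:
                neg += 1
        # Net sentiment nudges the click-through rate up or down.
        sentiment = (pos - neg) / (pos + neg + 1)
        if random.random() < base_ctr * (1 + sentiment_lift * sentiment):
            clicks += 1
    return clicks, pos, neg

# 20 "versions" of the exact same ad, same audience, same budget.
results = [simulate_ad() for _ in range(20)]
clicks = [r[0] for r in results]
print(f"best ad: {max(clicks)} clicks, worst ad: {min(clicks)} clicks")
```

Even though every ad has the same underlying creative and the same base click-through rate, the runs diverge: ads whose first few comments happened to be positive drift upward, while early-negative ads drift down. A naive creative test would crown a “winner” from this spread.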
If you are running creative tests on paid social, you may think you are testing the value of different headlines or images. In reality, your test is far less meaningful than you assumed. If you want to truly understand which creative elements are driving results, you need to account for sentiment. Doing this requires careful moderation and community management, along with a more carefully managed testing process. Note also that an ad with poor creative but a long history of positive sentiment will out-perform a new ad with excellent creative but no comment history. If you are planning a creative refresh within your ad account, be careful to factor this in.