Making A/B Tests Fruitful in the Long Run

A/B testing, also known as split testing or a controlled experiment, compares multiple versions of a single variable such as an ad, landing page, or audience. The experiment is usually followed by statistical analysis to determine which version performs better. By collecting both qualitative and quantitative data, A/B testing lets you improve your conversion funnel and, with it, the experience of your customers and users. A proper A/B test reduces bounce rate and churn rate and helps you reach a higher ROI.

In this post, I walk through how to build these ideas into your creative test, step by step.

Step 1: Define your hypothesis

Before starting the creative test process, the first question to ask is: “What do I want to test in my creative?” There are many elements you can test; the common ones are:

  • Logo
  • Value Proposition
  • Image (colour scheme, backgrounds, etc.)
  • Tagline/Text
  • CTA Button
  • Video (length, colour scheme, backgrounds, etc.)

The biggest trap marketers fall into is changing multiple elements at the same time. Let’s say you test an image, a tagline and a CTA by creating 2-3 ads that combine different versions of those elements. Even if you get significantly different results, you cannot tell which element caused the difference. Instead, go “one step at a time”: figure out which image works better, pause the worse-performing ads, create new ads with the winning image, and only then test different CTAs or taglines.

Step 2: Choose platform and audience

While deciding which ad element is worth testing, you also need to consider the acquisition channel and platform where your ads will be published. Competition and specs differ from channel to channel, so instead of using one creative for every channel, adapt your ads to each platform.

Secondly, defining your target audience beforehand is an important step. The audience shapes the tagline or value proposition on the ad, so each A/B test should target one audience at a time. For instance, visitors to your landing page have a specific intent: to learn more and perhaps act on the promotion highlighted there. For those users, complex ad content would lower CTR and traffic. Existing customers, on the other hand, already know your product or service, so you can create more tailored ads (such as pixel-based dynamic ads) to encourage them to take action.

Step 3: Pick your ad type

When testing creatives, mixing different ad types will not give you a reliable conclusion. If you test a static ad against a video ad, the outcome is easy to guess, so always test elements within the same ad type. But how do you decide which ad type to go with?

If your budget and time are tight, static ads are a quick and easy way to go; still images save you a lot of time and money. On the other hand, if you want to build a more comprehensive ad set, the richer content of dynamic ads will give you more substantial insights.

Step 4: Run experiment and measure

After you publish your new ads, let them collect data and traffic before you start measuring their performance. For app campaigns, for instance, converting at least 50-100 installs per variant helps the ad set reach statistical significance. When the better-performing ads stay the same and no longer change day over day, your A/B test has reached a consistent point and the results will be far more reliable.
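To make “reaching significance” concrete, here is a minimal sketch of a two-proportion z-test comparing the install rates of two ads. The function name and the install counts are hypothetical examples of mine, not figures from any particular ad platform.

```python
# Minimal sketch: does ad B's install rate beat ad A's with reasonable confidence?
# The counts below are made up; plug in your own campaign numbers.
from math import sqrt, erfc

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and one-sided p-value for rate_b > rate_a."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (p_b - p_a) / se
    p_value = 0.5 * erfc(z / sqrt(2))                        # P(Z > z) for a one-sided test
    return z, p_value

# Hypothetical example: ad A got 60 installs from 4,000 impressions,
# ad B got 95 installs from 4,100 impressions.
z, p = z_test_two_proportions(60, 4000, 95, 4100)
print(f"z = {z:.2f}, one-sided p-value = {p:.4f}")  # p < 0.05 suggests B's lift is not just noise
```

If the p-value stays above your threshold, keep the test running and collecting installs rather than declaring a winner early.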

Every marketer compares ads with top-funnel metrics such as CTR, IPM or CPA. The concern is that those top-funnel metrics might not reveal the real winners, so you need to consider other performance-gauging metrics depending on your campaign strategy. If your goal is acquiring new active users while staying profitable, look at down-funnel conversions such as conversion value, ROI/ROAS, LTV and retention rates. Dive deeper by analysing the conversion cycle of every ad to understand which one is more persuasive and encouraging for your customers. Keep in mind, though, that down-funnel metrics accumulate far fewer conversions than traffic-based ones, so unless you are willing to rely on fewer conversions to make decisions, optimizing for revenue-based metrics may cost you more than optimizing for traffic-based metrics.
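As a small illustration of why the winner can change with funnel depth, the sketch below compares two hypothetical ads on CTR versus ROAS. All field names and figures are placeholders you would replace with your own reporting data.

```python
# Hypothetical reporting data for two ad variants (all numbers are made up).
ads = {
    "image_v1": {"impressions": 50_000, "clicks": 900, "spend": 450.0, "revenue": 520.0},
    "image_v2": {"impressions": 48_000, "clicks": 700, "spend": 430.0, "revenue": 910.0},
}

for name, a in ads.items():
    ctr = a["clicks"] / a["impressions"]   # top-funnel: click-through rate
    roas = a["revenue"] / a["spend"]       # down-funnel: return on ad spend
    print(f"{name}: CTR = {ctr:.2%}, ROAS = {roas:.2f}")

# Here image_v1 "wins" on CTR while image_v2 wins on ROAS: the metric you judge
# the test by should match the campaign goal, not just the top of the funnel.
```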

Extra Step: Reaching statistical confidence

Statistical confidence is an essential decision-making tool for monitoring an A/B test, yet applying it can be challenging for people without a statistics background. I will cover it in a separate post, but let me explain it plainly here. As mentioned in Step 4, letting your A/B test collect enough samples is a must. The question marketers ask most often is: “How many samples do I need to collect before deciding which ad is successful?” Calculating your statistical confidence answers that question, and if you want to build this step into your process, it belongs just before Step 4.

In an ideal world, we would assume the sample we collect represents all potential and existing users. In the real world, that is almost never true. Reaching statistical confidence means your ads have collected enough samples for you to make predictions or changes with confidence. Calculating it can feel frustrating because there are several statistical approaches to choose from.
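As a plain answer to the “how many samples?” question, here is a minimal sketch using the standard sample-size formula for a two-variant test. The 95% confidence, 80% power and baseline rate are assumptions of mine for illustration, not prescriptions from this post.

```python
# A minimal sketch of the classic sample-size formula for a two-variant test.
# Assumed defaults: 95% confidence, 80% power, baseline rate estimated from past campaigns.
from statistics import NormalDist

def required_sample_size(baseline_rate, min_detectable_lift, alpha=0.05, power=0.80):
    """Approximate visitors needed per variant to detect an absolute lift."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # about 1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)            # about 0.84 for 80% power
    p = baseline_rate
    variance = 2 * p * (1 - p)                      # pooled variance approximation
    n = variance * (z_alpha + z_beta) ** 2 / min_detectable_lift ** 2
    return int(n) + 1

# Hypothetical example: 2% baseline install rate, and we care about a 0.5-point absolute lift.
print(required_sample_size(0.02, 0.005))  # roughly 12,300 impressions per variant
```

The smaller the lift you want to detect, the more samples you need, which is why down-funnel tests take longer to call than top-funnel ones.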

Conclusion

Creative tests are an essential part of healthy UA activity. By experimenting with different ad elements, you build up knowledge about what performs better. Be careful not to overcomplicate your test: pick one element to test at a time across different ads, and monitor which one succeeds. Once the winning ad is found, continue with a new A/B test for other elements, because every ad eventually suffers fatigue and its performance deteriorates. To keep reaping the rewards of creative tests, keep your ads in a continuous testing cycle.
