Connected TV Advertising: A/B Testing for the Performance Marketer

A diverse suite of creative assets doesn’t just produce enough variety to keep your campaigns fresh – it’s also an invaluable testing tool. The benefits are clear: advertisers can expect improved engagement, increased conversions (on average 20% higher for e-commerce brands), reduced cart abandonment, and an overall better customer (and advertising) experience. Advertisers who leave A/B testing out of their marketing strategy risk wasting budget on activities that don’t move the needle. The lesson: if you don’t test elements of your ad strategy, you’re left making decisions on guesswork.

Testing becomes less daunting when you break down the invisible barrier between performance and creative teams and give them a platform to collaborate. Here are our top best practices to keep in mind.

1. Sweat the Small Stuff

Yes, the minor details matter in A/B testing. We recommend choosing one element and sticking to it, whether that’s a button, call-to-action copy, or color. The more elements you add to the mix, the harder it becomes to distinguish what is actually making the difference. The sweet spot? Test between two and four variations at the same time for the best balance of test duration and efficiency. These principles apply across digital marketing channels, including Connected TV advertising. For example, two stop-motion videos can be tested side by side by swapping out the background or text color. And thanks to our partnership with video production house QuickFrame, it’s even easier to make these tweaks in support of your testing schedule if you already have existing brand assets to work with. It takes a matter of days (not weeks!) to get a stop-motion video up and running, and even less time to make edits on the fly.

After you’ve tested each element and gauged which one is driving performance, go ahead and test another element until your conversion rate is maxed out. The video below demonstrates how easy it is to upload multiple Connected TV advertisements to our Performance TV solution, where advertisers can manually adjust the weighting based on which creative is the ‘winner’. We add another level of sophistication by enabling advertisers to also adjust the related ads paired with each Connected TV ad. These can be switched out at any time during the campaign, although our recommendation is to let the test run to completion rather than making live changes midway.
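
To make the weighting idea concrete, here’s a rough Python sketch of weighted creative rotation. The variant names and weights are hypothetical, and this is not MNTN’s actual delivery logic – just an illustration of the principle:

```python
import random

# Hypothetical creatives with manually assigned weights -- the current
# "winner" gets the largest share of impressions.
variants = {
    "stop_motion_blue_bg": 0.6,  # winning creative, weighted up
    "stop_motion_red_bg": 0.3,
    "stop_motion_alt_cta": 0.1,
}

def pick_variant(weights: dict[str, float]) -> str:
    """Choose a creative for the next impression, proportional to its weight."""
    names = list(weights)
    return random.choices(names, weights=[weights[n] for n in names], k=1)[0]

# Simulate 10,000 impressions to confirm the delivery split tracks the weights.
counts = {name: 0 for name in variants}
for _ in range(10_000):
    counts[pick_variant(variants)] += 1
print(counts)
```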

2. Keep on Keepin’ On

Unlike a big splash campaign, A/B testing is not a one-off activity. It takes discipline and consistency to get results – data speaks volumes, and gut instincts fall by the wayside. A solid test starts with a hypothesis, which frames what you want to test, why you’re testing it, and what change you expect to see once the test runs. We separate it into three parts:

  1. Variable – Any element that can be modified, added or removed.
  2. Result – The predicted outcome, whether that’s X% more conversions or more clicks on a call-to-action.
  3. Rationale – What you know about your users from your research that indicates your hypothesis is correct.

Thread these three elements together into a sentence using the format “If [Variable], then [Result], because [Rationale]” – and there you have your hypothesis.
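
To keep hypotheses consistent from test to test (and easy to document later), it can help to encode the three-part format in a small template. Here’s a minimal Python sketch – the example hypothesis is purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    variable: str   # the element being modified, added, or removed
    result: str     # the predicted, measurable outcome
    rationale: str  # the research-backed reason you expect that outcome

    def sentence(self) -> str:
        return f"If {self.variable}, then {self.result}, because {self.rationale}."

# An illustrative hypothesis, not a real test result.
h = Hypothesis(
    variable="we change the call-to-action color from blue to orange",
    result="clicks on the call-to-action will rise by 10%",
    rationale="user research suggests the current button blends into the background",
)
print(h.sentence())
```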

We suggest running a test for the right amount of time to ensure data credibility – the ‘right’ amount of time varies depending on the test goal, the number and complexity of variants, and your website’s daily active users. Once your test has run for that optimal window, you can be far more confident that the trends you’re seeing will hold up in the long run, which helps with any future marketing decisions. Another thing to note is the timing of your test – for example, traffic to your site over the holidays or peak spending periods is likely to look different than during other parts of the year. Running the same test at multiple times of the year can help paint a more accurate picture.
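
To put some numbers behind ‘the right amount of time’, the standard two-proportion sample-size formula translates directly into days once you know your daily traffic. Here’s an illustrative Python sketch – the baseline rate, expected lift, and traffic figures are placeholders, not benchmarks:

```python
import math

def required_sample_size(baseline_rate: float, min_lift: float) -> int:
    """Visitors needed *per variant* to detect a relative lift in conversion
    rate, via the standard two-proportion z-test approximation
    (two-sided alpha = 0.05, power = 0.80)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_lift)
    z_alpha, z_beta = 1.96, 0.84  # critical values for alpha=0.05, power=0.80
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return math.ceil(n)

# Placeholder numbers: 3% baseline conversion rate, aiming to detect a 15%
# relative lift, with 2,000 daily visitors split across two variants.
n_per_variant = required_sample_size(0.03, 0.15)
days = math.ceil(n_per_variant * 2 / 2_000)
print(f"{n_per_variant} visitors per variant, roughly {days} days of traffic")
```

With those placeholder numbers, the test needs roughly 24,000 visitors per variant – about 25 days of traffic – before the results are worth reading.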

How can advertisers possibly keep track of all this activity, and easily refer back to it later, especially with multiple campaigns and A/B tests running at the same time? A solid reporting interface is key. MNTN’s analytics dashboard enables advertisers to tie their test goals back to metrics and view the data in a variety of formats for a better read, such as the table format illustrated in the video below.

3. Patience is a Virtue

You might be tempted to tinker with (or worse still, pause) your A/B test midway because you’re not seeing results, but this could be detrimental in the long run. Two words to keep in mind whenever you’re thinking of cutting a test short: statistical significance. Economists, data scientists, and performance marketers alike rely on it to make sure their data is credible and not just happenstance. Getting there requires a large enough pool of data, which provides more accurate results and makes it easier to identify patterns that fall on either side of the mean (average) result.
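
If you want to sanity-check significance yourself once a test finishes, a pooled two-proportion z-test is one common approach. Here’s a minimal Python sketch – the conversion counts are made up:

```python
import math
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates,
    using a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Made-up results: variant B converts at 3.6% vs. variant A's 3.0%.
p_value = two_proportion_z_test(conv_a=300, n_a=10_000, conv_b=360, n_b=10_000)
print(f"p-value: {p_value:.4f} (significant at the 5% level: {p_value < 0.05})")
```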

Once you’ve gathered the results, make sure they’re all documented to serve as a reference for any future A/B tests. MNTN’s reporting interface allows advertisers to customize reports and templatize them in a few swift clicks. What metrics and KPIs should or shouldn’t be included in your reports? We suggest aiming, at minimum, for bottom-line metrics like the number of conversions for each variation, conversion rate, any conversion uplift (positive or negative), and the number of visitors/viewers. You can also consider how these metrics perform per audience segment – for example, if you’re targeting a single creative against different groups and want to see which one engages most with your ad. Below we’ve shown how to retrieve your A/B test campaign and then add or remove metrics so they stay aligned with your test goals.
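
For reference, conversion uplift is simply the relative change between a variant’s conversion rate and the control’s – a quick illustrative sketch with hypothetical rates:

```python
def conversion_uplift(control_rate: float, variant_rate: float) -> float:
    """Relative lift of the variant over the control, e.g. 0.20 == +20%."""
    return (variant_rate - control_rate) / control_rate

# Hypothetical rates: control converts at 3.0%, variant at 3.6%.
print(f"Uplift: {conversion_uplift(0.030, 0.036):+.0%}")  # -> Uplift: +20%
```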

4. Customize Metric View

With multiple campaigns and A/B tests running at once, you need a way to filter down to the metrics most relevant to your test goals. MNTN’s reporting interface helps advertisers identify their test campaigns and then add or remove any metric view, with changes automatically reflected in the table view, as the video below demonstrates.

This is only a fraction of what MNTN can do for your A/B testing efforts. If you’d like to see more, we’d love to show you around our platform and learn about your team’s goals and challenges.