Already we have intelligent tracking prevention, tech platforms restricting how much data they pass through their ad and analytics platforms, and browsers and operating systems limiting the use of ad tracking. And, come 2024, Google is likely to enact its plans to deprecate third-party cookies.

But this direction of travel is not just being driven by tech companies and platforms, but also by regulators around the world. It’s estimated that over the course of 2023, 65% of the world’s population will have its personal data covered by modern data privacy regulation, up from 10% in 2020.

With so much signal loss, how do we find a model that can withstand these challenges?

Digital attribution has never been easy, and it has never been harder than it is now. That is why we need to rethink how we approach measurement and attribution, as two sides of the same coin, to create a much more durable solution.

Finding a durable single source of truth

There are many different models for attribution, but all have their respective flaws: often they are too static, or they don’t give us a strong enough picture.

Some models, such as market mix modelling (MMM), give us something close to a full market picture, but are often too static to allow us to make real-time decisions. Others, such as last-touch or platform-specific “walled garden” models, update in real time but give us a very incomplete picture of our activity.

The way we counter this is to combine those competing models to create real-time econometrics, which provides us with a full-picture view (or as close to it as possible) that is updated in real time. It means we take the statistical approach used in MMM and, using machine learning techniques, supercharge the system so that we can make much more proactive decisions.
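
To make this concrete, below is a minimal sketch of what an incrementally updated econometric model could look like, assuming daily aggregate data. The channel names, figures and the choice of scikit-learn’s SGDRegressor are illustrative assumptions, not a prescribed implementation.

```python
# Hypothetical sketch: an MMM-style regression whose coefficients are
# refreshed as each day's aggregate data arrives, rather than being
# re-estimated once a quarter. (A real pipeline would standardise the
# features first; this only shows the update loop.)
import numpy as np
from sklearn.linear_model import SGDRegressor

# Illustrative daily aggregates: spend per channel plus a seasonality index.
feature_names = ["search_spend", "social_spend", "addressable_spend", "seasonality"]

model = SGDRegressor(loss="squared_error", alpha=0.001, random_state=0)

def update_with_daily_data(X_day: np.ndarray, y_day: np.ndarray) -> None:
    """Fold one day's observations into the model without a full refit."""
    model.partial_fit(X_day, y_day)

# Each morning: fold in yesterday's data, then read off channel effects.
X_yesterday = np.array([[12_000.0, 8_000.0, 5_000.0, 1.1]])
y_yesterday = np.array([430.0])  # e.g. conversions
update_with_daily_data(X_yesterday, y_yesterday)

# Coefficients approximate each input's marginal contribution, giving a
# basis for same-day budget decisions rather than quarterly ones.
print(dict(zip(feature_names, model.coef_)))
```

Because the model is updated incrementally rather than refitted periodically, its estimates can feed day-to-day budget decisions while retaining the statistical grounding of MMM.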

We also counter this signal loss by finding comfort in probabilistic attribution – attribution that makes assumptions based on probable behaviours – to complement the more deterministic data that we do retain. Real-time econometrics helps us to correlate changes in inputs with changes in outputs, meaning we can continue to judge the impact of individual decisions and optimisations even when we experience that data signal loss.
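
As a toy illustration of correlating input changes with output changes, here is a deliberately simple single-input example using invented aggregate weekly figures; no user-level identifiers are involved.

```python
# Toy example: relate week-on-week changes in an input (spend) to changes
# in an output (conversions) using aggregate data only.
import numpy as np

spend = np.array([10_000, 12_000, 9_000, 15_000, 14_000], dtype=float)
conversions = np.array([400, 470, 360, 580, 545], dtype=float)

d_spend = np.diff(spend)       # week-on-week changes in the input
d_conv = np.diff(conversions)  # week-on-week changes in the output

# Least-squares slope: estimated conversions gained per extra unit of spend.
slope, _ = np.polyfit(d_spend, d_conv, 1)
print(f"~{slope * 1_000:.1f} extra conversions per extra £1,000 of spend")
```

In practice the same idea runs across many inputs at once, with the model’s coefficients standing in for the individual user journeys we can no longer observe.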

Better decisions, made in real time

This model works by taking a range of inputs that can all influence the performance of your marketing at any given stage. These can include:

  • Spend data.
  • Historical metrics, such as organic search performance, click-through rates, conversion rates, and impression share.
  • Company metrics, such as your size, industry, and customer demography or ethnography.
  • Contextual metrics, such as market conditions, seasonality, public holidays, etc.

These inputs then feed into our model, providing insight and measurement outputs that allow us to make more proactive decisions around performance planning, performance attribution and performance optimisation.
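
As a hypothetical illustration of how those four input categories might be encoded as model features, here is a small pandas sketch; every column name and value below is invented.

```python
# Hypothetical feature frame combining the four input categories above.
import pandas as pd

features = pd.DataFrame({
    # Spend data
    "search_spend": [12_000.0],
    "social_spend": [8_000.0],
    # Historical metrics
    "avg_ctr": [0.034],
    "conversion_rate": [0.021],
    "impression_share": [0.62],
    # Company metrics
    "company_size": [250],
    "industry": ["retail"],
    # Contextual metrics
    "seasonality_index": [1.1],
    "is_public_holiday": [False],
})

# One-hot encode the categorical column so the frame can feed a regression.
model_input = pd.get_dummies(features, columns=["industry"])
print(model_input.dtypes)
```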

Don’t drown in data – know what to measure and when

With so many channels and platforms, one of the key challenges for marketers is keeping a sense of perspective and making sense of those huge volumes of data: knowing what to look at, and when. The answer lies in hierarchy.

One model to consider here breaks data down into “streams”, “rivers”, “lakes” and “oceans”. It allows marketers to cluster and categorise their data sources in a way that makes it easier to look at performance at the most appropriate level and make the right optimisation decisions.

In this instance, our streams may be individual platform data, each of which provides its own insight into that platform’s performance. That allows us to make day-to-day optimisations, but doesn’t necessarily make it easy to form broader strategic judgements about the optimal channel mix.

This is where our streams flow into rivers, combining multiple streams to give wider channel views. This allows us to take all our metrics from individual social media platforms and combine them into a ‘social’ channel cluster, combine our web analytics and PPC platforms into a ‘search’ cluster, and combine our adtech platforms into an ‘addressable’ cluster. Taking this approach allows us to compare channel activity much more easily, and on something as close to a like-for-like basis as is feasible.

As these rivers flow into lakes, we can assess “digital” activity as part of a real-time econometrics model to determine outcomes for digital performance overall. Eventually the lakes flow into the ‘ocean’ that is our market mix modelling, where we can make those longer-term cross-channel decisions.
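
A simple sketch of how that roll-up could be represented, with platform-level streams aggregated into channel rivers and then a single digital lake; the platforms, clusters and figures are placeholders.

```python
# Hypothetical roll-up: platform "streams" -> channel "rivers" -> digital "lake".
from collections import defaultdict

# Stream-level daily conversions, keyed by platform (illustrative values).
streams = {
    "facebook": 120, "instagram": 95, "tiktok": 60,
    "google_ads": 310, "bing_ads": 45,
    "dsp_display": 80, "ctv": 40,
}

# Which river each stream feeds into.
river_of = {
    "facebook": "social", "instagram": "social", "tiktok": "social",
    "google_ads": "search", "bing_ads": "search",
    "dsp_display": "addressable", "ctv": "addressable",
}

rivers: dict[str, int] = defaultdict(int)
for platform, value in streams.items():
    rivers[river_of[platform]] += value

# The "digital" lake that feeds the MMM "ocean" downstream.
lake_digital = sum(rivers.values())

print(dict(rivers))   # like-for-like channel comparison
print(lake_digital)   # overall digital performance input
```

The same aggregation pattern extends naturally: the digital lake becomes one input among many to the MMM ocean.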

Conclusion

The direction of travel for measurement and attribution is only likely to continue towards greater privacy, greater user protection and less readily available access to data for advertisers. That is forcing our hand to come up with stronger attribution models that can provide us with the right insight, despite growing levels of data loss.

This means having models that are more resilient and durable, working in real time and balancing both deterministic and probabilistic data to help us understand the performance of our activity, even with growing data loss.

Signal loss means that econometrics will play an increasingly important role in cross-channel decision making, and the brands that continue to drive – and understand – performance will be those that leverage a real-time econometric attribution solution to future-proof their digital planning.