Attribution Doesn't Matter

I know what you're thinking: "If I have to hear about attribution one more time I am going to quit marketing and move deep into the woods and pick up bushcraft". I don't blame you. In fact, I might join.

There's no denying that attribution has been a hot topic lately (not like it hasn't been talked to death for years), but for good reason. Attribution is meant to be a solution to one of the greatest problems advertisers face: figuring out exactly how successful their campaigns are. Seems like a pretty simple ask, but it's remarkably difficult to solve, especially with the changes that have hit the data world over the past few years.

You know, CCPA, iOS 14.5+, GDPR. Ever heard of them? Of course you have.

So, if it's been talked to death already, why am I writing this? Good question, and the answer is simple:

ATTRIBUTION IS FAKE AND IT DOESN'T MATTER

Profound, I know. Brazen, of course. Accurate, possibly?

In my career as a marketer I've been all over the attribution map: everywhere from 'what is that' to 'this is important' to 'this doesn't matter'. I suppose you live long enough to see yourself become the villain after all.

In all seriousness, I have become especially skeptical of attempts to solve attribution problems recently, which has led to where we are now.

SO HOW DID WE GET HERE?

Funnily enough, TripleWhale, a data and attribution platform that you may already be familiar with, shared this in their newsletter recently, which I found incredibly relatable:

There is one change I would make, though: instead of IQ at the bottom, I would replace that with time. The more time you spend thinking about attribution, the more this trend presents itself. After learning what attribution was, how it worked, and why it was important, I was very much in the camp of ‘attribution is important and we can’t base any decision off of anything but this data’.

‘Attribution’ presents itself as a very solvable problem at a glance: it’s just numbers, and connecting one point to another. Sure, there are barriers in the way, but the concept is simple enough. “I spent $10 on this ad. Was one of the sales on my website from this?” Unfortunately for all of us, the solution isn’t as simple as it sounds.

I’ve always been healthily skeptical of most things, which is something I would encourage everyone to practice to some degree. It helps you embrace new ideas and concepts, and learn and pivot quickly. That skepticism certainly helped me detach from the idea that attribution ‘matters’, but it wasn’t the catalyst. That, instead, was an experiment I conducted recently.

THE EXPERIMENT

The experiment was simple: Let’s have multiple data analytics platforms report on a single metric and see the spread. Technically, they should all be the same, unless there is some sort of attribution system built in that does some sort of prediction, projection, algorithmic machine-learning data manipulation, blah blah blah. Fundamentally, without intervention, these platforms should be able to report back the same number.

Let’s make it even simpler: We will only use the metric at the very top of the ‘metric cliff’ to give these platforms less room for error.

If you aren’t familiar with ‘The Cliff’ concept, it’s because I made it up, but you can learn more about it here if you’d like. To save you some time, it effectively describes the point along the customer journey where attribution issues first take hold. Data that exists on-platform (i.e. on Instagram or Facebook) can be easily tracked by Meta because they retain that data. Once someone clicks a link, they are brought off-platform, where Meta’s pixel has to pick up the data, and that’s where the trouble begins and data starts falling off (the cliff).

So, the metric that sits immediately after leaving the platform and landing on the intended page: website page views. Every additional step after that becomes increasingly difficult to track, losing more data along the way.

By judging the platforms on a simple website visit, we are giving these analytics platforms the easiest possible off-platform event to capture, decreasing the room for error.

This is where it gets interesting, though. If all of these platforms are in the business of reporting data accurately, then the methods they use should be reasonably similar, and the end results should be the same. Except they weren’t. Not even close.

Looking at my source website for this experiment, here’s what these platforms reported for website page visits (or Page Views) over a 14-day period (from June 1st - June 14th):

  • Google Analytics: 2,804

  • Heap Analytics: 2,869

  • HubSpot Analytics: 1,469

  • Facebook Pixel: 4,207

If we assume that Google Analytics is our ‘gold standard’ and therefore our baseline, let’s take a look at how far off the other platforms were (there’s a quick sketch of the math right after this list):

  • Heap Analytics: 2.3% over, not bad

  • HubSpot Analytics: 47.6% under, terrible

  • Facebook Pixel: 50% over, where do I even begin with that…
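
If you want to sanity-check that spread yourself, here’s a minimal Python sketch of the math: just the percent deviation of each platform from the Google Analytics baseline, using the page-view counts reported above.

```python
# Percent deviation of each platform's reported page views from the Google Analytics baseline.
# The counts are the 14-day totals reported above.
baseline = 2_804  # Google Analytics

reported = {
    "Heap Analytics": 2_869,
    "HubSpot Analytics": 1_469,
    "Facebook Pixel": 4_207,
}

for platform, page_views in reported.items():
    deviation = (page_views - baseline) / baseline * 100
    print(f"{platform}: {deviation:+.1f}% vs. Google Analytics")

# Heap Analytics: +2.3% vs. Google Analytics
# HubSpot Analytics: -47.6% vs. Google Analytics
# Facebook Pixel: +50.0% vs. Google Analytics
```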

It’s pretty clear that nobody knows exactly how many page views I got during that timeframe, though you could reasonably say that Google and Heap agree for the most part. Therein lies the problem, though:

If none of these numbers match then how could we possibly ever trust any platform to report the correct data?

If we are using analytics platforms, it’s because we are searching for data that tells a story, guides our decision-making, and helps us solve problems. If the data we are receiving isn’t correct, how could our decisions possibly be set up for success?

That is where the conclusion surrounding attribution comes into play.

THE DATA SETS FEEDING ATTRIBUTION MODELS ARE BAD

Every attribution platform you use has two fundamental problems. Starting from the top: the data they use in their models is bad. If the data you’re inputting into your system can’t be trusted, how can the output be trusted?

The second problem is that most of these platforms use some sort of proprietary algorithm or machine-learning model to reasonably predict what the final numbers will be. This isn’t necessarily a bad thing on its own, because mathematics can be an incredibly powerful tool for building accurate predictive models. The problem, however, is in the word itself: predictive.

These algorithmic/machine-learning models present another two problems of their own. Again, they are built on bad data, so they predict numbers from inaccurate numbers, leading to an output that is even further off than what went in. The second problem is that even if the data going into them were correct, no model can predict the outcome with 100% accuracy, so any variation at all means the output is still, effectively, a guess.

If we are using attribution platforms to make decisions, and none of them can agree, then how can we possibly use the information being provided to make the right decisions? Well, the answer to that is a sort of silver lining here:

“MAKE DATA-INFORMED DECISIONS, NOT DATA-DRIVEN ONES”

Perhaps the title of this newsletter was sensationalized, but I hope it helped you get this far. The quote above is something that I’ve been preaching a lot lately, and is something that I am becoming more and more invested in as time goes on.

If none of the data we are looking at is 100% accurate, then we cannot be making data-driven decisions. Data-driven decisions based on bad data are bad decisions. Instead, I encourage everyone to make data-informed decisions. Data-informed decisions take the data for what it is, an estimation, and compound it with experience, knowledge, and tested strategies to produce something much more effective.

The equation now changes from ‘CTR is X%, therefore I will do Y’ to something more like ‘these metrics are telling story X, so I will choose strategy Y from a set of possible approaches based on the context of the data presented’.

This is where, I think, these attribution platforms can shine. Instead of using them as a ‘single source of truth’, using them as platforms that provide general direction will yield the best results, in my opinion.

If you can take that data, understand the inherent flaws in it, and make an educated decision that is informed by the data, but not exclusively molded to it, you can win.

Attribution may be neither perfect nor accurate, but it can help inform your decisions when producing an educated, considered strategy.

MY CLIENTS WANT ROAS NOT CONSPIRACY THEORIES

I know, this concept is great in theory, but how does it apply practically when you have clients who are micro-monitoring your ROAS on a weekly basis? Surely you can’t just tell them that ROAS doesn’t matter because it’s based on false data. Your clients need some way to judge your performance, and if it’s not the ol’ trusty ROAS, then what is it?

The answer is MER, or Marketing Efficiency Ratio. Effectively, this measures all ad spend against all revenue. There may be some nuances to ‘all revenue’ here on a client-by-client basis, but measuring success on MER is your path to clear results. Two numbers that do not lie are the amount spent inside the ad platforms and the revenue your store actually recorded. If you can set a goal based on MER, you can provide your clients with full transparency, and have a number you can confidently rely on not to lie to you. Everything else is noise.

There are variations of MER too, where you only measure prospecting ad spend against revenue from net-new customers, and so on. In this way, you can get fairly granular with it. At the end of the day, though, the goal is to stay as high-level as possible: am I still profitable after spending on ads? MER can tell you that story, and you don’t need to worry about attribution loss.
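
In case the math is useful, here’s a minimal sketch of that calculation. The spend and revenue figures are made up for illustration, and MER is computed the usual way: total revenue divided by total ad spend.

```python
# Minimal MER sketch. All figures below are made up for illustration.
total_ad_spend = 25_000    # everything spent across every ad platform this period
total_revenue = 100_000    # everything the store recorded in the same period

mer = total_revenue / total_ad_spend
print(f"Blended MER: {mer:.2f}")  # 4.00 -> every $1 of ad spend came back as $4 of revenue

# A more granular variation: prospecting spend measured against net-new customer revenue only.
prospecting_spend = 15_000
new_customer_revenue = 45_000
acquisition_mer = new_customer_revenue / prospecting_spend
print(f"Acquisition MER: {acquisition_mer:.2f}")  # 3.00
```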

THANK YOU

If you’ve made it this far, thank you. I am testing out a new format that I am calling “Thinking Out Loud” where I speak my mind, transparently, about topics I have been thinking about recently. It’s possible that what I am saying doesn’t make sense, or that it inspires others to rethink what they know about marketing. The goal is to be open in hopes that more information will come to me as I put it out.

Never stop learning, never stop sharing.

Cory
