Data Protection Plan

Adobe Summit in London, May 2016

Hours ago I wrapped up my latest speaking engagement: a co-presentation with Adobe’s Jan Exner. Jan and I have been working from opposite ends of the same problem: the need for better structure beneath analytics implementations. For years he’s been pushing the idea on the development side, while I’ve been preaching it from the marketing/analytics side.
In our Summit session, Jan advised developers to set up framework-level tests that verify every site page includes a Tag Management System (TMS) and a Data Layer. My message for the analytics folks concerned user-level tests and the validation of reporting data. Once Adobe releases the session recordings, I’ll share them here; for now, a few highlights.
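
To make the framework-level idea concrete, here’s a minimal sketch of such a test. It assumes Adobe DTM/Launch as the TMS (visible as a window._satellite global) and a CEDDL-style digitalData object as the data layer, and it uses Playwright purely as an illustrative harness; the site URL, page paths, and field names are placeholders you’d swap for your own.

    import { test, expect } from '@playwright/test';

    const BASE_URL = 'https://www.example.com'; // placeholder: your site
    const paths = ['/', '/products', '/checkout']; // placeholder: pages worth guarding

    for (const path of paths) {
      test(`TMS and data layer present on ${path}`, async ({ page }) => {
        await page.goto(BASE_URL + path);

        // TMS check: Adobe DTM/Launch exposes a _satellite global once its script loads.
        const hasTms = await page.evaluate(
          () => typeof (window as any)._satellite !== 'undefined'
        );
        expect(hasTms, `TMS missing on ${path}`).toBe(true);

        // Data layer check: a CEDDL-style digitalData object should name the page.
        const pageName = await page.evaluate(
          () => (window as any).digitalData?.page?.pageInfo?.pageName
        );
        expect(pageName, `data layer page name missing on ${path}`).toBeTruthy();
      });
    }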

Data Gaps

Here are the graphs Jan and I led with, both showing obvious data quality gaps. The collective groan that rose from the audience when these pictures appeared told me they knew firsthand the gut punch such reports deliver.
[Image: Data Gaps - Real-Life Examples]
It didn’t matter what the metrics were; all anybody had to see was the big chunk of obviously missing data in the middle of each graph. That’s the point when executives ask, “What happened here?” and we all know the answer is NOT about the customer, but about the tracking. When something falls off the map suddenly and completely, it almost always means that, for one reason or another, the analytics system stopped getting the signals it needs from the website.
Analytics professionals hate gaps like these. They hurt our credibility in the business, and they make stakeholders nervous about venturing forward into the data paths we’ve laid out. It’s like having a troll living under the bridge you built. It doesn’t matter how quickly you put things right after an “incident”; every traveler to come after will be constantly second-guessing what the data portends, timidly testing each wooden slat before trusting it with their full weight, and terrified of what dangers might lurk beneath even the safest of assumptions.
To get a sense of the problem’s scope, Jan asked for a show of hands from those spending at least 10% of their time on data quality issues, and virtually every hand shot up. Most hands had dropped by the time he got above 30%, but even 10% is a considerable portion of our working lives spent on something we’d rather not have to do at all.

Wrap Up

We concluded by sharing some code that developers and marketers alike can use to validate the analytics implementation across their sites at any time, or, better still, on a daily schedule covering every property that needs tracking. Applying these tests consistently across both production and development environments is vital if we’re to break the data gap cycle. And my advice came with a cherry on top: as you verify each component, feed the results of that audit back into Adobe as another real-time metric. In a coming post I’ll extol each and every virtue of this best practice (i.e. feeding your audit results back into your analytics reports); for now, let me simply say that it gives you a way to banish the data-damaging trolls for good, all the while reinforcing the confidence your stakeholders need to have in the data they base their decisions on every day.
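
To give you a flavor of what that feedback loop might look like, here’s a minimal sketch that pushes an audit run’s pass/fail counts into Adobe Analytics via the Data Insertion API. The tracking-server namespace, report suite ID, event numbers, and visitor ID below are all placeholders rather than anything from our session; check your own Adobe configuration before adapting it.

    // Placeholders throughout: swap in your own Adobe namespace, report suite,
    // and event numbers before using any of this.
    const TRACKING_SERVER = 'https://mycompany.sc.omtrdc.net/b/ss//6';
    const REPORT_SUITE_ID = 'myrsid';

    // Send one "audit" hit with pass/fail counts as numeric events,
    // using Adobe's Data Insertion API (XML over HTTP POST).
    async function reportAudit(pageName: string, passed: number, failed: number): Promise<void> {
      const body = `<request>
      <sc_xml_ver>1.0</sc_xml_ver>
      <reportSuiteID>${REPORT_SUITE_ID}</reportSuiteID>
      <pageName>${pageName}</pageName>
      <visitorID>audit-bot</visitorID>
      <events>event10=${passed},event11=${failed}</events>
    </request>`;

      const res = await fetch(TRACKING_SERVER, {
        method: 'POST',
        headers: { 'Content-Type': 'text/xml' },
        body,
      });
      if (!res.ok) throw new Error(`Data insertion failed: ${res.status} ${res.statusText}`);
    }

    // e.g. at the end of a nightly run: await reportAudit('audit:homepage', 12, 1);

Once those counts land in the report suite, they behave like any other metric: you can trend them, dashboard them, and alert on them, which is exactly what lets you spot a gap before your stakeholders do.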
[Photo: Jan Exner and Craig Scribner co-presenting in London, 2016]
