This Time, Bring Good News

In the 2017 world of IT and systems engineering, test-driven development (TDD) is the mantra. No one writes a line of code these days without also writing tests for it. If there is a bug in that code, it gets caught and fixed before the code goes live, reducing the risk of breakage.
This kind of discipline has not traditionally been applied to analytics, contributing to a perception among some marketers and executives that marketing analytics isn't always a reliable enterprise. Bringing a more rigorous approach could change that. A TDD-style method would require that whenever you make a change to your analytics, the change is fully tested, and recorded, before it's deployed. This takes more time, and may frustrate management, but it results in better quality control.
Let’s use an example. There are a number of good products on the market (Observepoint, Hubscan, QA2L) that crawl an organization’s website, testing for broken links or missing tracking data, and alerting the team when data tags need to be fixed.

Most of these tools function like an alarm system. If the tool finds that some website tags aren’t firing, boom, the siren goes off. An analyst goes in and makes a change, and the tool goes back to humming normally. Everything’s good. The alarm is silenced.
But once the fix is made, what happens to the resulting data? In almost all cases, there will be no permanent record of what was broken, how long it was out of commission, or what fix was needed. There will be no record of who made the fix, and no clear record that the data, for some unknown period of time, was compromised.
If you’re an ecommerce company, that is a problem. An executive looking at a data report may see no visits to a given campaign page for three months. She may not know that for some period of time the tags on that page weren’t firing. Instead, what she sees is a chart showing some level of traffic, then nothing, then more traffic. She may make some costly assumptions.
She may conclude that some messaging was ineffective (whatever ran during the gap when data was lost) and other messaging was superior (seen once the problem was fixed). Or maybe she believes she’s found evidence for seasonal fluctuation in customer response. Whatever creative insights she plans to act on, because they’re based on a lie, they’re unlikely to be good for business.
What if the analyst who responded to the original Broken Tag Alarm from Observepoint, instead of simply making the needed fixes, had fed the scan report back into the analytics system itself? Right away, the system would create an evidence trail. Anyone who pulled data from the analytics system for that time period would see that for some interval there was an issue with broken tags, and that it was fixed on X date. It would show who made the fix, and the impact of both the break and the fix on the overall data picture.
Over time, it would create a trend line, demonstrating real changes in the organization's level of confidence in its tracking data.
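As a rough sketch of what that evidence trail could look like in practice (the record structure, names, and numbers here are hypothetical illustrations, not any vendor's actual implementation), a reporting layer might consult a list of incident records to score confidence for each day of data:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TagIncident:
    # One entry in the evidence trail: when tags broke, when the fix
    # shipped, who fixed it, and how much confidence we place in data
    # captured during the gap.
    start: date
    end: date          # date the fix was deployed
    fixed_by: str
    confidence: float  # e.g. 0.6 = 60% confidence while tags were broken

def daily_confidence(day: date, incidents: list[TagIncident]) -> float:
    """Confidence in the tracking data for a given day (1.0 = no known issues)."""
    score = 1.0
    for inc in incidents:
        if inc.start <= day < inc.end:
            # If several incidents overlap, report the most pessimistic score.
            score = min(score, inc.confidence)
    return score

# Hypothetical incident: campaign-page tags broken for three months.
incidents = [
    TagIncident(date(2017, 3, 1), date(2017, 6, 1),
                "analyst@example.com", 0.6),
]

print(daily_confidence(date(2017, 4, 15), incidents))  # 0.6 during the break
print(daily_confidence(date(2017, 7, 1), incidents))   # 1.0 after the fix
```

A dashboard could then shade any reporting period whose confidence dips below 1.0 and link back to the incident record, so a drop in traffic is never mistaken for a drop in demand.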
Looking at her reports, the executive will notice new detail points. She'll still see that no visits were recorded to a certain page for some period of time, but she'll also see that there is only 60% confidence in the data during that time, due to broken tags. She may now notice that the traffic number is significantly lower for some period after the breakage than it was before. Now she can infer, correctly, that the campaign messaging that followed the break was not responsible for a jump in traffic; in fact, it may actually be associated with a decline, since the numbers were higher before the breakage period.
Such a system would create a more coherent data picture, and would allow analysts to graph the contribution they make to the organization. If an executive could see that you were responsible for taking overall data confidence from 60% to 90% over a two-year period, and she could see instances when you caught and fixed bugs, restoring higher confidence after human error crept in, wouldn't that go some way toward demonstrating your value to the company's bottom line? Wouldn't it be great to bring good news for a change? And wouldn't that demonstrate the value of the analyst role more broadly? We think so, and we're here to help.