Measure Twice, Cut Once: Debunked

August 18, 2014 Brad Perry


Earlier this year, we audited the Internet Retailer Top 500 and Second 500. We found that 47% of Top 500 sites and 26% of Second 500 sites deploy more than one web analytics tool. It’s an ongoing trend that’s sparked quite a bit of controversy.

Proponents of this method claim that having a “backup” data set is important in case the primary web analytics tool fails to collect data. Others claim that it helps to confirm the validity of data. It all sounds good, but what are the risks?

The Danger of Over-Deploying Tags

In the whitepaper Data Quality and the Digital World, Eric Peterson argues that this common practice is far from best practice: “While the reasons for [double-tagging] vary, we often hear clients rationalize: ‘we don’t trust Solution A so we also deployed Solution B for confirmation.’ This, in our opinion, is a mistake.”

In reality, multiple measurements compromise overall data quality, cause confusion, and create unnecessary work. Consider the following example:

It’s Like Counting Kids at a Picnic

Imagine you’re at a large family picnic. Finally, it’s time to bring out Grandpa’s famous homemade ice cream. You count the number of kids at the picnic to get a head start in dishing it out.

You count 30 children, but it seems like there are more, so you ask Aunt Doris to count as well. Aunt Doris counts 33. Now you have two answers and a 10% discrepancy. What do you do? Split the difference? Risk someone missing out? Or is it better to put out too much and potentially waste some?

Chances are, you’ll count again. Then, out of expedience, you’ll trust the final number, whatever it is, consequences be damned. Things will probably turn out all right…right?

In business, we’re not always so generous. Well-intentioned efforts to get at the truth with more data often end up causing analysis paralysis. And even when the two data sets agree, what are the chances that both are wrong?

In our audits, roughly 1 in 30 pages is missing any given tag. When you compare similar data from two independently deployed sources, you nearly double the probability of running into bad data.
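To see why, here’s a back-of-the-envelope sketch in Python. It assumes the two tools fail independently and that each misses a tag on roughly 1 in 30 pages, per the figure above; both are illustrative assumptions, not audited numbers.

# Probability that a page's data is bad with one tool vs. two,
# assuming independent failures at a rate of 1 in 30 pages.
p_missing = 1 / 30

# With one tool, a page's data is bad whenever that tool misses its tag.
one_tool_bad = p_missing

# With two tools, you hit a discrepancy whenever *either* tool misses
# its tag, i.e. 1 - P(both tags fired correctly).
two_tools_bad = 1 - (1 - p_missing) ** 2

print(f"One tool:  {one_tool_bad:.1%} of pages have bad data")
print(f"Two tools: {two_tools_bad:.1%} of pages show a discrepancy")
# One tool:  3.3% of pages have bad data
# Two tools: 6.6% of pages show a discrepancy -- nearly double

For small failure rates, 1 − (1 − p)² is approximately 2p, which is where the “double the probability” rule of thumb comes from.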

It’s Not a Perfect Science

Like children at a picnic, website visitors are constantly on the move. It’s better to have one highly accurate, audited system of record.

In his whitepaper, Eric Peterson recommends investing in one good web analytics platform and pairing it with an auditing solution like ObservePoint. This eliminates much of the tedious manual testing traditionally associated with tagging QA. What some call data janitor work, we call data quality management. Ensuring that data is clean and consistent is essential to realizing a return on digital marketing tools, effort, and spend.

We’ve made data quality management straightforward. Get a free audit to see for yourself.


About the Author

Brad Perry

Brad Perry has been Director of Demand Generation at ObservePoint since June 2015. He is, in his own words, “unhealthily addicted to driving marketing success” and has demonstrated his unrelenting passion for marketing in various verticals. His areas of expertise include demand generation, marketing operations & system design, marketing automation, email campaign management, content strategy, multi-stage lead nurturing and website optimization.
