Why Poor Data Quality in Web Analytics Happens (and How to Avoid It)

February 7, 2018 Clint Eagar

When Lord Kelvin said, “If you cannot measure it, you cannot improve it,” he probably wasn’t talking about poor data quality in web analytics. But he might as well have been.

Web analytics is about measurement and improvement—improving your traffic, web presence, and conversion rates, among other things.

However, your ability to capitalize on web analytics data is contingent upon the quality of the data you collect. Do you—marketer, executive, analyst or otherwise—want to make business decisions based on low-quality data from misconfigured analytics technologies?

Certainly not. You shudder at the thought. But nevertheless, poor data quality can happen. And here’s why.

What Causes Poor Data Quality

Errors in tagging within your website can skew your data and stifle your ability to properly analyze it.

How does this happen? It all starts with faulty data collection practices. It starts with those tracking tags you have to install on your website to make your analytics tool work. Here are some potential points of error:

  1. Lack of analytics documentation. When an organization doesn’t have standardized documentation for implementing tags, it risks each department, or even each individual, applying its own style of analytics tagging. The consequence: errors in data collection.
  2. Human error. People who install tags aren’t machines; they’re human. Humans make mistakes, which can result in misconfigured tags and bad data.
  3. Misunderstanding of tag management systems. The function of a TMS is to implement tags, not test them. Having a TMS doesn’t fully protect you from tag failure.

When tagging mistakes occur, analytical data may become infected with inaccuracy, resulting in data loss, data leakage, and one of the most heinous of crimes against analytics: data inflation.

Data Inflation

Data inflation occurs when a web page has duplicate tags collecting the same metrics. If the same tag fires twice, your data for those metrics will read twice the actual value.

This could cause confusion concerning your digital presence, the effectiveness of campaigns or the true nature of user behavior.

And you—the analyst, marketer, or whoever you are—are left wondering at the integrity of your data. Or worse yet, you make critical business decisions based on inaccurate or inflated data.

Though your analytics tool faithfully records and reports every instance of a tag firing, it won’t tell you when tags are duplicated. Unless you have someone at your side who can read JavaScript and has the time to comb through all of your digital properties (not likely), you won’t know that your data is infected, and your “informed” decisions won’t be so informed.
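To make the idea concrete, here is a minimal sketch of what a duplicate-tag check looks like in principle. It assumes Google Analytics–style tracking IDs (e.g. “UA-1234567-1”) and scans static page source; a real audit tool like ObservePoint inspects rendered pages and network requests, so treat this as an illustration only:

```python
import re
from collections import Counter

# Matches classic Google Analytics tracking IDs such as "UA-1234567-1".
# This is a deliberate simplification for illustration.
TRACKING_ID = re.compile(r"UA-\d{4,10}-\d{1,4}")

def find_duplicate_tags(html: str) -> dict:
    """Return tracking IDs that appear more than once in the page source."""
    counts = Counter(TRACKING_ID.findall(html))
    return {tag: n for tag, n in counts.items() if n > 1}

# Hypothetical page source with one tag installed twice.
page = """
<script>ga('create', 'UA-1234567-1', 'auto');</script>
<script>ga('create', 'UA-1234567-1', 'auto');</script>
<script>ga('create', 'UA-7654321-2', 'auto');</script>
"""

print(find_duplicate_tags(page))  # {'UA-1234567-1': 2}
```

A script like this only catches literal duplicates on a single page’s source. Tags injected at runtime by a TMS, or duplicated across thousands of pages, are exactly the cases where manual or homegrown checking breaks down and automated auditing earns its keep.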

Automated Tag Auditing and Governance Solutions

Listen to what our friends at Forrester have to say about the matter:

“Data quality services need to exist everywhere data processes, storage, interaction, and consumption occur. Upgrading your tools won’t suffice.” (Data Quality Market Overview: Trust Your Data To Succeed With Customers, Forrester Research, May 2015.)

To protect against bad data, you need to test your tags. But before you frantically run off to your IT department to start manually checking all of your digital properties—which would take way too long—take a deep breath and consider the following real-life example:

A global telecommunications company was in the process of rolling out a new website with verified analytics implementations. Before sunsetting the old version, an analyst looked at some of the numbers from both sites, only to find that the old site was outperforming the new one. Executives were ready to label the new version as a failure and to sunset it, but the analyst recommended that they wait and run an analytics audit.

To figure out what was really going on, this organization asked ObservePoint to audit the sites in question. Results of the tag audits—given to the executives within hours instead of the weeks it would have taken to perform a manual audit—showed that the old product pages had 300-500% data inflation caused by the presence of duplicate web tags.

The executives were about to sunset the new site even though it was actually performing much better than the old. Without the right information available, they were ready to make a major, revenue-impacting decision based on bad data.

Don’t make bad decisions based on incorrect data. Instead, give your data quality a boost with a tag audit. A free, automated audit of your website can help you quickly and efficiently identify and remedy potential data disasters—because if your data is wrong, it will be harmful rather than helpful.

About the Author

Clint Eagar

Clint gets things done. He has been building websites, marketing and optimizing them for 15 years. He claims to know a little bit about a lot of things and is relied on to execute anything quickly. Clint has been with ObservePoint since the early days and has helped support, test, and promote the product. Before coming to ObservePoint he was at OrangeSoda, running the enterprise SEO team, and before that he was a business consultant at Omniture.
