Tag Auditing Best Practices for High-Quality Analytics Data

September 22, 2017 Matthew Maddox

It starts like this: data stakeholders get excited about garnering actionable data for informed decision-making. So they decide to accelerate their data collection, adding more complex configuration to their existing tools or adopting more technologies into their implementation.

But one day their implementation goes down. What happens then? Either the data stakeholders don’t realize the failure and make bad decisions (including automated decisions), or they’re left in a mad fix trying to figure out what happened. All the while, precious data slips through their fingers.

Failing to implement a data governance program is dangerous. Best practice is to continually validate your implementation—the only way to make sure that not just one report, but every report, is correct.

Here are some specific tag auditing best practices you can apply to successfully verify your data is always right:

Building a Tag Auditing Culture

1. Build a culture of stewardship and accountability over tag implementation and auditing.

Somebody needs to be responsible for ensuring your analytics and marketing tags are continuously validated. This somebody is not the summer intern. Choose someone who:

  • Has a strong technical background
  • Understands the business value of analytics
  • Has some seniority to implement processes
  • Is data-oriented

Learn more: Data Governance Councils: The Web Analytics Governing Body.

2. Establish a reporting process for when adjustments need to be made.

Anything that can go wrong will go wrong. You can thank Murphy for that.

Make sure you have a protocol in place so that whoever notices a problem (often an analyst) can easily communicate it to the person who can fix it (usually a developer).

If data is not being sent correctly, then whoever has the power to make the change needs to know right away.

Learn more: ObservePoint API Use Cases

3. Create and maintain data governance documentation, often known as a solution design reference.

A solution design reference (SDR) documents the variables a company deploys on their website or app, along with basic explanations, formatting conventions, and expected values. An SDR is a blueprint for how all of your web and mobile technologies should be deployed, and helps keep everyone on the same page.

So if you don’t have an SDR, please make one.
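An SDR often lives in a spreadsheet, but its conventions are easy to express as data. Here is a minimal sketch in Python, with entirely hypothetical variable names and formats, of what one SDR entry and a conformance check might look like:

```python
# Hypothetical solution design reference (SDR) entries: each analytics
# variable, with a description, a formatting convention, and an example value.
import re

SDR = {
    "eVar1": {
        "description": "Page language",
        "format": r"[a-z]{2}-[A-Z]{2}",   # e.g. "en-US"
        "example": "en-US",
    },
    "event5": {
        "description": "Newsletter signup",
        "format": r"1",                   # fires with the literal value "1"
        "example": "1",
    },
}

def conforms(variable: str, value: str) -> bool:
    """Check a collected value against the SDR's formatting convention."""
    spec = SDR.get(variable)
    if spec is None:
        return False  # undocumented variable: flag it for review
    return re.fullmatch(spec["format"], value) is not None
```

With the conventions written down this way, checking collected values against the documented format is a one-liner, and undocumented variables surface immediately.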

Learn more: 7 Steps to Set Up Your Web Analytics Solution Design

Performing Periodic Audits

4. Perform daily audits on your top pages.

The frequency and size of your audits will depend on your objectives. Performing a daily audit on your top 1,000 pages will help you quickly identify problems where they matter most.

In addition, running a daily or weekly audit on your external campaigns will help verify that your campaign codes are being attributed accurately.
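As a rough sketch of what a top-pages check does under the hood (the tag signatures and page HTML below are illustrative, not a real tool's configuration), an audit verifies that each expected tag's script signature appears on each page:

```python
# Minimal sketch of a daily top-pages audit: scan each page's HTML for the
# script signatures of the tags that should be present. In practice an audit
# tool executes the page; this substring scan is a deliberate simplification.
from typing import Dict, List

EXPECTED_TAGS = {
    "Adobe Launch/DTM": "assets.adobedtm.com",
    "Google Tag Manager": "googletagmanager.com/gtm.js",
}

def audit_page(html: str, expected: Dict[str, str] = EXPECTED_TAGS) -> List[str]:
    """Return the names of expected tags whose signature is missing."""
    return [name for name, signature in expected.items()
            if signature not in html]

def audit_pages(pages: Dict[str, str]) -> Dict[str, List[str]]:
    """Map each URL to its list of missing tags (an empty list means clean)."""
    return {url: audit_page(html) for url, html in pages.items()}
```

Run daily against your top pages, a report like this tells you immediately which high-traffic pages dropped a tag.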

Learn more: How Often Should I Perform a Tag Audit?

5. Perform frequent partial audits (weekly or monthly) to identify systemic problems.

If your website is template-based, most of the problems that occur will be systemic instead of specific to individual pages. As a result, it’s not always necessary to perform a comprehensive audit of your site, which could be an extremely long process.

Learn more: Determining the Optimal Size for a Website Audit

Audits During Development

When you’re working on releasing a new website, make sure that you perform intensive audits during all phases of development and production.

6. Perform quality assurance audits during your development cycle.

During your development cycles, you want to verify that each technology is reporting as expected. The development cycle is a dynamic process, so things can break.

Establish a baseline, and compare your QA audits against that baseline.
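The baseline comparison can be sketched like this, assuming each audit is summarized as a mapping of page URL to the set of tags detected (a simplification of what a real audit report contains):

```python
# Sketch of comparing a QA audit against an established baseline.
from typing import Dict, Set

def diff_against_baseline(baseline: Dict[str, Set[str]],
                          current: Dict[str, Set[str]]) -> Dict[str, dict]:
    """Report, per page, which baseline tags disappeared and which appeared."""
    report = {}
    for url, expected in baseline.items():
        found = current.get(url, set())
        missing, added = expected - found, found - expected
        if missing or added:
            report[url] = {"missing": missing, "unexpected": added}
    return report
```

Any page that appears in the report is a regression introduced during the development cycle, caught before it reaches production.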

Learn more: Proactive Data Governance: Catching Errors in Development

7. Perform a comprehensive discovery audit before release.

Before deployment, perform a full audit to test for functionality. When scheduling this discovery audit, plan in sufficient time to make any necessary repairs before your scheduled release date.

If you don’t identify errors before deployment, you risk losing data for as long as you remain unaware of the problem, plus the amount of time required to fix it. An ounce of prevention goes a long way.

If you have an exceptionally large site, consider performing a 10,000-page audit. In our experience, a 10,000-page audit gives you as reliable a gauge of site performance as a 100,000-page audit.

8. Frequently audit your site in the production environment with rules.

Audit your site immediately after release and regularly afterward. Changes occur over time, and a regular audit will ensure that you know about data discrepancies as they arise.

Instead of having to comb through reports every time you perform an audit, set up rules that validate your network requests and alert you to any failures.
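Conceptually, a rule pairs a pattern that selects matching network requests with conditions those requests must satisfy. A minimal sketch, with hypothetical beacon hosts and parameter names:

```python
# Sketch of rule-based validation of captured network requests (the beacons
# your tags fire). The rules and URLs below are hypothetical; a real setup
# would run against traffic captured during an audit.
from urllib.parse import urlparse, parse_qs

RULES = [
    # (rule name, host substring the rule applies to, required query params)
    ("Analytics beacon carries page name", "metrics.example.com", ["pageName"]),
    ("Campaign code present on beacon", "metrics.example.com", ["cid"]),
]

def failed_rules(url: str):
    """Yield the name of every rule that applies to the request but fails."""
    parsed = urlparse(url)
    params = parse_qs(parsed.query)
    for name, host, required in RULES:
        if host in parsed.netloc:
            if any(p not in params for p in required):
                yield name
```

Rather than reading reports, you only hear from the audit when a rule fires, which is what makes frequent production audits sustainable.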

Learn more: Using ObservePoint’s Rules Engine for Robust Tag Validation

Applying Tag Auditing Best Practices Over Time

Collecting correct, compliant data is a journey, not a destination. It is a continuous process that applies to dynamic data collection, not just an individual report. By applying tag auditing best practices, you create a reliable dataset from which you can then produce accurate reports. Doing this will enhance your confidence in your implementation and facilitate good decision-making processes.


About the Author

Matthew Maddox

As Director of Enablement at ObservePoint, Matt’s mission is to educate customers so they can use the marketing technologies they select for their sites as effectively as possible. Before joining ObservePoint, Matt delivered training at Omniture and Adobe for over eight years. He was the dedicated trainer for several global companies, creating and delivering custom courses based on their corporate business requirements. With a wealth of experience solving analytics questions across many industry verticals, including e-commerce, media, finance, lead generation, and automotive, Matt offers sound direction and analytics insight.
