Audit data collection once, twice, three times

October 16, 2014 Brad Perry

Web sites are fluid channels where content is constantly added, changed, and retired. This presents unique measurement challenges for digital marketers. Keeping up with these changes can be a huge headache, and just about anyone with field experience has run up against new or changed content with incorrectly deployed tags. This is where enterprise auditing tools and a real data quality management strategy pay off.

Tim Munsell, lead web analyst and early ObservePoint adopter, offers this insight:

“We have 40 or 50 developers dedicated to the web site and we roll code twice a week. [Before ObservePoint], we didn’t have an easy process to validate page tagging on such a frequent basis. I needed to hand check new or altered pages with each rollout, which could easily lead to missed errors in the tags.”

For Munsell's team, ObservePoint is now central to an efficient process that continually validates code during development and deployment.

A threefold auditing strategy is recommended for all new content. Ideally, you can pass data collected in the development and staging environments to special testing accounts, either in your TMS or in your primary web analytics and other digital marketing tools, so that production reporting stays clean if data collection doesn't go exactly as planned.
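One common way to implement the testing-account idea is to select the analytics account ID by environment, so development and staging hits never pollute production reports. The sketch below is a minimal illustration; the account IDs and environment names are hypothetical, and in practice this switch usually lives in your TMS or data layer rather than in application code.

```python
# Minimal sketch: route hits from non-production environments to a
# dedicated test account so production reports stay clean.
# The account IDs and environment names here are hypothetical.

TEST_ACCOUNT = "UA-TEST-0000001"
PROD_ACCOUNT = "UA-PROD-0000001"

def tracking_account(environment: str) -> str:
    """Return the analytics account ID to tag for a given environment."""
    if environment in ("development", "staging", "qa"):
        return TEST_ACCOUNT
    return PROD_ACCOUNT
```

A TMS data layer variable that exposes the environment name lets tag templates make the same decision without a code change per environment.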

Test frequently during the development process

Run tag audits multiple times during any large-scale deployment effort. Doing so exposes teams throughout your company to data collection best practices.

When appropriate, share results with developers and project managers, especially if you are able to go into detail regarding what is working and what needs additional attention. In effect, you are subtly training internal resources to think about the data collection process as part of larger design efforts. Consider making tag auditing results part of each regular build meeting. If you can do this, you’ll be well on your way to creating a culture where trustworthy data is the rule, not the exception.
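A lightweight in-house audit during development can be as simple as scanning each page in a build for the expected tag snippet. The function below is a minimal sketch of that idea, not a substitute for an enterprise auditing tool: it only inspects raw HTML, so it catches missing tags but not misfiring ones, and the snippet string and URLs are hypothetical.

```python
# Minimal sketch of a per-page tag presence check. Real auditing tools
# execute JavaScript and inspect network requests; this only scans the
# fetched HTML for an expected snippet. The snippet below is hypothetical.

EXPECTED_SNIPPET = "analytics.example.com/collect.js"

def audit_pages(pages: dict) -> list:
    """Return the URLs whose HTML lacks the expected tag snippet.

    `pages` maps each URL to its fetched HTML source.
    """
    return [url for url, html in pages.items()
            if EXPECTED_SNIPPET not in html]
```

Reporting the failing URLs (rather than a pass/fail flag) makes the results easy to share with developers in a build meeting.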

Test often during quality assurance and testing phases

Closely monitor the data being passed to your data measurement systems after your development project has passed into your QA environment. If your company is particularly good at testing, the combination of a tag audit and a regular review of key variables being populated should give you a reasonably good sense of what “real world” results should look like.
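The review of key variables can be partly automated: capture the query string of a collected hit in QA and confirm that every required variable is populated. The sketch below assumes hypothetical parameter names and a hypothetical collection endpoint; adapt the list to the variables your own measurement system expects.

```python
from urllib.parse import urlparse, parse_qs

# Sketch: given the URL of a collection hit observed in QA, report which
# required variables are missing or empty. The parameter names and the
# endpoint in the examples are hypothetical.

REQUIRED_VARS = ("page_name", "channel", "visitor_id")

def missing_variables(hit_url: str) -> list:
    """Return required query-string variables absent or empty in a hit."""
    params = parse_qs(urlparse(hit_url).query)
    return [v for v in REQUIRED_VARS if not params.get(v, [""])[0]]
```

Note that `parse_qs` drops blank values by default, so a parameter sent as `channel=` is treated the same as one that is missing entirely, which is usually what you want in a data quality check.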

Test immediately following deployment to confirm data is being collected as planned

Confirm that data collection has successfully migrated from the development and staging environments to production. Most companies are accustomed to completing this step. When problems are identified in this phase, act quickly to correct them. Assign post-deployment resources tasked with verifying that data collection is correct, clearing any remaining bugs, and confirming that reporting from the new content demonstrates its value.

The Pay-off

The deployment process is not complete until all aspects of the code, including data collection code, are functional in the production environment. A commitment to data accuracy and utility across the enterprise means asterisks next to numbers (flags indicating that additional context is needed, such as an explanation of data collection problems at launch) are both unwanted and seldom seen.

This post is based on the white paper Data Quality and the Digital World by Eric T. Peterson, Principal consultant at Web Analytics Demystified.


About the Author

Brad Perry

Brad Perry has been Director of Demand Generation at ObservePoint since June 2015. He is, in his own words, “unhealthily addicted to driving marketing success” and has demonstrated his unrelenting passion for marketing in various verticals. His areas of expertise include demand generation, marketing operations & system design, marketing automation, email campaign management, content strategy, multi-stage lead nurturing and website optimization.

