Two of the most important questions data quality analysts ask themselves are what to audit and when to audit.
In past blog posts, we’ve covered some tactical methods such as auditing cadence, creating alerts, qualifications of a data quality analyst, and characteristics of a good deployment. This three-part series builds on these foundational concepts and ties them together into the data quality management process: a strategic framework that you can use to add value to your business through data quality improvement.
Understanding the Tools
As an ObservePoint user, it helps to clearly understand the difference between audits and simulations.
A Tag Audit is a periodic scan of a particular segment of a web site. Audits should be segmented by content area or type, technology, or architecture. Audits can be run both on a regular, calendared basis and on a one-off basis.
The data generated by an audit is a moment-in-time view of the included pages and their tag deployment: specific content under specific conditions at a specific time. Tag audits can collect data for OnPageLoad and OnClick tags, and track some site performance data such as page load time and HTTP status codes.
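Conceptually, each page in an audit yields a snapshot of status, load time, and detected tags. A minimal sketch of that per-page record, assuming hypothetical vendor script signatures (the hostnames below are illustrative, not ObservePoint's detection logic):

```python
import re
import time
import urllib.request

# Hypothetical tag signatures -- substitute the script hostnames
# your own tag deployment actually uses.
TAG_SIGNATURES = {
    "Tag Management System": "googletagmanager.com",
    "Primary Analytics": "google-analytics.com",
}

def detect_tags(html):
    """Scan page markup for known tag script sources."""
    scripts = re.findall(r'<script[^>]+src="([^"]+)"', html)
    found = set()
    for name, signature in TAG_SIGNATURES.items():
        if any(signature in src for src in scripts):
            found.add(name)
    return found

def audit_page(url):
    """Fetch one page and record a moment-in-time snapshot:
    HTTP status, load time, and which known tags appear."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
        status = resp.status
    return {
        "url": url,
        "status": status,
        "load_time_s": round(time.monotonic() - start, 3),
        "tags_found": detect_tags(html),
    }
```

Running `audit_page` over every URL in a segment produces the kind of moment-in-time data set an audit report is built from.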
Simulations are targeted, high-frequency tests of specific content under specific conditions. They differ from audits in that you must specify exactly what content to load and what conditions to simulate on each test. When any parameter fails to verify, the system generates an alert. Simulations are best used on content that is proximate to reportable metrics – attribution, ROI, KPIs. Since a majority of operational data relies on some key content, you’ll be informed if and when tracking performance is degraded.
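The verify-and-alert loop a simulation performs can be sketched as a simple comparison of observed parameters against expected values. This is an illustration only; the parameter names and structure are assumptions, not ObservePoint's actual schema:

```python
# Hypothetical expected conditions for one simulated test step --
# parameter names here are illustrative placeholders.
EXPECTED = {
    "page": "/booking/step-2",
    "event": "purchase",
    "currency": "USD",
}

def verify_simulation(observed, expected=EXPECTED):
    """Compare parameters captured during a simulated visit against
    the expected values; return a list of alerts (empty list = pass)."""
    alerts = []
    for key, want in expected.items():
        got = observed.get(key)
        if got != want:
            alerts.append(f"ALERT: {key!r} expected {want!r}, observed {got!r}")
    return alerts
```

Because the test runs at high frequency against the same metric-proximate content, a non-empty alert list tells you exactly which parameter degraded and when.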
The Data Quality Management Process
Data Quality Management (DQM) combines the tools, strategy, and processes required to measure, monitor, and improve the quality of digital marketing data – and in effect, its utility and value in the enterprise.
The key to improving data quality for digital marketing tools is recognizing the need for active data quality management. This means applying proper focus and segmentation techniques to target the correct results, performing prescriptive analysis, making recommendations, then realizing and reporting the improvements. Few people value data quality for its own sake, but attention to this process will keep your organization on the solid foundation of reality.
Forming a Problem Statement
The Data Quality Management Process will help you create a problem statement that encompasses when, what, and why you are conducting a data quality investigation. Following the diagram from left to right, you should eventually be able to form problem statements that look like this:
“Because our web team rolled out a new home page and primary content area for our corporate brand, we need to audit this section of the web site to ensure our Tag Management System, Primary Analytics System of Record and other essential tags have been successfully migrated to the production environment without any errors that will cause data corruption.”
“In anticipation of an updated dynamic booking flow that uses [insert tool name] to recommend upsells in Step 2 based on input in Step 1 and data from [insert tool name], we need to confirm that: 1) on-click variables fire correctly in Step 1 when specific parameters are chosen; 2) tags from [tools 1, 2, 3, and 4] fire correctly OnClick; 3) the “test” creative ad is served in Step 2; and 4) [tags 1, 2, 3, 4] fire OnClick to track the conversion.”
Data Collection Triggers as a Signal of When to Audit
Data collection triggers are your signal that an investigation into a marketing channel should be conducted. Triggers come in two types, and each type informs your segmentation technique in the next step.
A business condition is an external trigger – something that has happened that necessitates an audit or simulation. For example, the web site was redesigned, so you need to confirm that all tags made it through the migration process. An external trigger could also be a question raised about data quality – a need to confirm that a deployment is correct.
An internal campaign is an internal trigger – deploying a new tag, running an email / ad / testing campaign, or changing an architecture. An internal trigger gives you the opportunity to have proactive conversations about data quality. Many internal campaign triggers will call for the use of simulations to actively monitor data quality on specific metric-proximate content.