Data Quality Management Process Part 3: Data Quality Analysis

November 7, 2014 Brad Perry


In Data Quality Management Process Part 2, I covered logical segmentation techniques and technical configuration inside ObservePoint. After the results are generated, the final step is to analyze the audit and simulation data, make data quality improvement recommendations, and confirm changes. This post guides you through that final step in the data quality management process.

[Figure: ObservePoint Data Quality Management Process]

Early Phase Clean-up

At the beginning of your data quality management journey, you’ll spend some time cleaning things up. It’s a necessary task, and you’ll start to see immediate benefits. At this stage, your data collection trigger is a poor data quality condition, so the most important question to answer is: where do I start?

Use Solution Design Requirements

An up-to-date SDR is the best resource you can have when cleaning up tag deployments. The idea here is that an ObservePoint scan generates data about tag deployment, including variables and values, and if you know what the design requirements are, you can compare them to what the tags are actually doing on the site (a sketch of that comparison follows the list below). In addition to the four basics of tag deployment, the variable summary report becomes your best asset here. You’ll want to look at:

    • Page Names
    • Channel Parameters
    • Campaign Parameters
    • Other custom variables according to the SDR
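
To make that comparison concrete, here’s a minimal sketch in Python, assuming the variable summary has been exported as a list of page records and the SDR expectations have been boiled down to a simple name-to-pattern mapping. The variable names, patterns, and data shapes are illustrative placeholders, not ObservePoint’s actual export format.

    import re

    # Expected values per the SDR: variable name -> regex its value must match.
    # These names and patterns are illustrative assumptions.
    sdr_expectations = {
        "pageName": r"^[a-z]+:[a-z0-9-]+$",            # e.g. "products:blue-widget"
        "channel": r"^(products|support|blog)$",
        "campaign": r"^[a-z]{2,4}-\d{4}-[a-z0-9-]+$",  # e.g. "em-2014-fall-sale"
    }

    def find_discrepancies(audit_rows):
        """Flag pages where a variable is missing or its value violates the SDR.

        audit_rows: list of dicts like {"url": ..., "variables": {name: value}}.
        """
        issues = []
        for row in audit_rows:
            for var, pattern in sdr_expectations.items():
                value = row["variables"].get(var)
                if value is None:
                    issues.append((row["url"], var, "missing"))
                elif not re.match(pattern, value):
                    issues.append((row["url"], var, "unexpected value: " + value))
        return issues

    # Example: one audited page with a casing problem and a missing campaign variable.
    audit_rows = [
        {"url": "https://example.com/products/blue-widget",
         "variables": {"pageName": "products:blue-widget", "channel": "Products"}},
    ]
    for url, var, problem in find_discrepancies(audit_rows):
        print(url, var, problem, sep=" | ")

Run against a full audit export, a check like this turns the variable summary into a prioritized punch list of pages and variables to fix first.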

 

Use ObservePoint Best Practices

ObservePoint reports on several best practices that apply to any tag deployment. Checking vendor compliance, for example, identifies specific instances of data loss due to syntax problems with the data passed in the tag. User experience reports, although not directly related to tags, can help you improve data quality by identifying issues that can unintentionally influence user behavior in testing situations.
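
As a rough illustration of what a syntax check like that involves, the sketch below parses the query string of a collected tag request and flags required parameters that are missing, empty, or over-long. The parameter names and the length limit are hypothetical examples, not any particular vendor’s specification.

    from urllib.parse import urlparse, parse_qs

    # Hypothetical rules; a real check would follow the vendor's documented syntax.
    REQUIRED_PARAMS = ["v", "pageName", "events"]
    MAX_VALUE_LENGTH = 255  # many vendors truncate or drop over-long values

    def check_tag_request(request_url):
        """Return a list of human-readable syntax problems for one tag request."""
        problems = []
        params = parse_qs(urlparse(request_url).query)
        for name in REQUIRED_PARAMS:
            if name not in params or not params[name][0].strip():
                problems.append("required parameter '%s' is missing or empty" % name)
        for name, values in params.items():
            if len(values[0]) > MAX_VALUE_LENGTH:
                problems.append("'%s' exceeds %d characters and may be truncated" % (name, MAX_VALUE_LENGTH))
        return problems

    print(check_tag_request("https://metrics.example.com/b/ss?v=1&pageName=home&events="))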

Proactive Data Quality Management

If you’re in this phase, first of all, congratulations. You’ve got a handle on your data quality in a way that few have managed.

At this point, you’re quite agile about responding to business needs to measure, manage, and improve data quality in near real time. Instead of relying solely on the SDR, you can keep the SDR updated and reference past audit results to see what has changed. Data quality measurement happens in three ways: regularly scheduled audits and simulations in established core content areas, ad-hoc audits in hot spots, and simulations for time-limited internal campaigns.

Regularly Scheduled Audits and Simulations

This is a continuation of what was set up in early-phase clean-up. Now that data quality has stabilized in core content areas, you’ll run regularly scheduled audits and simulations in these areas to monitor data quality over time. If data quality issues creep in during a code release or architecture change, you’ll know about it. Core high value web content is monitored by simulations – key tags and variables are verified on a regular basis and you’re notified if something changes. All of this is a passive, background process that requires very low cognitive overhead so you can concentrate on active processes.
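
If you want to picture what that change detection looks like, here’s a small sketch that diffs the latest audit of your core pages against a saved baseline and raises an alert when a key variable disappears or changes. The variable names and the print-based “notification” are stand-ins for whatever your actual monitoring and alerting setup is.

    # Key variables to watch on core pages; illustrative names only.
    KEY_VARIABLES = ["pageName", "channel", "server"]

    def diff_audits(baseline, latest):
        """Return changes in key variables between two audits.

        Both arguments map url -> {variable: value}.
        """
        changes = []
        for url, variables in latest.items():
            for var in KEY_VARIABLES:
                before = baseline.get(url, {}).get(var)
                after = variables.get(var)
                if before != after:
                    changes.append("%s: %s changed from %r to %r" % (url, var, before, after))
        return changes

    def notify(changes):
        """Stand-in for email or Slack alerting; here we just print."""
        for change in changes:
            print("ALERT:", change)

    # Example: the baseline would normally be persisted between scheduled runs.
    baseline = {"https://example.com/": {"pageName": "home", "channel": "home", "server": "www1"}}
    latest = {"https://example.com/": {"pageName": "home", "server": "www1"}}  # channel disappeared
    notify(diff_audits(baseline, latest))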

Ad-hoc Audits in Hot Spots

With data quality in core content areas being monitored in the background, you can effectively address hot spots created by current business conditions. Run tests on site redesigns, development screw-ups, and new deployments. Compare results against your SDRs and update them. The agency delivers work; you test and verify.

Time-limited Internal Campaigns

When you’ve produced results and shown the value of data quality management, stakeholders from other parts of your organization will likely want your help in improving their data quality as well. For example, the ad team might feel the pain of running a campaign without tracking codes. Or the video team might need help with the quality of the engagement metrics in their players.
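
Staying with the campaign example, a quick sanity check like the one below can scan a list of landing URLs and flag any that are missing a tracking code. The parameter name cid is a placeholder for whatever campaign tracking parameter your implementation actually uses.

    from urllib.parse import urlparse, parse_qs

    TRACKING_PARAM = "cid"  # placeholder for your real campaign parameter

    def missing_tracking_codes(landing_urls):
        """Return the URLs that do not carry a non-empty tracking parameter."""
        flagged = []
        for url in landing_urls:
            params = parse_qs(urlparse(url).query)
            if not params.get(TRACKING_PARAM, [""])[0]:
                flagged.append(url)
        return flagged

    urls = [
        "https://example.com/fall-sale?cid=em-2014-fall-sale",
        "https://example.com/fall-sale",  # missing tracking code
    ]
    print(missing_tracking_codes(urls))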

Hey, looks like your visibility has improved.

Conclusion

The Data Quality Management Process is fundamental to digital marketing; after all, data is the lifeblood of everything we do. It goes beyond marketing, too: as companies do bolder things like stating web metrics in their annual shareholder reports, basing valuations on their online audience size and engagement, and depending entirely on pageviews and time on page for their revenue model, the accuracy of that data becomes key to online business. That’s why our entire business at ObservePoint is about enabling digital marketers with tools to measure, monitor, and improve the quality of their data. And we’re not just providing great tools: we’re leading the way with services, training, thought leadership, and innovative solutions to solve the problems of the future.

Have data quality questions? Get in touch with us for a free tag audit and data quality consultation.

 

About the Author

Brad Perry

Brad Perry has been Director of Demand Generation at ObservePoint since June 2015. He is, in his own words, “unhealthily addicted to driving marketing success” and has demonstrated his unrelenting passion for marketing in various verticals. His areas of expertise include demand generation, marketing operations & system design, marketing automation, email campaign management, content strategy, multi-stage lead nurturing and website optimization.
