Confirming Complex Integrations

November 4, 2014 | Brad Perry

Best-in-class digital analysts have become accustomed to complex, dynamic tag deployments that populate variables in specific ways based on how web pages are accessed. Historically, testing whether this tooling works properly has been a laborious process requiring specialized browser-based tools, careful interpretation of raw data, and manual logging in purpose-built Excel files.

Advantages of DQM

One advantage offered by a robust data quality management platform is that much of this process can be automated through rules and regression testing. The verification of complex integrations can be a background process that frees up digital teams to be more responsive to business needs.
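To make the idea of rule-based checks concrete, here is a minimal, hypothetical sketch in Python. It assumes an analytics beacon in the standard Adobe Analytics image-request format (eVars as v1, v2, …, events as a comma-separated events parameter) that has already been captured, for example from browser developer tools or a HAR export. The page name, variables, and expected values are illustrative only, not ObservePoint's implementation.

# Minimal, rule-based regression check for a captured analytics beacon.
# Assumes the standard Adobe Analytics image-request format, where eVars
# appear as v1, v2, ... and events as a comma-separated "events" parameter.
from urllib.parse import urlparse, parse_qs

# Hypothetical beacon URL captured from browser dev tools or a HAR export.
beacon = (
    "https://metrics.example.com/b/ss/examplersid/1/JS-2.22.0/s123456"
    "?pageName=Order%20Confirmation&events=event1&v1=checkout"
)

# Rules: each page declares the variables it must populate.
# None means "must be present"; a string means "must contain this value".
RULES = {
    "Order Confirmation": {"events": "event1", "v1": None},
}

def check_beacon(url):
    """Return a list of human-readable rule violations for one beacon."""
    params = {k: v[0] for k, v in parse_qs(urlparse(url).query).items()}
    page = params.get("pageName", "")
    failures = []
    for key, expected in RULES.get(page, {}).items():
        actual = params.get(key)
        if actual is None:
            failures.append(f"{page}: missing {key}")
        elif expected is not None and expected not in actual.split(","):
            failures.append(f"{page}: {key}={actual!r}, expected {expected!r}")
    return failures

print(check_beacon(beacon) or "all rules passed")

In a real data quality management setup, checks like these would run automatically across many pages on a schedule, rather than against a single hand-captured request.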

In the white paper Data Quality and the Digital World, Tim Munsell, Lead Web Analyst at DaveRamsey.com, shared his experience: “In order to control Adobe-related costs, we limit the number of Vista rules and Discover licenses we have active at any time, but we still want to enable complex reporting on paths, tracking codes, and the like for our business analysts. One of our strategies to this end is to be very selective about what data is sent to each of our report suites. Given the complexity of this task, ObservePoint is critical to our ongoing success, constantly monitoring that the right data is being populated into Adobe’s data collectors.”

Munsell continued, “Instead of having to pay for additional data collection, we are able to execute on a logical strategy that relies on a small number of events and eVars which simultaneously save us time and better facilitate the analysis process.”
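As a rough illustration of the kind of report-suite selectivity Munsell describes, here is another short, hypothetical Python sketch (not DaveRamsey.com's or ObservePoint's actual tooling). It assumes the /b/ss/<rsid>/ path segment that Adobe Analytics image requests typically use, where multi-suite tagging lists report suite IDs separated by commas, and checks that a beacon only reaches approved suites.

# Hypothetical check of report-suite selectivity. Assumes the standard
# /b/ss/<rsid>/ path segment in Adobe Analytics requests, where multi-suite
# tagging lists report suite IDs separated by commas.
from urllib.parse import urlparse

# Illustrative allow-list: which report suites each page may feed.
ALLOWED_SUITES = {"Order Confirmation": {"examplersid-prod"}}

def suites_for(beacon_url):
    """Extract the report suite IDs a beacon is addressed to."""
    parts = urlparse(beacon_url).path.split("/")
    return set(parts[parts.index("ss") + 1].split(","))

beacon = (
    "https://metrics.example.com/b/ss/examplersid-prod,legacy-suite"
    "/1/JS-2.22.0/s1?pageName=Order%20Confirmation"
)
unexpected = suites_for(beacon) - ALLOWED_SUITES["Order Confirmation"]
print("unexpected report suites:", unexpected or "none")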

The bottom line: an organization needs to confirm its complex integrations, or it won’t get the most value from its sites’ tags. It makes no sense to have these integrations on a website without a robust data quality management system to monitor them.


This post is based on the white paper Data Quality and the Digital World by Eric T. Peterson, Principal Consultant at Web Analytics Demystified.

About the Author

Brad Perry

Brad Perry has been Director of Demand Generation at ObservePoint since June 2015. He is, in his own words, “unhealthily addicted to driving marketing success” and has demonstrated his unrelenting passion for marketing in various verticals. His areas of expertise include demand generation, marketing operations & system design, marketing automation, email campaign management, content strategy, multi-stage lead nurturing and website optimization.
