A 5-Step Web Analytics Testing Framework

October 14, 2021 Michael Fong

Web analytics solutions are powerful tools for determining if your marketing efforts are reaching and engaging the right audience. With web analytics data at your disposal, you can optimize your campaigns and make important business decisions. 

But web analytics tracking is vulnerable to errors whenever someone makes a change to your site or implementation. If you have many digital properties, or a single very complex one, it’s more than likely that you have duplicate, unauthorized, or missing web analytics tags on your site. And when you have millions of instances of analytics and marketing tags across your website, where do you even begin to cleanse the data?

To protect the integrity of that data, you need the right solutions and processes in place to govern your web analytics. Then you can trust your data, realize ROI on your technology spend, and do it all more efficiently than if you were to spot-check everything manually.

The following framework encompasses the essential components of an effective web analytics testing program.


Plan Your Implementation

Companies often start installing tags and measuring KPIs without a comprehensive plan. Planning is the essential first step of a web analytics testing framework because you need to know what your objectives are before you can test whether they are being met.

Implementing tracking without a plan can also leave you with legacy data that no one remembers the reason for collecting. Tagging plans (also called solution design documents) help you avoid data overload and document exactly what you’re collecting and why. They are especially important for getting everyone on the same page when multiple teams, such as marketing, analytics, and development, are working together.
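For teams that want their tagging plan in a machine-readable form, a structure like the following can serve as a single source of truth for developers and testers alike. This is only a sketch; the event names, owners, and required variables are hypothetical examples, not part of any particular plan:

```javascript
// Minimal sketch of a machine-readable tagging plan.
// Every event, owner, and variable name below is a hypothetical example.
const taggingPlan = [
  {
    event: "add_to_cart",            // data layer event name
    owner: "marketing",              // team responsible for this tag
    purpose: "Track cart additions for funnel reporting",
    requiredVariables: ["productId", "price", "currency"],
  },
  {
    event: "newsletter_signup",
    owner: "analytics",
    purpose: "Measure lead generation from the footer form",
    requiredVariables: ["formLocation"],
  },
];

// Look up the spec for an event so everyone works from the same definition.
function specFor(eventName) {
  return taggingPlan.find((entry) => entry.event === eventName) || null;
}
```

Keeping the plan in version control alongside the site code makes it easy to review tagging changes the same way you review any other change.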

If you need to come up with a new tagging plan, you can use solutions like ObservePoint’s Technology Governance to scan your site to find out exactly what data is currently collected. Then you can evaluate what’s expendable, repetitive, or outdated and come up with a plan that maps back to your business goals.


Deploy or Update Tracking

The tagging plan or solution document should make its way to the developers or whoever is responsible for actually implementing the tags. When a dev team is equipped with a tagging plan, they can code the tags and data layer variables to the specifications of the plan as they are building the site. Once the tracking is deployed, it’ll be ready for testing. 
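As a rough illustration of what coding tags to a plan’s specifications can look like, here is a minimal Google Tag Manager-style data layer push. The event name and variables are hypothetical, chosen only to match the sketch of a tagging plan above:

```javascript
// Hedged sketch: pushing a tagging-plan event onto a GTM-style data layer.
// In a browser this would be window.dataLayer; here it is a plain array.
const dataLayer = [];

function trackEvent(eventName, variables) {
  // GTM-style data layers expect the event name merged with its variables.
  dataLayer.push({ event: eventName, ...variables });
}

// Example: fire the hypothetical "add_to_cart" event with its required variables.
trackEvent("add_to_cart", { productId: "SKU-123", price: 19.99, currency: "USD" });
```

Wrapping pushes in a helper like `trackEvent` gives the team one place to enforce the plan’s conventions rather than scattering raw pushes across the codebase.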


Test in Staging

Finding problems in a staging environment is far less expensive than discovering errors after a site or update is already live. As the last line of defense before production, testing at this point should be iterative and conducted under different conditions to be as thorough as possible.

Tag behavior should be tested against the tagging plan’s requirements to make sure that when an action is taken, such as a button click, the corresponding measurement fires. Analytics testing should also be integrated into the QA process for every website release, since any site update can affect tracking, not just changes to the tracking itself.
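One simple way to automate this kind of staging check is to capture each data layer push triggered by a simulated action and compare it against the required variables from the tagging plan. The sketch below assumes hypothetical event and variable names:

```javascript
// Sketch of a staging-environment check: given a captured data layer push
// and the tagging plan's required variables, report anything missing.
function validatePush(push, requiredVariables) {
  const missing = requiredVariables.filter((name) => !(name in push));
  return { valid: missing.length === 0, missing };
}

// Simulate a push captured after a button click in a staging test run.
// The event and variables here are hypothetical examples.
const captured = { event: "add_to_cart", productId: "SKU-123", price: 19.99 };
const result = validatePush(captured, ["productId", "price", "currency"]);
// result.valid === false; result.missing === ["currency"]
```

A check like this can run inside existing end-to-end test suites, so a release that drops a required variable fails QA before it ever reaches production.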


Validate in Production

Once you’ve published a new release to the production environment, verify that nothing broke when it went live. Tagging errors are common at the moment of publication because of human error or unforeseen downstream effects of code changes. Even with thorough testing in staging and the best communication between teams, it’s critical to test your analytics implementation after site updates go live to confirm your tracking is still functioning correctly.


Monitor in Production

Continuing to monitor analytics data at regular intervals will prevent costly surprises. Tagging errors occur frequently on live sites for many reasons, including updates that overwrite code, piggybacking tags introduced by vendors, or just natural entropy.

Most companies don’t have the resources to continually test and monitor web analytics implementations, so an automated solution like Technology Governance can conserve manpower and give you peace of mind. Set up regularly scheduled scans in Audits, monitor important conversion paths with Journeys, and create Rules and alerts to make sure you’re the first to know when errors crop up. 
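For teams rolling their own lightweight monitoring instead, even a basic check for missing or duplicate tags in a fetched page’s HTML can catch the most common errors. This is a homegrown sketch, not ObservePoint’s API, and the analytics script URL is a placeholder:

```javascript
// Hedged sketch of a homegrown tag monitor: count occurrences of the
// analytics script in a page's HTML and flag missing or duplicate tags.
// ANALYTICS_SRC is a placeholder URL, not a real analytics vendor.
const ANALYTICS_SRC = "https://example.com/analytics.js";

function auditPage(html) {
  const occurrences = html.split(ANALYTICS_SRC).length - 1;
  if (occurrences === 0) return { status: "missing" };
  if (occurrences > 1) return { status: "duplicate", count: occurrences };
  return { status: "ok" };
}

// A scheduler (cron, CI job, etc.) could fetch key pages and run auditPage
// on each, alerting whenever the status is anything other than "ok".
```

Checks this simple won’t replace a dedicated governance tool, but they illustrate the principle: scan on a schedule, compare against an expected state, and alert on drift.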


Implementing the Web Analytics Testing Framework

As powerful as analytics and marketing technologies can be, they rely on correct implementation. Having an automated solution that tests and governs tags and analytics will allow you to trust the data you use to make these crucial business decisions.

If you’d like to see a sample test of the web analytics on your site, schedule a demo with an ObservePoint representative or submit a request for a pre-recorded video of an audit on your chosen domain.

About the Author

Michael Fong

Mike is the Sr. Manager of Product Go To Market at ObservePoint and assists in aligning the Product, Marketing, and Revenue teams on product strategy, value propositions, and promotion. Previously a Senior Consultant and Solutions Engineer on ObservePoint’s EMEA team in London, Mike has been integral in ensuring ObservePoint users are obtaining the highest quality of data from their marketing technologies. With over 10 years of experience in the analytics world, Mike is an expert when it comes to data analytics, SQL, problem solving, and spreading good vibes.
