Analytics professionals like you are responsible for fueling the controlled burn that is your analytics implementation. You’re constantly adding new tags, variables, metrics, and dimensions to keep your company warm with new insights and next steps.
Then a loose ember flies out of the flame. What do you do now?
In real life, spotting and putting out a loose ember from a fire is an almost automatic reaction.
Spotting errors in your analytics implementation, on the other hand, is a much more hands-on process. Errors aren’t always readily apparent, and the looming risk of missing an error and igniting long-term data quality problems is a serious concern.
Is there a solution to make spotting analytics errors more automatic so you can enjoy the warmth of the flame instead of running around stomping out loose embers?
Yes. The solution is automated analytics monitoring.
What is analytics monitoring?
Analytics monitoring is a technology-supported business practice that identifies and alerts you to analytics tracking errors on your website. Errors like broken analytics tags, missing variables, or inconsistent data layer values are all examples of issues that, though small at first, could ignite larger problems if not resolved quickly. Analytics monitoring mitigates these risks by spotting errors early and alerting you before the flames get out of control.
How does analytics monitoring work?
Analytics monitoring relies on automated solutions that periodically scan your live implementation for tracking errors. These scans take two forms: Web Audits and Web Journeys.
Govern Tags Across Your Site with Web Audits
A Web Audit is a scan of a batch of pages that verifies data is being collected properly on those pages.
During the scan, the analytics monitoring solution captures every network request fired from each page and identifies each request based on its unique structure.
For example, the analytics monitoring solution can identify requests to www.google-analytics.com as being requests made by the Google Analytics script installed on the page. Being able to identify tags like Google Analytics is important because:
1. If you want Google Analytics to be present, the monitoring solution can verify that it’s always present on each page.
2. If you don’t want Google Analytics to be present, the monitoring solution can report if someone deployed Google Analytics without authorization.
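The actual detection logic inside a commercial monitoring tool is proprietary, but the core idea of identifying tags by request structure can be sketched in a few lines. Everything here is a hypothetical illustration: the `TAG_SIGNATURES` map, and the `identify_tag` and `audit_page` helpers, are names invented for this example.

```python
from urllib.parse import urlparse

# Hypothetical signature table mapping a request hostname to a known tag.
# Real monitoring tools ship large libraries of these patterns.
TAG_SIGNATURES = {
    "www.google-analytics.com": "Google Analytics",
    "bat.bing.com": "Microsoft Advertising",
}

def identify_tag(request_url: str) -> str:
    """Identify which vendor fired a network request, based on its host."""
    host = urlparse(request_url).hostname or ""
    return TAG_SIGNATURES.get(host, "Unknown")

def audit_page(request_urls, expected_tags, forbidden_tags):
    """Flag expected tags that never fired and unauthorized tags that did."""
    seen = {identify_tag(url) for url in request_urls}
    return {
        "missing": sorted(expected_tags - seen),
        "unauthorized": sorted(forbidden_tags & seen),
    }
```

Run against the requests captured from a page, `audit_page` covers both cases above: a tag you want that is absent shows up under `missing`, and a tag you never authorized shows up under `unauthorized`.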
Web Audits become even more useful when you combine them with data validation in the form of Rules. Rules are pre-defined requirements that check the web pages against the criteria you set up, validating that your analytics and marketing tags are collecting expected values.
For example, if you expect your Google Analytics tag to always pass an advertiser ID (a number) into custom dimension 11 (cd11), then you can set a Rule to make sure that cd11 always has a numeric value on every page.
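Since Universal Analytics hits pass custom dimensions as `cd<index>` query parameters on the request, that Rule amounts to parsing the hit and checking one parameter. A minimal sketch, assuming the hit arrives as a URL string (the function name `check_cd11_is_numeric` is invented for this example):

```python
from urllib.parse import urlparse, parse_qs

def check_cd11_is_numeric(request_url: str) -> bool:
    """Rule: the Google Analytics hit must carry a numeric value in cd11."""
    params = parse_qs(urlparse(request_url).query)
    values = params.get("cd11")
    # Fails if cd11 is absent entirely or present but non-numeric.
    return bool(values) and values[0].isdigit()
```

A monitoring solution would evaluate a Rule like this against every matching request on every audited page and alert you on the first failure.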
Ultimately, the combination of Web Audits and Rules helps you to:
1. Use your analytics technology to its fullest.
2. Verify your data is accurate.
3. Speed up your website by removing unauthorized or outdated tags.
4. Comply with data protection initiatives (like GDPR and CCPA).
Test Critical Conversion Paths with Web Journeys
While Web Audits scan batches of pages, Web Journeys scan sequences of pages or actions on individual pages. Scanning sequences of actions is valuable for analytics monitoring because most implementations will have event-based tracking on each step of the site’s most critical conversion paths.
Real-Life Example of Event-Based Tracking
Let’s take an example from Verizon Wireless’s home page, where Verizon is offering a new deal for a Samsung Galaxy S10 5G. A basic conversion path and the associated tracking/analysis might look like this:
You get the idea. As hundreds of users follow a similar path, an analyst at Verizon can use the collected data to compare this campaign's user behavior with that of other campaigns and build analyses of the data. These analyses help the business make informed decisions about its website, such as:
- How to structure their website
- How to design their campaigns
- Which products to promote on the home page
- Which cross-sell products to promote
- And more
But what if the analytics implementation were broken and unable to collect accurate data along the path? For example, what if the analytics solution failed to fire any time a user selected the color Crown Silver for their new smartphone? In that case, an analyst would get skewed data suggesting that people only wanted Majestic Black phones, never Crown Silver. Sharing this faulty data with others would lead to faulty decisions and lower sales.
To avoid irrecoverable data loss and decisions based on incomplete or incorrect data, digital analysts can use analytics monitoring to simulate user activity along these conversion paths and get notified of errors on both the User Action side and the Analytics Tracking side. Here are some examples:
Errors With Analytics Tracking
- An analytics tag doesn't fire
- An analytics tag is missing a variable
- An analytics tag is missing a value
- A variable value is formatted incorrectly
- An unauthorized technology is collecting data
- The data layer is missing a variable or value
Errors With User Actions
- The next step in the journey doesn't exist
- An input field won't accept input or is unavailable
- A checkbox or radio button is unavailable
- A button is unavailable
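The checks a Web Journey runs at each step can be sketched as a function that first verifies the user action is possible, then verifies the tracking that should result from it. The `step` structure and the `verify_step` helper below are hypothetical, invented to illustrate both error categories:

```python
def verify_step(step, page_elements, fired_tags):
    """Check one journey step for user-action and tracking errors.

    step: {"element": selector the script must interact with,
           "expected_tag": tag that must fire on this step,
           "required_vars": variables the tag must carry}
    page_elements: set of selectors actually present on the page
    fired_tags: mapping of tag name -> dict of captured variables
    """
    errors = []
    # User Action side: the button/field/link we need must exist.
    if step["element"] not in page_elements:
        errors.append(f"user action: element {step['element']!r} not found")
        return errors  # cannot interact, so tracking checks are moot
    # Analytics Tracking side: the expected tag must fire with all variables.
    tag = fired_tags.get(step["expected_tag"])
    if tag is None:
        errors.append(f"tracking: {step['expected_tag']} did not fire")
    else:
        for var in step["required_vars"]:
            if var not in tag:
                errors.append(f"tracking: {step['expected_tag']} missing {var}")
    return errors
```

A journey is then just a sequence of such steps executed in order, with any non-empty error list triggering an alert.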
Using Web Journeys, digital analysts can scan their most important conversion paths as frequently as every 15 minutes, with alerts to notify them of any issues a scan encounters. Frequent scans build confidence and make it possible to catch errors before they destroy your data quality.
Quick Rewind: Why do errors happen in the first place?
Analytics errors are bound to happen at one point or another. But when they do happen, who is at fault?
Usually, no one in particular.
Errors aren’t so much the result of negligence on the part of any one person—the real issue with analytics implementations comes from the speed of website development.
Websites are dynamic entities. Multiple teams and roles help manage and grow the site, only one of which is the analytics role. These teams aren’t always perfectly in sync, and a change by the web development team can sometimes have an unintended yet fatal effect on the analytics team’s ability to consistently gather accurate data over time.
For example, depending on how digital analysts bind analytics tracking to elements on a web page, an update to website structure can completely throw off their tracking. And I’m not talking about fundamental redesigns—even just small changes here and there, like redesigning a button or tweaking page structure, can make accurate analytics tracking difficult.
One customer I trained suddenly lost analytics traffic for a group of product pages. For days, they could pull up the pages in the browser but couldn't see that traffic reflected in their analytics tool. Finally, we found that another department in the organization had made a small change in a marketing database that affected their data. The pages were still there, but the analytics data was lost through no fault of their own.
An analytics monitoring solution can bridge the gap between the development and analytics roles by notifying the analytics team whenever a change affects their ability to gather accurate data.
The Benefits of Monitoring Your Analytics Implementation
Companies that implement analytics monitoring are able to:
1. Build organizational trust in data.
2. Resolve tracking errors before they become a serious problem.
David Goodes, Manager of Marketing Automation Capability and Data at Suncorp, achieved these exact benefits using ObservePoint’s analytics monitoring solution.
Suncorp is one of the top financial services firms in Australia and a leader in digital banking, offering services like personal and business banking, insurance, and pension programs.
Though Suncorp offers services at brick-and-mortar stores, many of their products and services are available via user-friendly, self-service experiences on their website.
Suncorp builds organizational trust in data
David Goodes had a problem.
Goodes is responsible for managing analytics on his company’s robust financial services website. Unfortunately, Goodes’ team often received doubting comments from other teams about the accuracy of the analytics data. He said:
“When you have a team of 100 marketers all looking at reports and maybe not seeing what they expect, they ask questions like, ‘You know, I’m just not sure if that tag’s working. Do you mind rechecking it one more time?’ That’s really distracting to us as an analytics team.”
While many companies talk of democratizing access to data, Goodes’ team took democratization one step further. His team used an analytics monitoring solution to regularly scan their site and output data quality reports for other teams to look at.
By deploying the analytics monitoring solution, Goodes built greater confidence in the data and saved his team considerable hassle from doubting stakeholders.
Suncorp resolves tracking errors before they become a serious problem
A few years ago, Goodes and his team needed to upgrade some of their analytics libraries—creating a fundamental change in their tracking. The team started by using manual debuggers to double-check that everything was just right before pushing their changes live. Goodes explained what happened:
“Within five minutes of going live, we received an alert from ObservePoint.”
The alert from ObservePoint notified Goodes that the variable eVar 75 (which captures important anonymous session replay data) didn’t make it into the production environment.
Data not captured in eVar 75 is unrecoverable. Had the team not implemented analytics monitoring, they could have gone weeks without noticing the problem, losing data all the while. Instead, they resolved the error quickly and kept their data intact.
Gain Peace of Mind through Analytics Monitoring
By employing Web Audits and Web Journeys to frequently scan your analytics implementation, you will be able to keep data accurate and build confidence in the data you collect from your live environment.
But what if you could not only monitor your live site, but also test for tracking errors earlier in development, such as in your staging environment? Robust analytics testing solutions like ObservePoint can help you do just that.
If your analytics testing strategy includes both prelaunch testing as well as continuous monitoring, you will have complete confidence that your implementation is tracking exactly what you told it to.
ObservePoint helps companies like you implement analytics testing as part of their regular website release schedule. Request a demo to learn more.
About the Author: Matthew Maddox