If you’ve ever tried to use a Phillips-head screwdriver to sink a flat-head screw because that’s all you had, you probably got pretty frustrated. And after struggling far longer than you normally would, odds are you didn’t actually finish the job with that tool.
The same goes for digital analytics validation: using the wrong tool leads to frustration, and it means you won’t be able to truly validate your data.
The previous two blog posts in this series discussed the costs of using a “free,” secondary analytics tool to validate your primary solution and the problems with trying to reconcile data from two very different tools.
This post is all about using the right tool to validate your mobile app and web analytics data and what clues in your reports to look for to give you the assurance that your primary analytics tool is collecting the data you expect.
Analytics tools are not meant for validation
An analytics tool is not the same as a validation tool.
Analytics tools were never envisioned to be backup solutions to detect data discrepancies and analytics errors.
No matter what you think of their strengths or weaknesses, these tools are bona fide analytics platforms—not validation solutions.
To illustrate, let’s see how Google Analytics would perform as a real data validation tool: try configuring Google Analytics to report whether Adobe Analytics has collected the correct page names from your website. It can’t be done.
What’s more, Google Analytics can’t even tell you whether the Page Name variable is being passed correctly. It only tells you the page name Google Analytics itself collected, and that’s not even the question you want answered.
Again, it’s not Google Analytics’ fault; it’s our fault as developers and digital analysts for making it commonplace to use the wrong tool for the job.
For years many organizations have been expecting Google Analytics or another analytics tool to validate their Adobe Analytics in this way, when, curiously, these tools can’t actually tell us a single thing about Adobe implementations.
We’ll use the same tools as examples to illustrate further.
Ask Google Analytics to do these simple validation tasks:
- Tell you if the Page Name variable was passed to Adobe
- Detect if a page is not executing an Adobe tag
- Show you where Adobe tags might be inflating data
- Discover if the product string is set correctly on the product details page
- … how about the billing page?
- … or the confirmation page?
- Report if the right data is being passed to Adobe throughout the conversion funnels
- Alert you immediately if a critical process fails
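Several of the checks above come down to inspecting each page for the presence and contents of an Adobe tag. As a rough illustration of what a real validation tool automates, here is a minimal sketch in Python that scans a page’s HTML for an Adobe Analytics `s.pageName` assignment. The regex, helper names, and data shapes are illustrative assumptions; a real crawler like ObservePoint renders each page in a browser and inspects the actual beacon rather than pattern-matching raw markup.

```python
import re

# Hypothetical check: look for an Adobe Analytics page-name assignment
# (e.g. s.pageName = "Home") in a page's HTML source. Real validation
# tools render the page and capture the beacon request instead.
PAGE_NAME_RE = re.compile(r's\.pageName\s*=\s*["\']([^"\']*)["\']')

def adobe_page_name(html: str):
    """Return the page name passed to Adobe, or None if no tag sets one."""
    match = PAGE_NAME_RE.search(html)
    return match.group(1) if match else None

def pages_missing_page_name(pages: dict) -> list:
    """Given {url: html}, return URLs where no Adobe page name was found."""
    return [url for url, html in pages.items() if adobe_page_name(html) is None]
```

A secondary analytics tool has no equivalent of even this crude check, because it only ever sees its own data, never Adobe’s.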
The list goes on and on, with numerous standard things a validation tool should be able to do but a secondary analytics tool can’t.
The bottom line is that you cannot do actual validation by resorting to a free tool that was never made to perform such validation.
Even if you do attempt it, it can take a Herculean effort just to reconcile the analytics data your secondary solution collects with that of your primary solution in any meaningful way.
The beauty of real validation tools is that you don’t have to put any new code on your web or mobile properties, or spend valuable time and resources reconciling conflicting reports.
Validation technologies, like ObservePoint, can be set to automatically scan your digital properties, render the pages in a browser like an actual user, and report what data your primary tool is actually capturing and how accurately it is doing so.
Below is an example of an ObservePoint validation report from a website that ranks in the top 2,000 most popular websites in the world (according to ranking.com).
And, no, I’m not going to tell you what website it is, but take my word for it, these results are not unusual.
Here are a few quick takeaways from a small, 100-page audit:
- 454 duplicate tags (indicating inflated data)
- 6 failed business compliance rules (meaning 6% of pages aren’t collecting data as expected, such as a page name)
- Only 84% of pages return a 200 status code (leaving 16% of pages likely suffering from excessive redirects, which hurt SEO scores, or returning 404s, meaning your visitors can’t find content)
- The site has two analytics solutions on it, yet neither of them is reporting any of these issues.
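To make the audit metrics concrete, here is a small sketch of how findings like these can be tallied from raw crawl results. The data shape (a list of pages, each with a `status` code and the list of `tags` that fired) is a simplifying assumption for illustration; real audit tools define their own schema.

```python
from collections import Counter

def summarize_audit(results: list) -> dict:
    """Summarize a crawl. Each result is a dict with a page's HTTP
    'status' code and the analytics 'tags' that fired on it.
    (Illustrative schema, not any real tool's output format.)"""
    duplicate_tags = 0
    for page in results:
        counts = Counter(page["tags"])
        # Any tag firing more than once on a page inflates the data.
        duplicate_tags += sum(n - 1 for n in counts.values() if n > 1)
    ok = sum(1 for page in results if page["status"] == 200)
    return {
        "pages": len(results),
        "duplicate_tags": duplicate_tags,
        "pct_200": round(100 * ok / len(results)) if results else 0,
    }
```

Summaries like this are exactly what a secondary analytics tool cannot produce, since it never observes the other tool’s tags or the pages’ response codes.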
This audit took about 30 minutes from setup to completion and reveals several key issues that need to be addressed immediately to bring the data up to a trustworthy level of quality. And it doesn’t end there: each of these metrics has a full report with additional context and a list of the pages where the problems occur.
As an analyst, I find this exciting: being able to see exactly what is happening with your analytics and to validate the data you base decisions on.
And just think: there is no code to deploy, no reports from competing technologies to reconcile, and no doubt that you’re validating what your primary analytics solution is actually collecting. This is a completely independent audit that lets you simply click into the summary metrics and dig into the full reports.
How could a secondary analytics tool even begin to tell you this?
Even if this website had been perfectly implemented, how could they prove it? No matter how good your analytics data is, without independent verification, you won’t have the evidence to give you confidence in the numbers.
If you’re ready to stop using a faulty backup system to validate your analytics, start a free web analytics audit with ObservePoint’s validation tools. Be ready to be surprised by what it discovers, and by how easily it uncovers data problems your secondary tool could never tell you were there.
Check out Part 1 of this three-part blog series.
About the Author
Matthew Maddox