The Biggest Web Analytics Mistake You Didn’t Know You Were Making (Part 2 of 3)

August 4, 2016 Matthew Maddox

One of the biggest potential mistakes in web analytics that you might not be aware you’re making is using a second analytics tool to validate the first one.

I made this argument in the first post of this series, but I feel so strongly about it that it warrants further discussion.

Maybe you’re using a free solution to validate the results of your paid one because you’ve heard it’s a good analytics practice.

However, the groupthink of “everybody is doing it” puts you at a competitive disadvantage, especially in the fast-paced world of digital marketing where the ability to make strategic decisions on accurate data is critical to your success.

The last post discussed the costs of using a “free” tool to validate your web analytics—and how it is surprisingly expensive to use a free tool to validate that your paid tool is collecting data as it should.

In addition to the hidden costs, there are other problems with using a secondary solution to validate your primary one: true comparisons are nearly impossible, and when discrepancies occur, you might not be able to tell which tool is providing accurate data.

You can’t make a true comparison

The prevailing reasoning for adding a secondary web analytics technology goes: “If I use the free tool to double-check the analytics, I can rest assured that my paid tool is working correctly.”

Just think about the logic behind that for a minute.

If the tools are delivering the same data, why wouldn’t you just go with the free one and save your organization a bundle? Why not take the team to Hawaii as a bonus with all the money you saved on your primary analytics vendor?

Now, why can’t I hear the sound of any jets whooshing off to the tropics?

Oh yes, it’s because everybody knows that no tool is the same, and no matter how hard you try, you won’t get the same results.

Measurement, algorithms, and processing all differ from one web analytics solution to the next.

No surprise here.

The differences between tools give each product its competitive advantage, which is the reason you selected and/or purchased it in the first place.

Just for a test, I looked at my training website, which has both Adobe Analytics and Google Analytics tracking on it for teaching purposes.

Here are the numbers for one week on this site:

Metric        Google Analytics    Adobe Analytics
Page Views    25,791              25,663
Visits        10,313              9,734
Visitors      10,286              2,670

Page Views are pretty close; I could maybe live with that—even though it might kill me to suppress the urge to reconcile the numbers.

Visits are off by 6%. Hmmm, not liking that.

And, yikes! Look at Visitors.

If that’s how different the basic metrics are, what would happen with more complex data, like attribution models? Plus, any metric that uses visitors for the denominator is going to propagate the discrepancies.
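To make the denominator effect concrete, here is a quick Python sketch using the numbers from the table above. The 275-order count is a made-up figure for illustration only, not data from the site:

```python
# Numbers from the one-week comparison table above.
ga = {"page_views": 25_791, "visits": 10_313, "visitors": 10_286}
adobe = {"page_views": 25_663, "visits": 9_734, "visitors": 2_670}

# Relative gap between the two tools for each basic metric.
for metric in ga:
    gap = (ga[metric] - adobe[metric]) / ga[metric]
    print(f"{metric}: {gap:.1%} gap")  # page_views 0.5%, visits 5.6%, visitors 74.0%

# Any derived metric that divides by visitors inherits the gap.
# The 275 orders below are hypothetical, purely for illustration.
orders = 275
print(f"GA conversion rate:    {orders / ga['visitors']:.2%}")     # ~2.67%
print(f"Adobe conversion rate: {orders / adobe['visitors']:.2%}")  # ~10.30%
```

The basic metrics disagree by anywhere from half a percent to 74 percent, and the moment you compute a ratio on top of them, the same "conversions" tell two very different stories.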

Granted, I haven’t tried to reconcile these two tools so that they collect the same data. But why would I even want to?

Who has time for that when there are automated auditing solutions made to validate analytics data? And while such solutions may not be free, they don’t saddle you with the implementation and maintenance costs of a redundant analytics tool.

Which tool is right and which is wrong?

Looking at the chart above, the obvious question comes to mind—which tool is right and which is wrong?

Now you have to try to reconcile the data and justify any differences to your superiors. Kind of an intimidating task, don’t you think?

You could go back in history to estimate which tool is closer to reality—but if that’s your alternative, then why have the secondary tool at all? Looking at history to determine what the present should be is little better than guessing. Surely we can all see the problem with that.

The real issue, though, is that even if your tools were collecting the same data, discrepancies between them will always arise, and when they do, you have to spend an unknown amount of time and resources troubleshooting both of them.

At the end of the day, you may not even have a high degree of confidence in either solution.


It all comes down to the fact that two different solutions are going to tell you two different stories. It’s not bad that they differ, whether paid or free; it’s just that the way each tool is built and configured makes it impossible to validate one against the other.

Using a secondary tool to validate the primary one can cause more complications and can still leave you with unvalidated data that your organization might be using to make strategic business decisions.

The solution lies in a tool that was created to validate analytics solutions—automated auditing technology like ObservePoint’s WebAssurance™ and AppAssurance™. These technologies independently assure you that your analytics data is being collected as you expect. More on that in post 3.

Run a free web analytics audit and find out for yourself what ObservePoint can discover. Setup takes just a minute, there is no code to deploy, and you may be surprised by what you learn.

Check out Part 3 of this three-part blog series.


About the Author

Matthew Maddox

Matt’s mission as the Director of Enablement at ObservePoint is to educate customers to use the marketing technologies they select for their sites most effectively. Matt delivered training at Omniture and Adobe for over eight years before joining ObservePoint. He was the dedicated trainer for several global companies, creating and delivering custom courses based on their corporate business requirements. With a wealth of experience solving analytics questions in many industry verticals, including e-commerce, media, finance, lead generation and automotive, Matt offers sound direction and analytics insight.
