Digital properties—websites, mobile apps, and more—are central to your business. And customers spend an incredible 6+ hours a day online, providing ample opportunities to collect and analyze data.
Your website and mobile app may have hundreds (or even thousands) of customer touchpoints a day, hundreds of variables and dimensions, dozens of network calls per page, and various APIs that pull the data.
With all these moving parts, how do you ensure that analytics implementations are working properly across the board?
Tip 1: Learn the Tool
Seems obvious, right? But how often do we adopt an operational tool only to learn a few basic points and never take the time to dig any deeper? Or wait until after an installation is completed before getting trained on a new tool?
If you take the time to delve into an analytics tool and really learn how to operate it, your data collection journey—on both the analysis and implementation sides—will be smoother and you’ll be able to plan an implementation that answers real business questions.
While learning to navigate a new tool, there are a variety of questions to consider, such as:
How does the tool’s attribution work?
Multi-channel attribution is a red-hot topic in the analytics industry, but it is also one of the most poorly understood, and each tool handles attribution differently.
- Different tools have different default settings—what are yours?
- Different attribution models impact the valuation of marketing channels—what are you most interested in measuring?
- How does the tool handle the following attribution models?
  - Last Interaction
  - Last Non-Direct Click
  - Last AdWords Click
  - First Interaction
  - Time Decay
  - Position Based
- Does it have a machine-led algorithmic solution that lets you apply multiple models to find the best option for each situation?
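To make the differences between these models concrete, here is a minimal sketch of how a few of them split conversion credit across a visitor's touchpoints. The weights (e.g. 40/20/40 for Position Based) are common illustrative defaults, not any specific vendor's implementation.

```python
# Sketch of how common attribution models split conversion credit
# across one visitor's marketing touchpoints. Weights are
# illustrative, not any particular tool's defaults.

def attribute(touchpoints, model):
    """Return {channel: credit} for one conversion path."""
    n = len(touchpoints)
    if model == "last_interaction":
        weights = [0.0] * (n - 1) + [1.0]
    elif model == "first_interaction":
        weights = [1.0] + [0.0] * (n - 1)
    elif model == "linear":
        weights = [1.0 / n] * n
    elif model == "position_based":  # 40% first, 40% last, 20% spread over the middle
        if n == 1:
            weights = [1.0]
        elif n == 2:
            weights = [0.5, 0.5]
        else:
            middle = 0.2 / (n - 2)
            weights = [0.4] + [middle] * (n - 2) + [0.4]
    else:
        raise ValueError(f"unknown model: {model}")
    credit = {}
    for channel, w in zip(touchpoints, weights):
        credit[channel] = credit.get(channel, 0.0) + w
    return credit

path = ["email", "organic", "paid_search"]
print(attribute(path, "last_interaction"))  # {'email': 0.0, 'organic': 0.0, 'paid_search': 1.0}
print(attribute(path, "position_based"))    # {'email': 0.4, 'organic': 0.2, 'paid_search': 0.4}
```

The same three-touch path values paid search at 100% or 40% depending on the model, which is exactly why knowing your tool's defaults matters.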
How can the tool segment traffic?
Think of the hundreds to thousands of data points and the many ways to examine them. You can segment traffic by:
- Visitor type
- Product viewed
- Products purchased
- And so much more
Is your tool capable of making these segmentations? And, to expand, do you know how to set segments up and combine them to get the most effective intersection of segments?
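Conceptually, a segment is just a filter over hit-level records, and intersecting segments means applying several filters at once. The sketch below illustrates the idea; the field names ("visitor_type", "product") are hypothetical, not a specific tool's schema.

```python
# Illustrative sketch: segments as composable filters over hit-level
# records. Field names are hypothetical.

hits = [
    {"visitor_type": "new", "product": "shoes", "purchased": True},
    {"visitor_type": "returning", "product": "shoes", "purchased": False},
    {"visitor_type": "new", "product": "hats", "purchased": False},
]

new_visitors = lambda h: h["visitor_type"] == "new"
viewed_shoes = lambda h: h["product"] == "shoes"

def segment(records, *conditions):
    """Intersect segments: keep records matching every condition."""
    return [r for r in records if all(c(r) for c in conditions)]

print(len(segment(hits, new_visitors)))                # 2
print(len(segment(hits, new_visitors, viewed_shoes)))  # 1
```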
What about duplicates?
How does the tool handle duplicate data, such as repeated page views? If you do not know how the solution handles these occurrences, you might be working with inflated or inaccurate data.
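The inflation problem is easy to see in miniature. Below, a page view fired twice (say, by a double-loaded tag) adds a phantom hit unless it is deduplicated; the (visitor, page, timestamp) key is purely illustrative.

```python
# Sketch of why duplicate handling matters: the same page view fired
# twice inflates counts unless deduplicated. The key fields used here
# are illustrative, not any tool's actual dedupe logic.

raw_hits = [
    ("visitor_1", "/home", "2024-01-01T10:00:00"),
    ("visitor_1", "/home", "2024-01-01T10:00:00"),  # duplicate fire
    ("visitor_1", "/products", "2024-01-01T10:01:30"),
]

unique_hits = list(dict.fromkeys(raw_hits))  # order-preserving dedupe

print(len(raw_hits), "raw vs", len(unique_hits), "deduplicated")  # 3 raw vs 2 deduplicated
```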
Ideas for Mastering Your Analytics Solution
- Check your contract for training opportunities. Training opportunities often go unnoticed and unused but are a great source for accelerated learning. Most MarTech companies include training in their contracts, so be sure to check your contract and take advantage of training opportunities.
- Join analytics groups and attend labs. Learn from industry experts around the world in groups such as the Digital Analytics Association or the Analytics LABS sponsored by Adobe and ObservePoint.
- Consume content. There are hundreds of eBooks, white papers, training videos, articles, and blog posts dedicated to training on both basic and advanced capabilities of various analytics tools. Find sources you trust and consume their content regularly.
- Organize internal training. Senior analysts and architects within your own organization are an excellent resource for focused, customized training that offers great value.
- Set aside some desk hours. Spending even just one hour a week diving deeper into the tool and looking into additional resources can quickly increase your expertise.
- Integrate training into every new implementation. Technology is constantly advancing, and it is crucial to keep your team updated with analytics improvements.
To understand and troubleshoot analytics, you have to know the technology you’re working with and the language it speaks.
Tip 3: Document the Implementation
An analytics implementation can be a maze. You have thousands of pages, most likely on multiple digital properties, and each of them is tagged just a little bit differently.
How do you keep track of all those variables? How are they supposed to be set? When was the last time they were updated?
It can be a huge undertaking, but documentation is necessary.
The Tagging Plan: A Horse of Many Colors
Whether you call it a tagging plan, variable map, implementation plan, or a solution design reference (SDR), this plan is a blueprint of your analytics implementations, specifying when and how each tag should be configured on a website in an easily accessible, living document.
Without this guide, variables may fire incorrectly and data quality will suffer. Worse, it might be impossible to know exactly what data is being captured.
Think of the hundreds to thousands of data points and the many ways to examine them, and make sure to do the following when documenting:
- List every variable, such as props, eVars, and page names
- Identify where each variable is collected
- Include rules and exceptions
- Ensure the plan can be tested against the live site
- Describe any post collection data manipulation
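To make this concrete, here is a hypothetical excerpt of a tagging plan expressed as structured data. The variable names (pageName, eVar1, prop2) and their rules are illustrative, not a real implementation.

```python
# Hypothetical excerpt of a tagging plan / solution design reference.
# Variable names and rules are illustrative only.

tagging_plan = [
    {
        "variable": "pageName",
        "description": "Friendly page identifier",
        "where": "All pages",
        "rule": "Format: site:section:page",
    },
    {
        "variable": "eVar1",
        "description": "Visitor type (new | returning)",
        "where": "All pages",
        "rule": "Must be 'new' or 'returning'",
    },
    {
        "variable": "prop2",
        "description": "Product SKU viewed",
        "where": "Product detail pages only",
        "rule": "Non-empty alphanumeric SKU",
    },
]

for row in tagging_plan:
    print(f"{row['variable']}: set on {row['where']}. Rule: {row['rule']}")
```

A real plan typically lives in a spreadsheet or governance tool, but the content is the same: every variable, where it fires, and the rule it must satisfy.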
A documented analytics strategy also helps with validating data—giving rules to measure against your live implementations to verify each variable is passing the correct data, and under the correct circumstances. It should also be updated frequently to reflect your current implementations so any developer, analyst, or marketer can clearly understand how the analytics should work.
Hint: Your organization’s senior analytics architect should own and take accountability for the document, but it should be accessible to anyone who has a stake in the analytics operations or the data being collected.
Tip 4: Spot-check Your Implementation (as Appropriate)
Analytics solutions don’t always work as expected, so there will be times (either on the analysis or the implementation side) when you’ll need to verify the analytics solution is tracking properly. At these times, using a tag debugger to spot-check the implementation may be appropriate.
Spot-Checking Tools and Solutions
Also known as “debuggers,” these tools spot-check individual pages to verify that analytics tags exist and are firing correctly. (This is a somewhat limited process, since you have to manually spot-check one page at a time. We’ll show you a more efficient way in the next step.) A few solutions for debugging include:
- The Network Panel in Dev Tools
- Google Analytics Debugger (for Google tech)
- Adobe Debugger (for Adobe tech)
- ObservePoint TagDebugger (for any and all tech)
There are also proxy solutions that capture all network requests coming from your computer and allow you to look through the data being sent. Two of the most popular are:
- Charles Proxy
- Fiddler Proxy
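Whether you use a browser debugger or a proxy, what these tools ultimately surface is the query string of each analytics beacon. The sketch below decodes a hand-crafted request in the style of a Google Analytics pageview hit (not captured traffic) using only the standard library.

```python
# What a tag debugger does under the hood: decode the query-string
# parameters of an analytics beacon. This request is a hand-crafted
# example in the style of a Google Analytics hit, not real traffic.

from urllib.parse import urlparse, parse_qs

beacon = ("https://www.google-analytics.com/collect"
          "?v=1&t=pageview&tid=UA-12345-1&dp=%2Fhome&dt=Home%20Page")

params = {k: v[0] for k, v in parse_qs(urlparse(beacon).query).items()}

print(params["t"])    # hit type
print(params["tid"])  # property ID
print(params["dp"])   # page path, percent-decoded to /home
```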
You can learn more about these solutions and how they can lead to real tag governance here.
Tip 5: Implement Automated Data Governance
Things are constantly changing on your website or mobile app—you’ve got all these moving parts and, likely, several different teams adding new technologies and making frequent updates. If there is so much as one typo, things can break, and a tag debugger isn’t going to be enough.
It is essential that you test your implementations before, during, and after each and every change, and then test them again.
You don’t have time to do that with a debugger. So how can you implement repeatable tests over time? Via a data governance solution.
What Is a Data Governance Solution?
A data governance solution inspects the network requests coming off a page or app at any stage of development, checks for tagging errors based on each tag’s structure, determines which tags are on the page, and verifies whether they are collecting the right data.
The main distinctions between a debugger and a data governance solution are repeatability and scalability. While a tag debugger will allow you, a human, to more easily read requests to determine if tags are firing correctly, it can only help you one page at a time.
With a data governance solution, you can set up scheduled audits that automatically scan hundreds or thousands of pages on your site to discover all tags deployed on your properties (authorized or not, regardless of vendor) and to record the data they collect. By automating this process, a data governance solution validates that you are continuously collecting complete and accurate data.
In addition, you can set up rules within the data governance solution that correspond to the requirements in your tagging plan. If at any point the data in a request doesn’t match what you expect, you’ll receive a notification.
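In spirit, such a rule is a pattern that each captured variable must satisfy. Here is a minimal sketch of that check; the rules, variable names, and captured values are all illustrative.

```python
# Sketch of automated rule checks: compare data captured from a page
# against expectations from the tagging plan and flag mismatches.
# Rules and captured values are illustrative.

import re

rules = {
    "pageName": re.compile(r"^site:[a-z]+:[a-z]+$"),
    "eVar1": re.compile(r"^(new|returning)$"),
}

captured = {"pageName": "site:products:shoes", "eVar1": "NEW"}

violations = [
    var for var, pattern in rules.items()
    if not pattern.match(captured.get(var, ""))
]

print(violations)  # ['eVar1'] fails the case-sensitive rule
```

A governance product wires the same idea to scheduled crawls and alerting, so a mismatch like the one above triggers a notification instead of silently polluting reports.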
Save Time and Make Better Decisions
Without an automated solution, many companies spend countless hours manually validating each implementation after any changes—and often, that’s just to spot-check major pages, not the entire site. With an automated data governance solution like ObservePoint, teams can significantly reduce the hours spent manually inspecting tags and remain confident that they are capturing accurate data across their properties.
Recruit, an ObservePoint customer, used to have four full-time employees dedicated to testing their tagging implementation. Now they need only one person spending part of their time on testing, because ObservePoint has automated the process.
Bonus Tip: Validate at Key Development Milestones
Don’t wait until your website is out in the wild before checking if it’s collecting data correctly.
Setting protocols to test at each milestone—in development, in staging, and after a page goes into production—maximizes the chances of catching analytics errors as they arise, ensuring accurate data collection once the site is live.
The Development, Staging, Production Model
1. Development Environment is the initial experimentation environment where you can build the minimum viable product of a website or app.
2. Staging Environment is where properties are prepared to be seen by the public.
3. Production Environment is the live version of your site or app and is completely accessible to the public.
Ideally, you should QA your analytics implementations during development, within the staging environment, and after pushing the site live in order to validate that everything stays in place and that the collected data is complete and clean.
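One way to enforce that discipline is to run the same audit against every environment rather than treating production as a special case. The sketch below shows the shape of such a loop; the hostnames and the audit stub are hypothetical placeholders.

```python
# Sketch of running one analytics QA routine across every milestone.
# The environment hostnames and the audit stub are hypothetical.

ENVIRONMENTS = {
    "development": "https://dev.example.com",
    "staging": "https://staging.example.com",
    "production": "https://www.example.com",
}

def audit(base_url):
    """Placeholder for a real audit (crawl pages, check tag rules)."""
    return {"base_url": base_url, "errors": []}

results = {env: audit(url) for env, url in ENVIRONMENTS.items()}
for env, result in results.items():
    status = "PASS" if not result["errors"] else "FAIL"
    print(f"{env}: {status}")
```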
Using ObservePoint’s WebAssurance to QA their analytics before launching a new site, a popular athletic apparel retailer discovered some tags weren’t collecting an important segment of visitor data. If they had not been working through stringent QA processes, they might never have noticed this gap and would have collected highly flawed data on their users. By testing their implementations regularly with ObservePoint, they quickly found and corrected the problem before going live, ensuring their visitor data was complete and accurate.
Automate Data Governance with ObservePoint
You need accurate data in order to anticipate customers’ needs and provide exceptional experiences, and this accuracy depends on implementing bulletproof analytics processes.
See how ObservePoint can help bulletproof your analytics implementation.