
October 18, 2018

7 Cases Where You Can't Afford To Skip Analytics Testing

By Patrick Hillery of ObservePoint

Slide 1:

Thanks, Brian, for the introduction. I always like listening to the introduction, especially the part where it says I attempted to learn the full Adobe stack. I specifically made sure it said “attempted” so that everyone knew it would be presumptuous to claim to actually know or deploy the full Adobe stack. But I’m excited to be here this morning, and excited to talk about testing plans. I’ve been at ObservePoint for three years, so this is a lot of what we focus on, and a lot of what I focus on. I’m very passionate about it. I’ve done webinars in the past about documentation and about making sure that data is clean, so I’m excited to jump in. This is a pretty focused webinar, and I’m hoping you’ll leave with a lot of very tangible takeaways. It’ll be about a half hour or so, and then we’ll have time for questions at the end.

Slide 2:

So what we’ll talk about today is what an analytics test plan actually is. And then we’ll talk about how to build one and how to think about one. And then we’ll go through the top seven use cases of when to execute an analytics test plan. And then we’ll just have one slide at the end about how you can automate an analytics test plan.

 

Slide 3:

So, let’s jump in. What’s an analytics test plan? Essentially, the way I think about it, it’s documentation of what you want to test and under what scenarios you want to test it. I generally think of four critical areas that need to be involved in an analytics test plan.

 

Obviously, collection in your analytics platform is the last step in a multilayer process that happens with multiple pieces of technology and multiple locations all working together. A couple of those foundational technologies are your data layer and your tag management system, which work hand in hand to populate the data going into your analytics platform. So including the data layer and the tag management system in your analytics test plan is critical.

 

What I’ve also seen is more and more companies additionally including the foundational DOM elements or other JavaScript objects that are used by the tag management system, or used to populate specific elements in the data layer, in their test plans, so that they’re fully testing every layer of the analytics collection process.

 

Obviously, the benefit of doing that is that if an issue arises in your analytics platform, you can quickly diagnose the nature of the issue. If you’re not collecting a specific data point or a specific variable in your analytics platform, but the data is present in your data layer, then you know it’s a TMS issue: either a rule not executing, or potentially the logic of the rule. If the data is missing from the data layer, then the rule may be performing fine in the tag management system and the real issue is that the data layer wasn’t populated correctly. And if the DOM elements that are used to populate the data layer are missing, then you have an issue at that level.
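
To make that layer-by-layer diagnosis concrete, here is a minimal sketch meant to be run in the browser (for example via a headless browser's evaluate call). The object names it checks, a `digitalData`-style data layer, an Adobe-style `_satellite` TMS object, and a `/b/ss/` analytics beacon path, are assumptions; swap in whatever your data layer, TMS, and analytics vendor actually use.

```ts
// Minimal layer-by-layer diagnostic for a single variable.
// digitalData, _satellite, and "/b/ss/" are assumed names; adjust to your stack.
function diagnoseCollection(variablePath: string, beaconPattern: string) {
  const w = window as any;

  // Layer 1: is the value present in the data layer?
  const dataLayerValue = variablePath
    .split('.')
    .reduce((obj, key) => (obj ? obj[key] : undefined), w.digitalData);

  // Layer 2: did the tag management system load at all?
  const tmsLoaded = typeof w._satellite !== 'undefined';

  // Layer 3: did an analytics request actually leave the page?
  const beaconFired = performance
    .getEntriesByType('resource')
    .some((entry) => entry.name.includes(beaconPattern));

  return { dataLayerValue, tmsLoaded, beaconFired };
}

// Example: a populated data layer value with no beacon points at the TMS rule,
// while a missing data layer value points at the data layer population itself.
console.table(diagnoseCollection('page.pageName', '/b/ss/'));
```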

Slide 4:

So let’s jump into how to think about what to test. Oftentimes, the most important thing to test is whatever is directly impacted by the release. That’s the bare minimum, and it’s likely what you’re doing right now, in some cases manually and in some cases with a tool that automates the process. But additionally, the best practice is to include any other scenario that could inadvertently affect data quality through the release.

 

That obviously broadens the scope and the quantity of testing you need to do under those scenarios. You also want to think about the technologies that drive the functionality of your web and mobile applications. Oftentimes you have separate development teams aligned with each of those applications, so the scope can be limited and assigned to each of those technologies.

 

So now that we have a good understanding of what to test, let’s go through a few example scenarios.

Slide 5:

 

In this example, let’s say we’re a retailer and we’re updating our content management system templates, and not all of them, just a couple. In this situation, the bare minimum you need to include in your analytics test plan is the page load and click actions that are tracked in your analytics on the templates being updated.

 

What you should test, and what I would recommend as a best practice, is all of the templates that are managed through your content management system. It’s likely there are shared resources being updated along with those few templates: JavaScript files, maybe some HTML that’s globally inserted into all templates, that could potentially, although it may be unlikely, impact analytics collection on all of the other templates in your content management system.

 

I’ve also included on the right a screenshot of what an analytics test plan could look like. You could have each of the scenarios identified as columns. In this case, each scenario could be an individual template or a group of templates, along with whether the template is just being loaded or whether specific actions are being performed on it.

 

And then, as rows, you’ll have each of the data points you want to validate inside each of the layers. In this case, we have specific data layer variables we’re checking, and maybe not all data layer variables apply to each scenario; that’s fine, we can include just the ones that are relevant. Then inside the tag management solution, you want to validate that each page load or event-based rule is being executed at the right time under each of those scenarios. And then, obviously, you validate the analytics variables that are being captured.

 

With an outline like this, you can easily go through and mark each check as passing as you execute the analytics test plan for a given scenario.
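
The spreadsheet from the slide isn't reproduced in this transcript, but the same structure can be captured as data, which also sets up the automation discussed at the end. A rough sketch follows; the template names, rule names, and variable names are purely illustrative, not anything from the slides.

```ts
// Each scenario lists the data layer variables, TMS rules, and analytics
// variables it must produce. All names below are illustrative examples.
interface TestScenario {
  name: string;                 // e.g. an individual CMS template, or a group of templates
  action: 'page load' | 'click';
  dataLayerVariables: string[]; // paths expected to be populated in the data layer
  tmsRules: string[];           // rules expected to fire in the tag management system
  analyticsVariables: string[]; // variables expected in the analytics request
}

const testPlan: TestScenario[] = [
  {
    name: 'Product detail template',
    action: 'page load',
    dataLayerVariables: ['page.pageName', 'product.sku', 'product.price'],
    tmsRules: ['Global Page Load', 'Product View'],
    analyticsVariables: ['pageName', 'eVar12', 'events:prodView'],
  },
  {
    name: 'Product detail template',
    action: 'click',
    dataLayerVariables: ['cart.additions'],
    tmsRules: ['Add To Cart'],
    analyticsVariables: ['events:scAdd', 'products'],
  },
];
```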

Slide 6:

 

Another example of how to think about what to include in an analytics test plan: let’s say you’re a travel site and you’re updating the booking engine, the flow by which a user comes to your site and books travel through your web or mobile application. In this case, the must-test is all of the steps that are actually being changed. What I would recommend as a best practice is to test the full booking flow from beginning to end each time any step is impacted or changed. If you’re a travel company, the booking engine is the lifeblood of your business, of your analytics, and of the data you’re reporting, so any issue inadvertently introduced into the booking flow would significantly impact your data collection.

 

I was consulting at a travel agency several years ago, and after a particular release there was about a 15% drop in reported revenue. It was a very unusual issue; it took us a couple of weeks just to identify it, and a few more weeks after that to get it fully resolved. What we missed, what we didn’t include in our test plan, was running the full booking flow under each of the critical scenarios, each of the personas a user could use to complete the booking. The specific scenario with the analytics issue was one where an end user could book travel using membership rewards in addition to cash, and it was that cash, that revenue, that wasn’t being tracked. It took us a while to figure out because our test plan wasn’t complete and we didn’t test all of those critical paths.
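
One way to keep a persona like that from slipping through is to enumerate personas against the full flow and generate the checklist from the combination. A small sketch, with persona and step names that are illustrative only:

```ts
// Enumerate payment personas against the full booking flow so no combination
// is silently omitted from the test plan. Names are illustrative examples.
const paymentPersonas = ['cash only', 'rewards only', 'rewards plus cash'];
const bookingSteps = ['search', 'select flight', 'traveler details', 'payment', 'confirmation'];

const bookingScenarios = paymentPersonas.flatMap((persona) =>
  bookingSteps.map((step) => ({
    scenario: `Booking flow - ${persona}`,
    step,
    // Revenue must appear in the analytics call on confirmation for every persona.
    mustValidateRevenue: step === 'confirmation',
  }))
);

console.log(`${bookingScenarios.length} checks to execute`); // 3 personas x 5 steps = 15
```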

 

Slide 7:

 

Now let’s think about how to build a test plan. How do you know which variables to include? Being fully comprehensive would be really time consuming and probably not worthwhile. As with most things in your business, and maybe in your personal life, you’ll want to follow the 80/20 rule: there’s likely a small subset of the data being collected that provides 80% of the value for the reporting and operation of your business.

 

We’ll talk about number two, when to execute the tests, when we go through the seven use cases. But when we think about which tests to execute, there are really two different ways to frame it, and essentially the answers are the same; they’re very similar ways of positioning the question.

 

Slide 8:

 

One thing to ask is: which variables in my analytics platform, if we lost collection on them, would significantly impact our operational reporting and the key decisions we make about where to invest our marketing dollars and where to improve our product? Those are typically your KPIs and maybe a handful of variables that are the foundation of most of the reporting you’re doing.

 

Slide 9:

 

You’ll also want to think about the key end-user experiences a customer visits your application to complete. When they open up your mobile application, what are the key goals they’re trying to accomplish? Thinking about things that way flows into identifying your KPIs and the best things to track. So really, as I said, those are two ways of answering the exact same question.

 

So hopefully you now have a good understanding of what a test plan is, and, at a high level, how to build one.

 

Slide 10:

 

Let’s talk now about when to execute these tests. These are the top seven use cases, and this is the one slide that has everything aggregated. We’re going to go through each one individually, but if you want to take a screenshot, now is the time to do it. As Brian said, the full presentation will be available afterward as well.

 

Slide 11:

 

I tried to put these somewhat in order of importance. A lot of the ordering could be debated, but generally they run from most to least important.

 

Number one, I think, is pretty obvious, and hopefully you already do at least some testing before deploying a tag management system change. Every time you deploy a tag management system change, a test plan needs to be executed. You need to be really, really confident, or maybe a risk taker, to make a deployment without testing your tag management system. Either that, or you already have an existing issue that’s affecting customer experiences, you need to get a fix out as quickly as possible, and you’re willing to risk that it won’t make things worse.

 

From conversations with folks over the years, it seems like everybody who develops tag management system logic has a story about pushing a bug into production, a really stressful, embarrassing situation. If you’re a tag management developer and you haven’t had that experience yet, hopefully by executing a test plan before making a change, you can avoid being added to that list.

 

Obviously, there are a lot of different changes you can make inside a tag management system: updating the logic inside a specific rule, adding a new rule, adding conditions for when a rule should execute, adding a brand new vendor, or removing a vendor. Under each of those scenarios there can be conflicts with some of the other tags, maybe only on a certain template, that all need to be tested.
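
A rough post-deployment spot check along those lines, run in the browser after a TMS change, is sketched below: confirm the vendors you expect are still being requested and flag anything that looks duplicated. The vendor hostnames are examples only; list the ones actually in your container.

```ts
// Spot check after a TMS change: are the expected vendor requests still present?
// The hostnames below are examples; replace them with your own vendor list.
const expectedVendors = ['google-analytics.com', 'doubleclick.net', 'demdex.net'];

const resourceUrls = performance
  .getEntriesByType('resource')
  .map((entry) => entry.name);

for (const vendor of expectedVendors) {
  const hits = resourceUrls.filter((url) => url.includes(vendor)).length;
  if (hits === 0) console.warn(`Missing: no request to ${vendor} after the change`);
  if (hits > 1) console.info(`Worth reviewing: ${hits} requests to ${vendor} (possible duplicates)`);
}
```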

 

Slide 12:

 

The next one is application updates. These are when your development teams make changes to your content management system, your ecommerce platform, your booking engine, or your account management platform. Each of these likely has its own release schedule, and the functionality in some releases shouldn’t impact analytics but sometimes does, while other releases bring new features with additional analytics attached.

 

In each of these situations, this is again one of the most critical scenarios where your analytics could develop data collection issues. This is where having a good test plan that you can execute efficiently is critical.

 

Slide 13:

 

The next scenario is deploying new content. These are the situations where you’re launching new pages via your content management system, and oftentimes there are key fields that need to be authored and populated by the content team, naming specific offers or recording item impressions in a specific format.

 

So it’s critical to validate that those fields have been populated correctly before pushing out new content through your content management system. The same applies when an agency launches something it has developed, like a microsite. Microsites are oftentimes outliers in that they don’t necessarily follow the standards used by your internal development teams, and they often have specific use cases that are unique to that microsite. Microsites are one of the more common situations where data collection can have issues.

 

Similar to microsites, landing pages are another common area where data collection issues can be introduced.
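
Going back to the authored-content case at the start of this scenario, here is a small sketch of the kind of pre-launch check that catches empty or mis-formatted fields. The field paths and the naming pattern are assumptions, not a standard from the presentation.

```ts
// Confirm the fields the content team is responsible for are populated and
// follow a naming convention before new pages go live. Paths and the pattern
// below are illustrative assumptions.
function validateAuthoredFields(dataLayer: any): string[] {
  const problems: string[] = [];
  const requiredFields = ['page.pageName', 'content.offerName', 'content.offerId'];
  const offerNamePattern = /^[a-z0-9-]+$/; // e.g. lowercase, hyphen-separated

  for (const path of requiredFields) {
    const value = path.split('.').reduce((obj, key) => (obj ? obj[key] : undefined), dataLayer);
    if (value === undefined || value === '') problems.push(`Missing or empty: ${path}`);
  }

  const offerName = dataLayer?.content?.offerName;
  if (typeof offerName === 'string' && !offerNamePattern.test(offerName)) {
    problems.push(`Offer name "${offerName}" does not match the naming convention`);
  }
  return problems;
}

console.log(validateAuthoredFields((window as any).digitalData));
```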

 

Slide 14:

 

These next couple switch from IT- and development-driven processes to marketing-driven processes. When you’re sending an email campaign, there’s an opportunity to miss collecting the data you need to evaluate whether that campaign is performing well, or better or worse than other email campaigns.

 

In those situations, you’ll want to validate that each of the URLs inside that email redirects correctly and that data is captured through the redirect. Then, similarly, that when the target page actually loads, the right technologies are on that page to collect the data, and that you have a well-formed marketing code that follows your guidelines. There are a lot of moving parts that could cause data issues with your email campaign.
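
A hedged sketch of that email-link QA, assuming Node 18+ (which provides a global fetch) and a UTM-style campaign code convention; substitute whatever campaign parameters and redirect domains your email platform actually uses.

```ts
// Follow each email link's redirect chain and confirm the marketing code
// survives to the final URL. Parameter names are assumptions (UTM-style).
const requiredParams = ['utm_source', 'utm_medium', 'utm_campaign'];

async function checkEmailLink(link: string) {
  const response = await fetch(link, { redirect: 'follow' });
  const finalUrl = new URL(response.url);

  const missingParams = requiredParams.filter((p) => !finalUrl.searchParams.has(p));

  return {
    link,
    landedOn: response.url,
    status: response.status,
    redirected: response.redirected,
    missingParams, // empty array means the marketing code survived the redirect chain
  };
}

checkEmailLink('https://example.com/promo?utm_source=email&utm_medium=newsletter&utm_campaign=fall')
  .then((result) => console.log(result));
```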

 

Slide 15:

 

The next scenario is a paid marketing campaign. This is when you’re purchasing traffic via paid search, video, display, or any of the other paid marketing avenues available to you. In each of those situations, you’re paying to acquire traffic to your mobile or web application, and it’s critical to have analytics in place so you can measure the return on ad spend for those initiatives.

 

And similar to email, you want to make sure that on the pages people land on as they click through each of these marketing efforts, the correct technologies are present: your analytics tags, your data layer, and your tag management system.
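
A coarse landing-page check for paid campaigns is sketched below: fetch the destination and make sure the tag management container and data layer snippet are at least present in the markup. The markers used here (assets.adobedtm.com for an Adobe Launch container, a digitalData data layer) are assumptions; use the script URL and data layer name from your own implementation.

```ts
// Fetch a paid-campaign landing page and confirm key scripts appear in the HTML.
// Markers are assumed examples; replace with your own container URL and data layer name.
const pageMarkers = [
  { label: 'tag management container', marker: 'assets.adobedtm.com' },
  { label: 'data layer snippet', marker: 'digitalData' },
];

async function checkLandingPage(url: string) {
  const html = await (await fetch(url)).text();
  return pageMarkers.map(({ label, marker }) => ({
    label,
    present: html.includes(marker),
  }));
}

checkLandingPage('https://example.com/landing/spring-sale').then((results) => {
  for (const r of results) {
    console.log(`${r.present ? 'OK' : 'MISSING'}: ${r.label}`);
  }
});
```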

 

Slide 16:

 

The next one is updating your data layer. This oftentimes happens in parallel with updating your application, but sometimes the data layer logic, or the data layer itself, is updated independently. Even something as simple as changing the capitalization of one letter in a data layer value can have a fairly significant impact on your data collection, because tag management rules often pivot on being able to correctly identify specific values. Those situations can really impact the data collected in your analytics platform.
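
To show why one capital letter matters, here is a tiny sketch of the failure mode and a more defensive comparison. The event name and values are illustrative only; they are not from the presentation.

```ts
// TMS rule conditions usually compare data layer values exactly, so a casing
// change silently stops the rule from firing. Values below are examples.
const ruleFiresOn = 'checkoutComplete';

const before = 'checkoutComplete'; // value before the data layer update
const after = 'CheckoutComplete';  // value after someone "just" changed the casing

console.log(before === ruleFiresOn); // true  -> rule fires, data is collected
console.log(after === ruleFiresOn);  // false -> rule silently stops firing

// A case-insensitive comparison in the rule condition absorbs this class of change,
// though agreeing on and testing the exact values is still the safer habit.
const firesAnyway = after.toLowerCase() === ruleFiresOn.toLowerCase();
console.log(firesAnyway); // true
```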

 

Slide 17:

 

And then lastly, when you’re launching a test. This could be an A/B test, a targeting test, or a segment-based test. Oftentimes companies integrate their testing into their analytics platform so that, in addition to the key performance indicator analysis of which variation is performing better, they can also do longer-term analysis in their analytics platform of what the long-term impacts of a test were.

 

Each time you launch a test, there’s a chance that data collection doesn’t happen in your analytics platform. You want to make sure that launching the A/B test through your optimization platform doesn’t impact your analytics.
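
A small check for test launches, run on a page inside the experience, might look like the sketch below: confirm the experiment and variant IDs made it into the data layer so the analytics side can segment on them later. The property names are assumptions; use whatever your optimization platform writes.

```ts
// Confirm experiment and variant identifiers are present in the data layer.
// digitalData and the experiment property names are assumed examples.
function checkExperimentTracking(): boolean {
  const dl = (window as any).digitalData;
  const experimentId = dl?.experiment?.id;
  const variantId = dl?.experiment?.variant;

  if (!experimentId || !variantId) {
    console.warn('Experiment or variant ID missing from the data layer', { experimentId, variantId });
    return false;
  }
  console.log(`Experiment ${experimentId}, variant ${variantId} present in the data layer`);
  return true;
}

checkExperimentTracking();
```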

 

Slide 18:

 

So those are the seven use cases. A common thread throughout them is that it’s key to have a process in place and an organization that’s bought in to maintaining high data quality. Those conversations often haven’t happened in the organizations we speak with; there isn’t a clear statement of, “Here’s where the analytics test plan is, and here’s how to execute it in each of these situations.”

 

So there’s a big governance aspect to executing these tests at the right time and in the right places, because all of these processes are likely happening outside of your analytics team. You need coordination. You need a process in place. In some cases you need tools, project management, and additional people who own that specific step in the process, to make sure the right folks are being included at the right time as those processes happen.

 

Some of you might be thinking, “Well, I’m not really involved, and that QA isn’t happening today,” for half of those processes or more. You’re not alone; that’s not an uncommon situation. If you don’t have a good plan in place, this can be a burden, and it can be a significant cost to hire resources to execute these test plans at the right time, each time these activities are happening.

 

Slide 19:

 

That leads us into automating a test plan. If you were to go to upper management and say, “We’re going to hire folks to execute these test plans every time, in each of these scenarios,” we’re talking about a fairly large resource ask to staff that accordingly. That would likely meet opposition, or at the very least it would be a lengthy business process to prove the value and justify investing in those resources.

 

So if there’s an opportunity to automate the testing plan, you can lessen the need for investment in human resources and likely get by with a fraction of what you would otherwise spend building and running the test plan manually.

 

This is where ObservePoint comes in and can provide a significant amount of value, and there are also smaller improvements you can make to automate things without a large investment. Ultimately, there are tools out there that can automatically check each of the layers of your data collection process.
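
As one illustration of the shape of that automation, not of ObservePoint's product itself, here is a sketch using Puppeteer (`npm install puppeteer`): crawl the URLs in a test plan, pull the data layer out of each page, and report missing variables. The URL list, data layer name, and required paths are all assumptions.

```ts
// Crawl a small test plan and report missing data layer variables per page.
// digitalData, the URLs, and the required paths are illustrative assumptions.
import puppeteer from 'puppeteer';

const plan = [
  { url: 'https://example.com/', required: ['page.pageName', 'page.template'] },
  { url: 'https://example.com/product/123', required: ['page.pageName', 'product.sku'] },
];

async function runPlan() {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  for (const { url, required } of plan) {
    await page.goto(url, { waitUntil: 'networkidle0' });
    const dataLayer = await page.evaluate(() => (window as any).digitalData);

    const missing = required.filter((path) =>
      path.split('.').reduce((obj, key) => (obj ? obj[key] : undefined), dataLayer) === undefined
    );
    console.log(`${url}: ${missing.length === 0 ? 'PASS' : `missing ${missing.join(', ')}`}`);
  }

  await browser.close();
}

runPlan().catch(console.error);
```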

 

Slide 20:

 

So that’s all the content I have. Hopefully you have a better understanding of each of the scenarios where you’ll need a process and good governance in place.

 

We’re going to open things up for questions now. I’m happy to stay on for a few minutes and answer some questions, and if we don’t get to yours, I’m happy to have a conversation afterwards; you can always email me. I’d love to have a conversation.
