Adam Greco, Search Discovery – Reasons Analytics Implementations Fail (and How to Avoid Them)

January 16, 2020

Organizations invest a lot of time and resources in digital analytics since this is an area critical to success. Unfortunately, many organizations struggle with their digital analytics program and implementation.

In this session, Adam Greco, Senior Director of Technology Solutions at Search Discovery, will share:

  • What he has seen companies struggle with in web analytics
  • Strategies to avoid these issues


Hi everyone, and welcome to my session at the Virtual Analytics Summit by ObservePoint. Today I'm going to talk about reasons why digital analytics implementations fail, and I'm going to talk about how you can avoid some of these problems. So before we get started, let me just introduce myself. For those of you who I haven't met, my name is Adam Greco, and I have been in the analytics industry for about 15 years, a little more than that. I was one of the early employees at Omniture and one of their early customers. So I kind of got in on the ground floor of one of the leading analytics technologies. After I was a customer, I joined the Omniture consulting group and helped build that practice in the U.S. and over in Europe. I also have managed the analytics implementation at both Chicago Mercantile Exchange and Salesforce.com. I've also implemented hundreds of analytics programs and implementations across many different verticals over the past 15 years.

And some of you who live in the Adobe world probably have read one of my hundreds of blog posts or taken a peek at the book that I wrote on Adobe Analytics. So, I've got a lot of experience and seen lots of different implementations, and that's why I really wanted to focus on some of the reasons why I see people fail and how you can avoid that when it comes to digital analytics implementations. We only have about 30 minutes, so I'm going to just cover a couple of them. This is a topic I could talk about for hours and hours, but let's dive in. So digital analytics implementations can provide amazing value. Unfortunately, they don't really come cheap. A lot of people don't realize how expensive it actually is to run a whole program on digital analytics, but if you think about it, a long time ago, your organization decided that they should invest in digital analytics.

It was probably prompted by a tool selection of using either an Adobe Analytics or a Google Analytics, but what a lot of people don't realize is how much the all-in cost [inaudible] of digital analytics. And so why do people even invest in digital analytics to begin with? Why did your company way back when spend the money to build your team and purchase tools? Well, in many cases it's because they wanted to increase the ROI of their ad spend. They wanted to figure out, of all the times we're spending money to drive people to the website, which things are working and which aren't working, how can we figure out the campaigns that are really effective and the ones that we're just wasting money on? And way back in the early days of digital analytics, this was one of the key reasons why people invested in the technology. As digital analytics became more mature, people realized that they could hire analysts and really dig into this data and try to increase their revenue by optimizing either their website or their mobile app.

And then when they got more sophisticated, they would also dig into personalization. And say, how can we use the data that we have on people to really target them with the right messages at the right time and try to get them to return to the website? And then in probably the last five to seven years, digital analytics has really moved more into the area of experience, where companies like Google and Adobe and even Salesforce.com have really focused on saying, you know, one of the reasons why you have a website or a mobile app is to delight your users or your customers or your prospects, to try to get them to come back to your website and buy more, or to just have a better online experience, which might then translate to an offline success. So whether it's trying to figure out your advertising spend, or to just use analysis and data to optimize your website or your app and make it better over time in terms of being a revenue generator or a lead generator, or if you're really focused on just having people who love coming to your website or something like NPS scores.

There's lots of different reasons why people think about using digital analytics. But all of that is contingent on you having a really good or an amazing analytics implementation. And that's where sometimes people fall down. So when you think about the true cost of digital analytics, it goes way deeper than just the hard costs that you're spending on the software. A lot of times people think of their investment in either Adobe or Google Analytics or ancillary tools like ObservePoint, our sponsor here, or Decibel Insight, you name it, all of these different tools that complement a tool like Adobe or Google Analytics. That's kind of your hard costs. Well, what a lot of people don't think about is the soft costs involved in terms of your implementation. It's pretty expensive to implement analytics tools and that initial implementation can cost a lot, or you have to reimplement over and over again if you don't do it the right way.

And there's also ongoing costs involved with analytics, in terms of keeping your implementation current and focusing time on data quality to make sure that the data in there is right. That doesn't even touch on the third point, which is the whole cost of doing the analysis. You have a whole team of people that you want to invest in and hire and bring into your organization, and those people are not cheap. This field that we're in tends to be a pretty expensive one. Hiring talent is something that oftentimes can far outweigh the cost of the products that you're paying for, but you want to make sure that you get really good people, because you don't actually get any return on investment in digital analytics until you do your implementation, and then you have your analysts go in and do the analysis and then make changes on your website or your mobile app based on that analysis.

Up until that point, it's just a cost to you. So, why do organizations tend to fail at digital analytics? And I hate to be kind of a cup-half-empty person here, but the truth is there's a lot of organizations that struggle with digital analytics. I learned this the hard way, because when I worked at Omniture, one of the last roles that I had was what was called the Wolf of Omniture, like the movie Pulp Fiction. Whenever an Omniture customer back in the day was having a bad experience or they thought they wanted to cancel their contract, I had to go in and clean up that account just like Harvey Keitel had to do in Pulp Fiction. So I used to go to company after company, and I would see all of the mistakes that they were making and the reasons why they weren't being successful with digital analytics.

So, because of this, I've probably seen more screwed up digital analytics implementations than anyone on the planet, which is why I think I'm a good vessel here to share some of the tips I've learned for how to avoid having to see the Wolf one day at your organization. So I'm going to go through kind of my top three tips to avoid seeing the Wolf. The first one is what I'm going to call lack of business requirements. So I happen to be a big believer in business requirements. Unfortunately, when I work with organizations, the first question I ask them is, "Hey, I know you're having struggles with your implementation. Can I see the list of business requirements or questions that are driving your implementation?" And unfortunately, nine times out of 10, I don't get anything in return. They don't have anything. So why is that?

Over the years I've seen that the main three reasons are these. Number one, the people who I'm talking to have inherited an analytics implementation. They weren't the ones who actually set it up. That happened many years ago, and the people who set it up are long gone. So they're generally stuck with something where they don't know why someone did what they did. They're just kind of working with what they have. The second thing I see is that when I ask people for business requirements or business questions, they tend to produce what's called a solution design reference, or an SDR, and those of you on this webinar are probably familiar with that. It's basically just a document that tells you all of the data points that you have. If you're an Adobe customer, it might be a list of all of your success events, eVars, or sProps.

If you're a Google customer, it might just be your goals or your dimensions, but basically an SDR is just a document that tells you what has been tagged. It doesn't really go that deep into telling you why it was tagged. And the other problem I've found is that even if you do have an SDR, the odds are that your SDR is way outdated. And when I work with companies and I audit their implementations, I see that what they gave me in their Excel spreadsheet SDR is very different from what they actually have in production. And then third, I see that many organizations take what I call a fire, shoot, aim approach, or a tagging-first approach, to digital analytics implementations. Instead of taking the time upfront to think about what questions they want to answer, they basically start by saying, what data do we have on our pages? Let's just collect all the data that we think might be interesting to people down the road, then we'll tag that, and then later leave it to the poor analyst to determine what they're going to do with it.

And that's kind of the reverse of what I like to do, because this is just a lot of work. It's a lot of tagging, and it's very inefficient, and it puts a lot of the onus on your analysts to say, well, here's the data that I've been given. Now let's see if I can figure out some needles in the haystack here. So, why do I like to do business requirements before I implement? I feel like there are a lot of great benefits to doing this. The first one is that it gives you a great venue to align the business needs of your organization with the technical implementation that's going to be tagged. It gives you a way that you can work with your stakeholders and get buy-in from them before you write any lines of code.

You can basically have a common vernacular with your business stakeholders, and it's kind of a bridge between your technical team and your business team. If you can get all of them to agree on a set of business requirements or questions that they want to be answered, and then you can prioritize that, one of the benefits is that you can then flow out your development cycles and figure out which questions do we want to answer right away, which ones do we want maybe down the road, and which ones we're not even sure if we want, we'll keep them on the list, but they're not high priority. So by having that prioritization process, you can drive your development cycles and only implement the stuff you need right away. Another thing that's great about this approach is it makes sure that everything that you're tagging is tied to a business purpose.

 

You're never tagging just for tagging's sake. You can tie every single eVar or event or sProp or goal or dimension to a particular business question or business requirement, which makes sure that you're not wasting any of your valuable technical resources. Once you're done implementing, what's really cool about having requirements is that you can actually map your analytics reports and dashboards to those business requirements. You can make sure that if you want to show someone a dashboard, you can say, hey, remember a couple of months ago you said you had this business question? Well, here's the actual Analysis Workspace dashboard that answers that particular question. You can also use business requirements to train end users, and I'm going to come back to this a little bit later. Instead of training them on the tool, you can actually train them on the business questions that they have, which is a much better way to kind of get into their heads and get them motivated to learn how to use analytics tools.

 

So, I've been talking a lot about business requirements, but what do they actually look like? What do I mean when I say business requirements or business questions? This is not rocket science, and I'm showing an example here of a fictitious client, but all I'm really looking for is for you to work with your organization and your stakeholders to say, what are the business questions that we want to answer? In this case, I've basically listed out a couple of them, like: which external campaign codes are driving traffic and leading to conversion? How often are visitors costing us money because they're engaging in high-cost chat after they've done an onsite search? These are just general questions that you can come up with in your organization. You can see that I've taken each of these business requirements and put them into categories; I can then prioritize them with my stakeholders, and then I can plan them out into different phases.

The other thing that's important is that I've tied each business requirement to a particular person. I've figured out which group wants it and who the person in that group is who wants it the most, so that I know who I can go back to if I have questions when it comes to implementation, or, once it's implemented, who the first people are that I want to train on these business requirements. Obviously I want to go back to the person who asked for it initially, but something as simple as having a list of these business questions or business requirements is really, really valuable, for some reasons I'm going to get into a little bit later. So, what if you don't have business requirements today? I'm going to give you a really easy, foolproof way to create a list of business requirements. And this is one of the things that I think everyone who's listening to this should do at the next opportunity that you have.
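
To make this concrete, here's a minimal sketch of how a requirements list like this could be kept as structured data rather than a slide or spreadsheet. The field names and example entries below are my own illustration, not something prescribed by the talk or by any tool:

```typescript
// Hypothetical shape for a business requirements list (illustrative only).
interface BusinessRequirement {
  id: number;
  question: string;                        // the business question, in plain language
  category: string;                        // e.g. "Acquisition" or "Onsite Search"
  priority: "High" | "Medium" | "Low";     // agreed with stakeholders
  phase: number;                           // which implementation phase it is planned for
  owner: string;                           // the stakeholder who asked for it
}

const requirements: BusinessRequirement[] = [
  {
    id: 1,
    question: "Which external campaign codes are driving traffic and leading to conversion?",
    category: "Acquisition",
    priority: "High",
    phase: 1,
    owner: "Marketing - Jane Doe",
  },
  {
    id: 2,
    question: "How often do visitors engage in high-cost chat after doing an onsite search?",
    category: "Onsite Search",
    priority: "Medium",
    phase: 2,
    owner: "Support - John Smith",
  },
];

// Planning a development cycle is then just a filter over the list.
const phaseOne = requirements.filter(r => r.phase === 1 && r.priority === "High");
```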

The first step is to review your current implementation. You could look at your SDR if that's what you have, or just look at all of your settings in your current implementation, and I want you to reverse engineer it. Think of this as playing a web analytics implementation Jeopardy. Take your data points and put them into the form of questions. So, if you start with that, then you'll be able to have a list of maybe 40 or 50 business requirements just based on what you can do today. Then I want you to add to that any business questions or requirements that you can think of that your team knows aren't in the implementation today. Maybe that's just you having a brainstorming session. Maybe it's you and your team, looking at the last three months of requests you've gotten from your stakeholders and adding those to the list.

Maybe it's just using your site, or asking some friends or family members to use your site, seeing what questions they have, and seeing if that spurs some questions you want to add to the list. Once you have a list of maybe 40 or 50 of these requirements from steps one and two, then you can prioritize them with your stakeholders. You can say, here's what we have; is there anything we're missing here? And tell us which of these are really important to you and which aren't important, or maybe should just be taken off the list altogether. I have found that when I work with stakeholders, if I ask them what questions our team can help them with, they generally look at me like a deer in the headlights. But if I show them a list of 40 or 50 business questions, it gets their juices flowing.

And I say, okay, well, here are the ones we already have. Which ones do you think we're missing? Or I have the stakeholders walk us through the parts of the website or mobile app that they care about or that they own. And as they're walking through that site or app, I'll ping them with questions, and that will help me get stuff onto the list. So, however you get your list, maybe you end up with a hundred of these, then you can prioritize it with your stakeholders. And the cool part is you can actually look at each line item and say, which of these can we answer today, and which ones can we not answer today? And you will be surprised that you can probably only answer about 25% of the full list once you have it. And that really will motivate you, your team, your executives, everyone to say, wow, here's all these questions that people want.

We can't answer them right now. So let's get on that. And then going forward, it's really important that you make sure that you keep your requirements list updated. That might mean removing ones that aren't important anymore, but more importantly, as you're in meetings, if you get new questions, you want to make sure you're always adding them to the list. So that's the area around business requirements, but now let's switch gears a little bit and talk about the second problem I see with implementations, and that's really focused on data quality. I have seen many data quality issues, and I do a lot of audits of implementations, and this is very prevalent. So why do we tend to see a lot of issues with data quality? First of all, I'll be looking through reports and implementations, and I'll see that the data just drops off completely, and a lot of times the customer doesn't even know that the data has been sporadic or dropped off, and that's a problem.

Second, I'll see inconsistent data values. I'll see, you know, dimensions where they're collecting just junk data, or you can tell that they had some sort of a convention that they wanted to follow, but it's not being followed. Common culprits here are really bad page names or broken page names, or external campaign codes where query strings aren't working, and so on. Another problem I see with data quality is that people just have the wrong solution architecture. Many times it's because they don't know how to actually use the analytics tool that they've invested in, whether it's Adobe or Google. I tend to live in the Adobe world, and I see lots of cases where they just don't know that they're doing something wrong. For example, tracking internal search terms in an eVar but not using the merchandising capability, which in some cases can actually mean that your data is inaccurate.
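
As a rough illustration of the first failure mode, data silently dropping off, a monitoring check might compare each day's count for a metric against a trailing baseline and flag sudden collapses. This is a simplified sketch of the idea, not a feature of Adobe, Google, or ObservePoint:

```typescript
// Flag days where a metric collapses versus its trailing average (illustrative sketch).
function findDropOffs(dailyCounts: number[], windowSize = 7, threshold = 0.5): number[] {
  const flagged: number[] = [];
  for (let i = windowSize; i < dailyCounts.length; i++) {
    const window = dailyCounts.slice(i - windowSize, i);
    const baseline = window.reduce((sum, n) => sum + n, 0) / windowSize;
    // Flag the day if it falls below half (the threshold) of the recent baseline.
    if (baseline > 0 && dailyCounts[i] < baseline * threshold) {
      flagged.push(i);
    }
  }
  return flagged;
}

// Example: onsite searches per day; the tagging breaks and the count drops to zero.
console.log(findDropOffs([120, 130, 125, 118, 122, 127, 119, 124, 0, 0])); // -> [8, 9]
```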

So, there are lots of different reasons why people have data quality issues. These are the top three that I see; they're just the problems I've seen companies running into. So why is this happening? Why are there so many data quality issues? I've asked companies I've worked with over the years, and I tend to focus on a couple of the big hitters here. The first one is that I find that a lot of teams are completely exhausted after they've either implemented or re-implemented. Many companies are re-implementing for the second or third time because they didn't do it right the first time, and they're just exhausted and they're like, basically, we've done the implementation here, here's all the data you want. We did a quick QA, go for it. They hand it to the analysts, and then they don't want to spend the time over and over and over again making sure that the data is accurate.

It's just very time consuming. Second is that they view data quality as a one-time effort. They do it as they're implementing, they get sign-off, and then they forget about it. They don't go back and do it over and over again, which is why companies invest in tools like ObservePoint, because it is time consuming and it's something that you don't want to do all by hand all the time. Next, there are a lot of times where sites are continuously redesigned, or a mobile app is redesigned, and someone changes something that then breaks all of the tagging. So something that you thought had good data all of a sudden drops off because someone did something that you didn't know about. A lot of times this is because people don't implement the right way, using data layers and so on, but for one reason or another, these redesigns can wreak havoc on your implementation.
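
That mention of data layers is worth a quick illustration: instead of tags scraping values out of the page markup, the page publishes its meaningful values once in a structured object, and the tag manager reads from that, so a redesign of the markup doesn't silently break collection. A minimal, hypothetical browser-side example; the property names and the "cid" parameter are my own, not a standard:

```typescript
// Minimal data layer sketch: the page declares its own context in one place,
// so tagging keeps working even if the surrounding markup is redesigned.
type DataLayerEvent = {
  event: string;
  pageName: string;        // stable, convention-driven page name
  siteSection: string;
  campaignCode?: string;   // external campaign code, if present on the URL
};

const dataLayer: DataLayerEvent[] = (window as any).dataLayer ?? [];
(window as any).dataLayer = dataLayer;

dataLayer.push({
  event: "pageView",
  pageName: "products:widgets:overview",
  siteSection: "products",
  campaignCode: new URLSearchParams(window.location.search).get("cid") ?? undefined,
});
```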

And then another thing that I do want to focus on is that executives generally don't care about data quality. And I hate to say that, because we think that executives should care about data quality, but what I've seen is that they're so high level, focused on the big KPIs and the big things that are going on in the organization, that they generally don't want to get into the weeds. They don't want to know if an eVar is broken or an sProp is broken. That's just not their domain. But if your executives don't care about data quality, a lot of things suffer. So what happens if you don't have good data quality, or why does good data quality matter? Well, if the business users are using your analytics tool and they run into data quality issues, they're going to lose faith in your implementation.

And that means they're not going to like you or your team, and they're not going to want to trust the data or use the tool. And if you think about it, you, your analytics team, and your business users are putting all of your reputations on the line based on your data. If there's data and analytics that says, hey, no one is clicking on this hero spot on the homepage, they're basically going to their bosses saying we think we should make a huge change here. And if they're doing that on faulty data, then that makes you look bad and it makes them look bad. So if they get burned a couple of times on that, they are not going to want to work with you and they're not going to want to use your data anymore. Also, as we talked about earlier, there's a huge investment that goes into digital analytics, and all of that is wasted if data quality issues are negatively impacting adoption, or people just aren't using your tool because they don't trust it.

And really, at the end of the day, your team's reputation is really important. If people at your organization don't trust the data, that means they don't log in and they don't use your analysis, and then you can just kiss goodbye any prospects you have of getting future headcount or getting funding for additional tools or additional resources, because you're not going to have a good reputation within your organization. And I have found that data quality is one of the key things to getting a really good reputation for your analytics team at the organization. So how do you get good data quality, and how do you get your executives to actually care about data quality? How do you get your stakeholders to care about data quality? The reason why I said earlier that I don't think executives care about data quality is because they don't ever have to experience the actual pain that a digital analyst does.

If you're a digital analyst and you're looking at a report like the one I'm showing here, and you're in December and you realize that way back in November the data just stopped collecting, that directly impacts your ability to use the data and to do analysis, but it doesn't really impact the executives, or at least they don't see that it impacts them directly. So here's a little trick that I'll show you, a way that I think about it that will help you in the future. If you think about it, there are three key things that we've talked about. First, why do we have a website or a mobile app in the first place? There's a bunch of business objectives: one day someone decided, hey, for our company we have to have a website and a mobile app, and there were some executives that had some high-level business objectives they wanted to achieve, whether that's revenue or lead generation and so on.

We talked about business requirements earlier, and if you think about business requirements, these are the questions that will help you understand how you are achieving your business objectives and how the website or mobile app is performing. And then down at the lowest level there are the data points, like your eVars, your events, and your sProps. These are the detailed data points that you need in order to answer the business questions that will then tell you whether you're meeting your business objectives. So you can see that there's a hierarchy here. Your executives probably care about the business objectives, as do your stakeholders. Your team, and maybe some of your stakeholders, care about the business questions, and your team is probably the only one that cares about the actual data elements. But as I talked about in the first section, if you go through the process of having business requirements, it can pay dividends when it comes to data quality, because you can use business requirements to link the low-level data requirements to your business objectives.

So what do I mean by this? Well, let's take a look at the business requirements list that I showed earlier. Here you can see that I've added an extra column where I've actually mapped out, in my solution architecture, the data points that are being used to answer every business question. Now let's imagine that you have some data quality issues with onsite search data, both the metric of searches and maybe the onsite search terms that are being collected. Let's just say that something with onsite search got messed up in your implementation. Well, now, if you have business requirements and if you have a solution design that actually maps data points to your business requirements, you can see exactly which business questions or requirements you cannot answer today because of a data quality issue. And you can go back to your stakeholders and say, you remember the hundred questions you wanted to answer? Unfortunately, there are seven or eight of them we can't answer right now, because we're having a data quality issue and we can't get any movement from the executives on the resources we need to fix it.
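
Building on the hypothetical requirements structure sketched earlier, the mapping can be as simple as a list of data points per requirement. Given a set of data points that an audit has flagged as broken, you can then list exactly which questions are currently unanswerable. Again, an illustrative sketch with made-up variable names, not a prescribed format:

```typescript
// Each requirement is mapped to the data points that answer it (illustrative only).
interface MappedRequirement {
  id: number;
  question: string;
  dataPoints: string[];    // e.g. Adobe events/eVars or GA metrics/dimensions
}

const mapped: MappedRequirement[] = [
  { id: 1, question: "Which campaign codes drive conversion?", dataPoints: ["eVar1", "event1", "event5"] },
  { id: 2, question: "How often does onsite search lead to high-cost chat?", dataPoints: ["eVar10", "event20", "event30"] },
];

// Data points currently failing a data quality check (e.g. flagged by an audit tool).
const brokenDataPoints = new Set(["eVar10", "event20"]);

// These are the business questions you cannot answer today.
const blocked = mapped.filter(r => r.dataPoints.some(dp => brokenDataPoints.has(dp)));
console.log(blocked.map(r => r.question));
// -> ["How often does onsite search lead to high-cost chat?"]
```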

Then your stakeholders are going to go upstream, and they're going to go to their bosses and say, hey, how can we help the analytics team get this working? Because I am stuck, I can't answer the questions that I have. And I've found that that eventually bubbles its way up to the top. So the lesson here is if you can tie your data to your business requirements and then you can highlight your data quality issues, you can try to get more visibility into how data quality issues are actually impacting the whole business, and then hopefully get that on the radar of your executives and get whatever help you need to make sure you don't have data quality issues in the future. So the last one I want to talk about is training. So I've seen that many companies don't invest as much in training as they really should. 

And I think of training as the last mile of digital analytics. Even if you've done everything correctly in your implementation, if your end users can't effectively log in and use the tool and get the answers they need, then all that investment is really for nothing, and your perception as a team and the goals and achievements that you're going to get are going to suffer. Now, I've learned this firsthand. Last year, I actually trained up to 500 users at a large organization on Adobe Analysis Workspace. And I learned something really important. I think Analysis Workspace happens to be a really cool, snazzy tool. I'm pretty good at it, but when I trained end users on it, I was shocked at how difficult they found it. And I realized it's because I'm so used to it. I use the tool every day, and they were casual users.

They only use it every once in a while. So as I was training people, I would see their eyebrows crinkle, and they'd make this kind of frown face when they didn't understand what was going on. So much so that I ended up writing the two blog posts that you can see here. I actually wrote 10 pages worth of content to educate people like you on the things you need to think about when you're training people on Analysis Workspace. And I probably could've done a similar thing for Data Studio. But we take these tools for granted, and most of them can be very intimidating for casual users. So don't underestimate how complicated your users find these tools. Another way I've found to engage users in training is, instead of teaching them how to use the tool, consider going back to your business requirements list and training them on how to answer a business requirement.

Then they'll be more engaged in the training, and they'll actually have high motivation to learn. So for example, if one of the requirements here is, for visitors using both onsite search and support chat, which terms are the biggest culprit and how often is that happening, what you can do is engage them in a training class where you teach them how to build an Adobe Analytics segment, like the one here on the left, where I'm building a segment of people who searched and then did a chat within five minutes. That makes sense to them, and they don't realize that they've just learned how to build a sequential segment. Then show them how to build a table in Analysis Workspace where they can see that 3.14% of the time that people search, they end up doing a chat within five minutes, and if you sort in descending order, you can see which search terms they're searching for when they struggle the most.
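
To show the underlying logic of that sequential segment outside any particular tool, here is a rough sketch over raw hit-level data: for each visitor, find searches that are followed by a chat within five minutes, then count that by search term. This is my own simplification of the idea, not how Adobe Analytics implements segmentation:

```typescript
interface Hit {
  visitorId: string;
  timestamp: number;            // milliseconds since epoch
  type: "search" | "chat";
  searchTerm?: string;          // present on search hits
}

// Count, per search term, how often a chat follows a search within the time window.
function searchesFollowedByChat(hits: Hit[], windowMs = 5 * 60 * 1000): Map<string, number> {
  const counts = new Map<string, number>();
  const byVisitor = new Map<string, Hit[]>();
  for (const hit of hits) {
    const list = byVisitor.get(hit.visitorId) ?? [];
    list.push(hit);
    byVisitor.set(hit.visitorId, list);
  }
  for (const visitorHits of byVisitor.values()) {
    visitorHits.sort((a, b) => a.timestamp - b.timestamp);
    visitorHits.forEach((hit, i) => {
      if (hit.type !== "search" || !hit.searchTerm) return;
      const chatFollows = visitorHits
        .slice(i + 1)
        .some(h => h.type === "chat" && h.timestamp - hit.timestamp <= windowMs);
      if (chatFollows) {
        counts.set(hit.searchTerm, (counts.get(hit.searchTerm) ?? 0) + 1);
      }
    });
  }
  return counts;
}

// Sorting the result in descending order surfaces the terms where visitors struggle most.
```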

So this is a way that you can train them in context, and they'll really be more receptive and they'll learn more about the tool and how to answer their actual business questions. Another training aspect I've found that people skip out on is just helping their users understand what data has been collected. Going into the admin console of your analytics tool, there's usually an area where you can actually describe every dimension and every metric, but people don't fill it out. Just type in some help text so that when people are in a report, they understand what they're looking at. You can also build a data dictionary, so when people use your implementation, they have a document they can go to where they understand what all the terms mean. For Adobe users, I recommend that you actually build this right into an Analysis Workspace dashboard, so they can go into a project and see it while they're in the tool.
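
A data dictionary can be as lightweight as a list of component names, plain-English descriptions, and example values. Here is a hypothetical sketch of the kind of structure that could sit behind either a shared document or an in-tool reference project; the component names and descriptions are invented for illustration:

```typescript
// Hypothetical data dictionary entries (names and descriptions are illustrative).
interface DictionaryEntry {
  component: string;            // e.g. an Adobe eVar or a GA custom dimension
  friendlyName: string;
  description: string;
  exampleValues: string[];
}

const dataDictionary: DictionaryEntry[] = [
  {
    component: "eVar5",
    friendlyName: "Onsite Search Term",
    description: "The term a visitor typed into the site search box, lower-cased.",
    exampleValues: ["pricing", "return policy"],
  },
];

// Render a simple reference end users can read without opening the admin console.
const lines = dataDictionary.map(
  e => `${e.component} | ${e.friendlyName} | ${e.description} | e.g. ${e.exampleValues.join(", ")}`
);
console.log(lines.join("\n"));
```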

Another tip I'll give you is that most analytics tools, whether it's Google using filters or Adobe using virtual report suites, allow you to have different flavors of your implementation. So one of the things you can do is make a beginner, intermediate, and advanced version of your implementation, where you drag over different components. Beginners can see a slimmed-down version of your implementation so they don't get overwhelmed, then you have an intermediate version, and then maybe everything is in the advanced version. Here you can see that this implementation has 951 components in it. That might be way too much for a beginning user. So you can let them ease their way into it and not overwhelm them.

And then finally, one of the things that I like to do, just because sometimes I'm a little bit of a jerk, is work with companies to actually build an exam that says, you know what, if you want to have an ID or a login to our analytics tool, you have to both learn the tool and learn about our implementation. And so we're going to give you a training, but then we're going to make you take an exam. And if you don't pass the exam, then you're not going to get a login. If you give someone a login and they don't know what they're doing, it's kind of like having someone walking around with a loaded gun who doesn't know how to shoot. They can do real damage there. So that's another training technique that you can use.

So in summary, the three homework assignments that I would give you, touching on the three biggest problems I've seen keep organizations from having good implementations, are these: First, going back to the business requirements section, I would highly recommend that you create a list of business requirements, not an SDR, an actual list of questions or requirements that you've vetted with your stakeholders.

You can even get them to sign off on it, and I've outlined earlier the specific steps you can take to do that if you haven't created that list so far. Second, I would highly encourage you to go through all of your data points on a regular basis. Really check the data quality, and identify which of the business requirements from step one you can't answer today because of data quality issues, and I'm sure the folks from ObservePoint would love to help you with that in any way possible. Third, review your current approach to training and get some feedback from your end users. Do they really like the way you're training them? Are you giving them any training at all? And I would highly recommend that if you do build a business requirements list, think about doing training not in the tool, but in the context of your business requirements.

And I promise you that people will be much more eager to come to those classes and they'll be much more motivated to learn what you want to teach them. So those are my top three reasons, or things that you can do to try to avoid failing at digital analytics. Once again, my name is Adam Greco. If you have any questions, I've put my email address, my phone number here. There's a link to my LinkedIn page if you want to connect, or if you point your camera at the QR code, that will take you right to my LinkedIn page. And I'm happy to answer any questions that you have. So thank you so much for your time, and I hope you enjoy the rest of your Virtual Analytics Summit.
