Building a Data Foundation for Accurate Insights

July 16, 2020

One of the main challenges of attribution and actionable insights is maintaining clean, trustworthy data. With so many tools, tags, and teams affecting your data, ensuring data quality can seem like an impossible task. In this session, Matt Crupe and Eric Hansen will discuss how upfront data standardization and an analytics implementation built around business objectives can help you achieve the accurate, actionable data you need to drive your business forward.



Eric Hansen (00:07): 
Hello, and good morning to those on the West Coast, and good afternoon to those on the East Coast. My name is Eric Hansen. I am the Manager for Digital Analytics Product at Western Governors University. And I am presenting today with Matt Crupe on the topic of building a data foundation for accurate insights. 

Matt Crupe (00:29): 
Hi everyone. My name is Matt Crupe. I'm a Senior Technical Consultant with Adobe, and I'm looking forward to this joint webinar session with Eric. Thank you all for joining. 

Eric Hansen (00:42): 
Yes. So as we dive into this topic, it sounds very big and very overwhelming, but we'll try to break it down and talk about the problem behind this need for building a data foundation. The problem we're trying to solve is making good decisions for your campaigns and your marketing mix, ranging from tactical-level decisions all the way up to strategic marketing mix and budget allocation decisions. Associated with that problem are several questions. First, how do I build confidence in campaign performance measurement? Where do trouble spots and opportunities exist in my tactics? And what is the performance distribution across each layer of marketing: at our campaigns, at our individual ad groups, across different channels? These are all types of questions that our marketing stakeholders need answers to. 

Eric Hansen (01:46): 
The symptoms of this problem include inadequate and inaccurate data. At the top of that list is inconsistency. When you're missing a common framework for measuring and reporting on your campaigns, and marketing and analytics teams, even within marketing, have different perceptions of those campaigns and different definitions and approaches, that results in a lot of inconsistent approaches and inconsistent interpretations. Furthermore, those data are often incomplete, because digital is usually easier to measure than offline sources; the amount of effort put toward measuring these different marketing activities will differ based on how easy each one is to track. Third, teams are rightly concerned with measuring and reporting on their own performance, but that naturally creates silos in an organization, and marketing, as we all know, doesn't take place in a vacuum. So all of these different approaches to reporting and tracking will often result in a fragmented narrative and make it much more difficult to make the good decisions that need to be made. 

Eric Hansen (03:01): 
Fourth, data has different levels of granularity, and what different groups feel is important to measure differs. Some may only track overall campaign performance, while others might track the individual links in a single marketing email. One is obviously easier to implement than the other, but digging for insights is incredibly difficult if the granularity of data is not available, and consistently available, across all of your different marketing activities. Last, the data delays introduced by non-real-time data sets, and how quickly we are able to get at the data, often leave us as analysts and supporting personnel trying to back into the information and insights that we want. That can be a real challenge and further reduce confidence in the data that we're working with. 

Matt Crupe (03:59): 
So I'm sure most of you have experienced most of, if not all of, those symptoms that Eric just outlined. So how do we address the issue? The answer is data governance. The traditional definition of data governance is the coordination of people, processes, and technology in order to manage and optimize the use of data as a valued enterprise asset. But from a company's perspective, the idea of data governance typically looks like this jumbled set of words that we see on this slide, where there are a lot of different ideas, goals, and responsibilities, all without any sort of structure or well-defined process for how we get from point A to point B and ultimately drive value for the business. So why do we need data governance? In addition to all of the symptoms that we outlined previously, it's because the vision of our analytics implementation is quite often different from the reality. 

Matt Crupe (05:24): 
And to be a data-driven organization, you need to do three things. You need to develop trust in your data, you need to then drive action on that data, and then you need to create value for the business. In order to achieve those three things, we're going to take a look at the data governance maturity model, which consists of four pillars: organization and team readiness, data collection strategy, data health, and data democratization. Combined, these pillars make up the foundation required to build trust, drive action, and create value to support a data-driven organization. So today we're going to do a deep dive into the first two pillars, organization and team readiness and data collection strategy. 

Matt Crupe (06:22): 
So under organization and team readiness there are seven key factors. The first is a well-established analytics program with leadership sponsorship that's aligned with IT and marketing teams. Next, we have a data governance steering committee, which holds the organization accountable for adherence to all of your data governance policies. Then we have staffing levels, organization, and roles that effectively support an enterprise analytics strategy, with key roles such as data stewards, program leads, and a CDO. You also need a well-defined plan for your analytics program and strong ownership of the key pillars of data health, documentation, data collection, and data adoption, as well as an analytics center of excellence that fully supports the enterprise analytics needs. Power users should be identified, enabled, and empowered to support the technical and business needs of the organization. And then individual leaders are held accountable for all areas of analytics management and adherence to data governance practices. And so next, I'd love to hear from you, Eric, about how your team was able to develop this at WGU and how you overcame some of the challenges that come along with this particular pillar. 

Eric Hansen (08:04): 
Absolutely, Matt, I'd be happy to talk about our experience going through this process. As you can tell, that list of seven items is quite long; there are lots of different pieces that need to be put into place. But what we found at WGU was that there were really two key components that needed to be in place to get our program up and running. At WGU, we did not have a dedicated CDO. Our president, however, comes from Amazon stock and is a key proponent of our transition to a data-driven organization, and our CMO was the main executive who was willing to commit budget and leadership to this initiative. That is a crucial first step in activating toward a data-driven organization: garnering executive sponsorship at the highest level that you can. Now, granted, those executive sponsors won't have the time to help with setting up those pieces of the organization and the practice, but they will offer much-needed support and cover as you are growing that practice in your own organizations. 

Eric Hansen (09:11): 
After we identified that executive sponsor, we identified the director of optimization in our marketing department to be the business owner and partner on this initiative. And working with her, I was hired on in a new role to provide the technical leadership. Having business ownership and business leadership, as well as technical leadership, are key components, as Matt mentioned, for setting up those steering committees and the analytics center of excellence and having folks who are dedicated to that effort. These are critical roles that must be filled before you start; you need the right people in place to back you up when the inevitable conflicts and points of friction occur. However, we were just starting out, and we did not have the resources to build a full-fledged team from the get-go. 

Eric Hansen (10:07): 
We had to build grassroots support and adoption. And we found that the best way to do this was to appeal to stakeholders' self-interest through the establishment of a data governance committee involving key stakeholders from technical and development, operations, business users of our different marketing tactics and technologies, and the legal team, especially those focused on privacy and compliance. When we approached the technical, operations, and development teams, we talked to them about the benefits of meeting together to exercise data governance: it would allow us to optimize the tags we put in place and how we go about tracking the information we need to track, in a way that provides a net positive impact on our page load performance, ensuring that we never have tags that are unnecessary or out of place. When we talked to our business unit users, we presented this forum, the data governance committee, as a place where concerns could be addressed, needs could be expressed, and prioritization could occur, so that everyone is on the same page about the different opportunities and technologies available to them to help get the data they need. And then, last but not least, there is certainly the legal team. They were drawn to this committee by the opportunity to manage the organization's risk profile through understanding the privacy and compliance implications of all of our technologies, approaches, and analytics practices. 

Matt Crupe (11:45): 
Thanks, Eric, I really like how you called out the page performance concern. I think that's an excellent example of making sure that IT and the development teams are also aligned with the goals and tied into the analytics implementation. I think that's definitely a big and growing concern lately among organizations; I know it's one that I've been having to help address quite a bit recently. So next, we're going to drill into the data collection strategy, which consists of the six areas you see here: data value, data collection and compliance, data architecture, documentation, project intake, and accountability. So let's take a look at data value first. By having documented and shared KPIs across the business, you're ensuring that all users and business units are defining measurements the same way and that there's consistency and standardization in how you measure performance. It's key that all parties are on the same page about the business goals and the drivers for achieving those goals. And by tying each component of your analytics solution to specific KPIs, you help ensure that the data is answering the questions the business is asking, and that you're collecting only necessary data and not inundating your system with unusable data that could potentially cause confusion and/or have cost implications. 
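To make the data value idea concrete, here is a minimal sketch (the KPI names, variables, and owners are hypothetical, not from the webinar) of how a team might document the mapping between business KPIs and the analytics components that feed them, so that anything collected without a documented KPI behind it stands out.

```python
# Hypothetical KPI catalog: each business KPI names the analytics components
# ("inputs") that measure it. Any collected variable that appears in no KPI's
# inputs is a candidate for removal from the implementation.
KPI_CATALOG = {
    "application_starts": {
        "definition": "Visitor begins the enrollment application",
        "inputs": ["event12", "eVar5 (marketing channel)"],
        "owner": "Enrollment Marketing",
    },
    "cost_per_lead": {
        "definition": "Paid media spend divided by qualified leads",
        "inputs": ["event20 (lead submit)", "campaign tracking code"],
        "owner": "Media Buying",
    },
}

def unmapped_variables(collected, catalog=KPI_CATALOG):
    """Return collected variables that no documented KPI relies on."""
    used = {item for kpi in catalog.values() for item in kpi["inputs"]}
    return [v for v in collected if v not in used]

print(unmapped_variables(["event12", "event99 (purpose unknown)"]))
# -> ['event99 (purpose unknown)']
```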

Matt Crupe (13:38): Next up, we are going to talk about data architecture. Once you define what data is going to be collected, you have to make sure that the way it's stored and shared within your organization also fits your business's needs. By considering report suite architecture when deploying new analytics solutions, you keep the quality of your data intact even when business units are siloed. It's important, as the analytics owners, to advocate and act on behalf of all users regardless of business unit, and not all data needs to be collected in the same dataset. By having a report suite architecture defined, users will be able to clearly distinguish where different data sets live and which one they need to use to answer their business questions, whether it's regarding their website data, mobile app data, or what have you. So I'd like to ask you, Eric: how did you tackle data architecture concerns and challenges within WGU? 
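As an illustration of that point, here is one way (the suite names and scopes below are hypothetical examples, not WGU's actual setup) to document a report suite architecture so users can tell which dataset answers which business question, for instance keeping web and mobile app data in separate suites that roll up to a global suite.

```python
# Hypothetical report suite architecture: separate suites per property plus a
# global rollup. Documenting this keeps users from querying the wrong dataset.
REPORT_SUITES = {
    "example-prod-web":    {"scope": "Marketing website",     "rolls_up_to": "example-prod-global"},
    "example-prod-app":    {"scope": "Student mobile app",    "rolls_up_to": "example-prod-global"},
    "example-dev-web":     {"scope": "Web staging / tag QA",  "rolls_up_to": None},
    "example-prod-global": {"scope": "Cross-property rollup", "rolls_up_to": None},
}

def suites_for(question_scope):
    """Look up which suites cover a given scope, e.g. 'mobile app'."""
    return [name for name, meta in REPORT_SUITES.items()
            if question_scope.lower() in meta["scope"].lower()]

print(suites_for("mobile app"))  # -> ['example-prod-app']
```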

Eric Hansen (14:52): 
What we realized was that WGU needed something even before this level of data architecture: a common taxonomy to help us understand where to go next from a technical perspective. This is a crucial prerequisite, because the best data collection in the world can't help our organization if we're all speaking different languages. There are many different things to keep in mind here. So before we dove too deep into building out individual report suites or naming conventions, we worked on creating a hierarchy of our marketing channels and the different pieces of metadata relevant to us, gathering that together in a simple spreadsheet and negotiating what should be included as a particular channel or marketing activity we were interested in tracking. That served as a dictionary or a reference guide of sorts, so that when we started talking about our business requirements and our technical specifications, everyone was speaking the same language. What this looks like at your individual organization will certainly be different depending on what's important to you. But we had messaging pillars, we had topics we wanted to track, and we had specific targeted devices: are we offering a mobile campaign versus a desktop campaign? All of these different bits of information need to be outlined and delineated in a clear, easy-to-understand way before you get too far down the road of developing a technical implementation. 
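For a sense of what such a shared taxonomy might look like in structured form (the channels, pillars, and devices below are illustrative placeholders, not WGU's actual spreadsheet), a simple dictionary can play the same role as the negotiated reference guide: one agreed-upon hierarchy that every campaign and tracking requirement is validated against.

```python
# Hypothetical marketing taxonomy: the "dictionary" everyone agrees on before
# tracking codes or report suites are built. All values are examples only.
TAXONOMY = {
    "channels": {
        "paid_search": ["google_ads", "bing_ads"],
        "paid_social": ["facebook", "linkedin"],
        "email":       ["newsletter", "nurture"],
        "offline":     ["radio", "direct_mail"],
    },
    "messaging_pillars": ["affordability", "flexibility", "outcomes"],
    "target_devices": ["desktop", "mobile"],
}

def validate_campaign(campaign):
    """Check that a proposed campaign only uses terms defined in the taxonomy."""
    errors = []
    if campaign["channel"] not in TAXONOMY["channels"]:
        errors.append(f"unknown channel: {campaign['channel']}")
    if campaign["pillar"] not in TAXONOMY["messaging_pillars"]:
        errors.append(f"unknown messaging pillar: {campaign['pillar']}")
    if campaign["device"] not in TAXONOMY["target_devices"]:
        errors.append(f"unknown target device: {campaign['device']}")
    return errors

print(validate_campaign({"channel": "paid_social", "pillar": "outcomes", "device": "mobile"}))  # []
```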

Matt Crupe (16:33): 
Excellent. Next up, we'll dive into collection and compliance. You have to think about how the data will be collected and make sure that all privacy and compliance standards are adhered to in the process. Strong governance around how data should be collected should be scalable and driven by your organizational goals. Documenting all compliance-related data collection activities will expedite the process if future audits are needed or if changes occur in data collection and compliance laws. When your analytics solution adheres to compliance standards, it's going to be easier to keep up with policy changes and regular audits. So making sure your data compliance standards are maintained in solution documentation is going to be a very important part of the data governance model. And that brings us to our next area of focus, which is documentation. Solution documents are foundational to a successful data collection strategy. 

Matt Crupe (17:49): 
And they're a critical piece of an analytics solution. If data compliance standards are not being maintained in your solution documentation, your organization is going to run the risk of disparity between your implementations, which increases the likelihood of having to perform additional discovery work, impacting time and cost for any new implementations or audits. You can see here under the steps section that we outline three of the critical pieces of documentation. The business requirements document, or BRD, is a document that outlines all of the specific tracking requirements for your organization; it spells out the questions the business is asking and what type of analysis you're looking to be able to do with the data that ultimately gets collected. Then we have the solution design reference, or SDR, which is a more granular document that typically outlines all of the unique variables and events that are going to be used to satisfy the requirements outlined in the BRD. 

Matt Crupe (19:02): 
This is going to be key to maintain over time, so you know what makes up your implementation: how many variables are you using, how many do you have left to satisfy future requirements, and so on. And then finally we have the technical specification, which is a very detailed document that outlines how the entire implementation is put together, from the report suites that are being used, to marketing channel configurations, to your tag manager and how each piece of data is getting collected. And so next, I'm gonna hand it off to Eric to talk a little bit more about how WGU has implemented these types of documentation into their methodology. 
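To make the relationship between the BRD and the SDR concrete, here is a minimal sketch (the variable assignments and requirement IDs are hypothetical) of SDR-style rows in which each variable or event points back to the BRD requirement it satisfies, so it is always clear what is in use and what is still available for future requirements.

```python
# Hypothetical Solution Design Reference (SDR) rows: each analytics variable
# or event is tied back to the business requirement (BRD item) it serves.
SDR = [
    {"variable": "eVar5",   "name": "Marketing Channel Detail", "brd_req": "BRD-012", "status": "in use"},
    {"variable": "event12", "name": "Application Start",        "brd_req": "BRD-003", "status": "in use"},
    {"variable": "eVar40",  "name": "(unassigned)",             "brd_req": None,      "status": "available"},
]

def available_variables(sdr=SDR):
    """Which variables are still free to satisfy future requirements?"""
    return [row["variable"] for row in sdr if row["status"] == "available"]

print(available_variables())  # -> ['eVar40']
```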

Eric Hansen (19:52): 
Yeah, thanks, Matt. I know there are many ways to build these documents. I think the most important thing to keep in mind is that BRDs, SDRs, tech specs, all these acronyms, are living and breathing documents, and what works for your organization may evolve over time as it grows in its analytical maturity. When we first started out, you know, oftentimes you'll see BRDs as an Excel spreadsheet or a Word document, but we used a template like the one you're seeing on the screen here to facilitate a conversation of sorts, to help stakeholders bridge the gap between what their marketing or analytical role is and how we can go about helping them fulfill it. What we found to be most helpful, though, was actually the introduction of mockups during the business requirements gathering phase, to give people a sense of what the final desired end state would be, what the final reporting output might look like. That gives people something tangible to react to or to give feedback on. 

Eric Hansen (21:00): 
And this follows one of the core principles of design thinking; it's very useful to be able to prototype and make sure that people are heading in the right direction when you're architecting your analytical practices. In terms of solution design references, there are multiple tools available to you. Excel documents are great, but it's also really difficult to keep on top of the changes that need to be made to that documentation, and a stale SDR is of little value to you. So it's important to develop the options and tool sets that make managing the SDR easier and more accessible to everyone. There's the Adobe Analytics health dashboard; that's what we started with at WGU, and you can acquire it from Adobe Consulting Services if you're interested, I know it's out there. We then moved to the ObservePoint Labs add-on for Google Sheets, which allowed us not only to download the data from our Adobe instance but also to upload changes back to Adobe. And most recently we integrated with our institution-wide data governance tool set, which is called Collibra, and that has been a real benefit to us in making sure we have the proper infrastructure support for improving our data governance approach. 
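The tools Eric names handle this synchronization for you; purely as a hedged sketch of the underlying idea of detecting a stale SDR, the snippet below diffs a documented SDR export against a current export of live variable definitions. The file names and column headers are hypothetical; substitute whatever export your admin console, API, or governance tool actually produces.

```python
import csv

# Hedged sketch: flag "stale SDR" rows by diffing documented variable names
# against a current export of live variable definitions. File names and column
# headers ("variable", "name") are hypothetical placeholders.
def load_names(path, key="variable", value="name"):
    with open(path, newline="") as f:
        return {row[key]: row[value] for row in csv.DictReader(f)}

def stale_rows(sdr_path="sdr.csv", live_path="live_variables_export.csv"):
    documented, live = load_names(sdr_path), load_names(live_path)
    return {
        var: {"documented": documented.get(var), "live": live.get(var)}
        for var in documented.keys() | live.keys()
        if documented.get(var) != live.get(var)
    }

if __name__ == "__main__":
    for var, diff in stale_rows().items():
        print(f"{var}: SDR says {diff['documented']!r}, implementation says {diff['live']!r}")
```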

Matt Crupe (22:27): 
Yeah. I loved your description, "stale SDR." So while creating these documents in the beginning is extremely crucial, it's also crucial that you stay on top of them. Like you said, Eric, treat them as living documents and make sure they continue to be updated throughout the life of the implementation. Next up, we're gonna dive into project intake. As your data collection strategy program gains momentum, you're naturally going to experience an increase in analytics project requests, and a formal way to manage the requests is going to allow for prioritization, resource allocation, and expectations management. Project intake is going to help reduce those "track everything" requests and teach the organization how to be smarter about what they track. It should lead to conversations about how the request ties back to company KPIs, to help filter out some of those less crucial requests, or those requests that ultimately aren't going to drive any value for the business. And, as site functionality and business needs are ever changing, a clearly defined process for requirements gathering and implementation change requests is going to be vital. So every organization's intake process is probably going to be different, but Eric's going to share a little bit about WGU's process and how it will hopefully be able to help you in your organization. 

Eric Hansen (24:09): 
Absolutely. What we found worked was to set expectations early on, using the data governance committee to communicate those expectations and manage the overarching process. We used agile analytics, agile scrum, to manage and prioritize the requests as we received them. And we used something like Google Forms, well, right now we're using Microsoft Forms, to intake all those requests, making sure that we ask the questions that are needed right upfront to reduce the back and forth that commonly happens with analytics requests. 
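As a sketch of the kind of upfront questions Eric describes (the field names below are illustrative, not WGU's actual form), an intake request can be validated so that anything missing a business question or a KPI tie-back gets bounced back before it enters the backlog.

```python
# Hypothetical analytics project intake: require the answers that normally
# cause back-and-forth (business question, KPI tie-back, scope, timing) before
# a request enters the backlog.
REQUIRED_FIELDS = ["requestor", "business_question", "related_kpi", "pages_or_flows", "needed_by"]

def validate_intake(request):
    """Return the list of required fields that are missing or empty."""
    return [f for f in REQUIRED_FIELDS if not request.get(f)]

request = {
    "requestor": "paid.media@example.edu",
    "business_question": "Which ad groups drive application starts?",
    "related_kpi": "application_starts",
    "pages_or_flows": "",          # missing: which pages or flows are in scope
    "needed_by": "2020-09-01",
}
print(validate_intake(request))  # -> ['pages_or_flows']
```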

Matt Crupe (24:45): 
And finally we get to accountability. Overall, the analytics team is fully accountable for data collection strategy and practices. And by the analytics team, I'm referring to the boots on the ground, the analysts and the technical team members who are responsible for the implementation, and they should be reporting back to the analytics leadership team. Overall, everyone should be responsible for adherence to the data governance program practices that ultimately get put into place. A part of every implementation should be testing and validating, and this should be an ongoing process, from the implementation phase of a project all the way through to ongoing testing, to ensure that data continues to be collected the way it was designed to be, that the data coming in is still driving value, and that there are no issues with the implementation. And this is where we come to using and configuring automation tools to help with this process on a regular schedule, where you can leverage some of ObservePoint's tools like Touchpoints and Web Audits and Web Journeys to constantly test your implementation and ensure that the quality of your data continues to drive value for your business. 
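Purely as a conceptual sketch of the validation rule such tools automate (the page path, parameter names, and expected values are hypothetical, and this does not represent ObservePoint's actual API), the check boils down to comparing a captured analytics beacon against what the SDR says should fire on that page.

```python
# Conceptual sketch of ongoing implementation testing: given a captured
# analytics beacon (represented here as a dict of query parameters), confirm
# that the variables the SDR expects on a page are actually present.
EXPECTED_ON_PAGE = {
    "/apply/start": {"events": "event12", "v5": None},  # v5 only needs to be present
}

def validate_beacon(page, beacon, expected=EXPECTED_ON_PAGE):
    """Return a list of problems found for this page's beacon."""
    problems = []
    for param, required_value in expected.get(page, {}).items():
        if param not in beacon:
            problems.append(f"missing {param}")
        elif required_value is not None and required_value not in beacon[param]:
            problems.append(f"{param} does not contain {required_value!r}")
    return problems

print(validate_beacon("/apply/start", {"v5": "paid_search", "events": "event12,event3"}))  # []
```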

Eric Hansen (26:34): 
And what you see on the screen here is an overall timeline of how we went through our own process, from assigning our stakeholders and defining our taxonomies all the way through gathering and delivering valuable information that was sliceable and diceable into different channels, platforms, and other pieces of metadata. The main benefits we achieved were these: we were able to identify the slices of marketing activity that worked well versus those that didn't; we supported reporting at both a tactical level and the highest level, inspiring more confidence; we combined this with tag governance, which reduced wasted page load performance; we drove efficiency and performance metrics in a positive direction over the long run; and we laid the foundation for multi-touch attribution to come into play, whether you're using Adobe Analytics Attribution IQ, Strala JourneyStream, or your own data warehouse. And with that, we'd like to thank you for your attention and time today. If you have any questions, feel free to reach out to us via email or LinkedIn. I'm also on #Measure Slack. Thanks for your time.
