John Lovett, Search Discovery - Data, data, everywhere: Leveraging an actionable data strategy to understand your marketing ROI

October 18, 2018


Slide 1:

Hi everybody, I hope you're enjoying the Analytics Summit! I'm here to talk to you today about data strategies and my presentation, Data, Data, Everywhere... Building Executable Data Strategies. Before I start, I just want to tell you a little bit about Search Discovery.


Slide 2:


We help companies use data to drive business impact. And that's really how I think about this: using all the data available to you, and the data that matters, and finding out what's really most important and what you need to be thinking about.


Slide 3:


So we'll talk a little bit today about what it means to have an executable data strategy. This is a phrase that I coined because oftentimes I found that data strategies are these big monolithic creatures that sit on the shelf and gather dust. They're hard to deliver. They're usually multi-year processes that don't often get acted upon. So we came up with the executable data strategy as a way to take a bite-size chunk out of coming up with a plan and using data with purpose. And that's really the simplest definition here: it's a plan for using data with purpose.


Those strategies start with the vision of where you want to go. That's my little telescope there in the bottom corner. And then the end results are your desired outcomes; that's the moon there. So your executable data strategy might be: let's put somebody on the moon. Hopefully yours will have more to do with using your data for business impact, but think about: What is your vision? Where do you want to get to? What do you want to achieve? And what does success look like? That's really how we think about executable data strategies.


Slide 4:

But a data strategy also includes 3 critical components: understanding your data, architecting your data to make it useful, and then activating your data for business impact.

So that's my full definition for the data strategy, and what I’d like to do today is kind of talk you through some of the modern data challenges and how using an executable data strategy can help you overcome these challenges and really use data with purpose.


Slide 5:

So let's jump in here. The 3 modern data challenges, and I'm betting that you've heard these before in some form or fashion.


Slide 6:


The first thing that we see among a lot of our clients is data literacy. This is really one of the biggest challenges out there today: simply thinking about how we understand data, how we utilize data, and how we get our fellow business users to understand the data and the analytics that we talk about. Because let's face it, as analysts, oftentimes people look at us like we're speaking a different language when we start talking about metrics and dimensions and segments. They simply don't know what we're talking about. So many companies today are looking to elevate literacy across their organizations. But it's no easy task.


Slide 7:


There's an explosion of data today. This came from IDC's Data Age study, and when you look over the course of time, data is just ramping up into the zettabytes. It's really this massive ramp, and in thinking about how we build data literacy with so many data sources, this data is everywhere. Whether it's coming from our web analytics tools, whether it's coming from sensors, whether it's coming from the multitude of devices that we have out there today. It can come from so many different places. So how do we deal with all that? How do we build literacy around it?


Slide 8:

And what it boils down to is that data literacy is very much a multifaceted thing. The way we at Search Discovery think about this is in three different terms. First, you have your metrics literacy, and that's really about knowing what different data means. Following that, there's tool literacy, and that's being able to self-service data needs efficiently and to an appropriate extent. So when does it make sense to go in and get your own data? When does it make sense to have your analysts provide data to business users around the organization? And the third one, conceptual literacy, is approaching and applying data with clarity and sophistication. This is really the deepest level: understanding, at a conceptual level, what you can do with data.


So, for the purposes of this presentation, I'm going to talk a little bit more about metrics literacy and conceptual literacy. We'll skip tool literacy for now, because there's a lot to dive into there and I don't have quite enough time.


Slide 9:

When it comes to metrics literacy, there are really two levels. You've probably had these conversations where you say, "Well, visits increased but conversion rate decreased." When you're talking to a business stakeholder, them being able to say, "Okay, I get what you're talking about," that's being metrics literate: a general understanding of a particular metric or dimension.


And really, you also want to have a true and accurate understanding of a particular dimension or metric. I know you've experienced this: you talk about something like conversion rates, or bounce rates, or any of the other metrics that you use on a regular basis, and you may be saying one thing, but a stakeholder on the other side of the table, or the other side of the conversation, might be thinking about something else. So really, metrics literacy is having that understanding, and ultimately you want to get to the true and accurate understanding of what that metric is. That's the other level there.
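To make that concrete, here's a minimal sketch, with hypothetical numbers, of how the same phrase "conversion rate" can mean two different things depending on whether you divide by visits or by unique visitors:

```python
# Hypothetical traffic numbers -- the point is the definition, not the data.
visits = 10_000        # total sessions in the period
visitors = 6_500       # unique visitors in the period
conversions = 325      # completed orders

rate_per_visit = conversions / visits      # what the analyst may mean
rate_per_visitor = conversions / visitors  # what a stakeholder may assume

print(f"Per visit:   {rate_per_visit:.2%}")    # 3.25%
print(f"Per visitor: {rate_per_visitor:.2%}")  # 5.00%
```

Both numbers are "the conversion rate," which is exactly why a shared, written definition matters.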

Slide 10:

On the conceptual literacy side of things, this is really being able to think about, "Okay, we have data, how can I use that data to benefit my business? To drive business impact?" And whether that's making money, saving money, or keeping customers happy, it's really understanding the purpose of your KPIs. So many organizations that I talk to say, "Oh sure, sure, we've got our KPIs, we've got lots of metrics." But they don't have targets assigned, they don't have details behind their metrics. They have so many KPIs that they've really lost the purpose or the rationale behind why they're tracking those metrics.


So, conceptual literacy is understanding which metrics or key performance indicators are actually indicative of your business moving, of making progress towards those desired outcomes we talked about. It's also about being able to clearly articulate and prioritize hypotheses. This is thinking, "I believe something, and if I'm right, I'm going to take this action." The ability to come up with a plan using your data, and to use your data to prove out what it is that you're looking to accomplish. That's conceptual literacy: to think through, "I know I have this data, I know I have this information; how can I use it to build my business case, or to improve my products or services before they get to customers or before they go out into the field?"


Additionally, when we talk about conceptual literacy, it's about having a reasonable degree of comfort with different approaches for using data. There are lots of different applications, and increasingly we're seeing new ones evolve. So conceptual literacy is being comfortable with the different types of data utilization and being able to say, "Okay, we've used data for testing purposes; we've used a control versus a test, and we're able to see that one works better than the other." Or perhaps you're using data to prove out that business case. Or maybe you're using data to optimize some of your marketing spend. There are so many different approaches, and being able to understand and have an appreciation for those is really what conceptual literacy is about.
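As a sketch of that control-versus-test idea, here's a hedged example: hypothetical numbers run through a standard two-proportion z-test (not any particular vendor's method) to check whether the test variant really "works better":

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Z-statistic and two-sided p-value for comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical campaign: control converts 300/10,000; test converts 360/10,000.
z, p = two_proportion_z(300, 10_000, 360, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # significant at p < 0.05
```

Having enough comfort to read a result like this, rather than just eyeballing two percentages, is part of what conceptual literacy means.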

Slide 11:


So this is where we say "to be data literate means that you must understand your data..." And really, this is an important concept, because it goes back to the executable data strategy.

Slide 12:


So the executable data strategy is a plan for using your data with purpose, and it includes understanding the data across all of these channels. You could have web, email, social, in-store data, call center data, enterprise data, video, over-the-top devices... so many different sources of data, and you need to be able to understand all of them.


And you should be asking questions of the data: What types of data do we have, and what types do we need? Who has access to that data? Do users have a true and accurate understanding of the data? Can they self-serve? And is there a high degree of comfort in using that data? All of these things are about understanding your data, and when you can increase literacy, all of these things get easier for your stakeholders, for the users at your company, for people who are not well versed in analytics, so that they can get comfortable with that data and start to use it. And that in turn feeds your strategy.


Slide 13:


Alright, let’s talk about the second component here: data silos. This is a major challenge for organizations.


Slide 14:


So, 82%, this is a number that comes from a Forrester study that was conducted on behalf of Dun & Bradstreet.


Slide 15:


And 82% represents the greatest challenge that organizations have: managing data and sharing insights that drive actions across organizational silos. These silos are present when you think about the emergence of all the different marketing technologies available today. So many different tools: you've got analytics tools, you've probably got data quality tools, you've got email marketing tools, you've got marketing automation tools, and you've got all sorts of analysis tools available to the modern analytics professional. And each one of these sources oftentimes creates its own silo of data. This presents challenges. One of the biggest is that you can't see across all of the data sources to get a holistic view, and that's a big deal for most organizations when you're thinking strategically about data. If you've got all of these pockets or silos of data, it makes it difficult to see the big picture.


Slide 16:


So, these silos create significant hurdles for organizations. Only half of marketing and sales decisions today are based on data, and that’s just not enough. When we talk about being data driven, you hear so many companies say, “Oh, we want to be data driven” or “We’re working towards being data driven.” The reality is that only half of them are really doing that today.


This study found that B2B marketers and sellers lack complete and accurate data, so they don't get that full picture of what's happening across the entire organization. An example of this: you've got your behavioral data coming from your web analytics tool, but it doesn't match up with your transactional data. Somebody may be looking at the website, browsing, but then they pick up the phone and call your call center to actually place the order. If those two streams don't match, you don't have a complete and accurate picture, and your data doesn't reconcile.
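Here's a hedged sketch of that reconciliation problem: joining web behavior and call-center orders on a shared customer ID. The field names and records are hypothetical, not from any specific tool.

```python
# Two streams that don't automatically line up: browsing happens on the web,
# but the order is placed over the phone.
web_sessions = [
    {"customer_id": "c1", "pages_viewed": 12},
    {"customer_id": "c2", "pages_viewed": 3},
]
call_center_orders = [
    {"customer_id": "c1", "order_total": 149.00},
]

# Index the orders by customer, then attach them to the behavioral records.
orders_by_customer = {o["customer_id"]: o for o in call_center_orders}
joined = [
    {**s, "order_total": orders_by_customer.get(s["customer_id"], {}).get("order_total")}
    for s in web_sessions
]
# c1 reconciles to a $149 phone order; c2 has behavior but no matching transaction.
```

Without a shared identifier like this, the two streams never reconcile and the picture stays incomplete.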


So, this is problematic. Poor data management also stands in the way of accomplishing your business objectives, your outcomes. If you don't have a good handle on your data, how it works together, and how it's integrated, then you're not going to get that complete picture, and you're not going to understand how you're moving customers from that vision all the way through to your desired outcomes.


Slide 17:


So, to break down these data silos, you really need to do data management. And this starts as a people strategy: top-down support and self-service analytics capabilities. The ability to put data in people's hands, the ability to say, "We are going to become data-driven, and here are the capabilities that we're putting in your hands to enable this."


It's also about integrated processes that encourage consistency and action. The ability to say, "We have a system for requesting new data, we've got a system for tagging marketing campaigns, we've got a system in place for controlling data quality," and making sure that all of those tags you have on your pages are firing as you expect and when you expect them to.
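One small way to make a campaign-tagging system enforceable is to validate tags against the agreed convention before they go live. The convention here (channel_campaign_yyyymm) is purely hypothetical, not a standard:

```python
import re

# Hypothetical naming convention: lowercase channel, campaign name, then yyyymm.
TAG_PATTERN = re.compile(r"^(email|social|paid|display)_[a-z0-9]+_\d{6}$")

def validate_campaign_tag(tag: str) -> bool:
    """Return True if a campaign tag follows the agreed naming convention."""
    return bool(TAG_PATTERN.match(tag))

print(validate_campaign_tag("email_springsale_201810"))  # True
print(validate_campaign_tag("Email Spring Sale"))        # False
```

A check like this, run before a campaign launches, is the kind of integrated process that keeps the data consistent downstream.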


And it’s also about having the right data and the right technology power to accelerate those insights that you’re looking for.


So, data management becomes a big component here, and you can break down those data silos if you've got processes in place and a support system, as well as the oversight to manage all of this.


Slide 18:


So, to break down the data silos you really need to architect your data.


Slide 19:


And that's the second component of our executable data strategy definition: it includes architecting your data. When we think about this, there are really four components: data storage, transformation, visualization, and sharing are all aspects of that data architecture that you can be working towards. Each of these components is part of your executable data strategy, and it will help you break down the silos by knowing: Where does the data live? How are you transforming it, visualizing it, and sharing it?


And this leads to questions like: What data connections and APIs exist? What transformations are necessary for the analysis we're trying to do? Are there integrated processes across data sets? And when do data refreshes and sharing happen? When you're thinking about "How are we really doing this?" and "How are we architecting this data?", these are some of the questions you should be asking yourself.
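As an illustration of the transformation question, here's a minimal sketch of normalizing one raw record before it joins other data sets. The source field names and formats are hypothetical:

```python
from datetime import datetime

def transform(raw: dict) -> dict:
    """Normalize a raw source record into consistent IDs, dates, and numbers."""
    return {
        "customer_id": raw["CustID"].strip().lower(),
        "event_date": datetime.strptime(raw["Date"], "%m/%d/%Y").date().isoformat(),
        "revenue": float(raw["Revenue"].replace("$", "").replace(",", "")),
    }

record = transform({"CustID": " C42 ", "Date": "10/18/2018", "Revenue": "$1,250.00"})
# {'customer_id': 'c42', 'event_date': '2018-10-18', 'revenue': 1250.0}
```

Small, explicit transformations like this are what make data from different silos joinable at all.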


Slide 20:


Let’s talk about the third challenge that we see out there. And this is trust in data.


Slide 21:


So, this is a different study conducted by Forrester on behalf of KPMG, and what they’ve found is that only 35% of organizations that they surveyed have a high level of trust in their organization’s use of data and analytics.


Slide 22:


So trust is such a big factor. I can't tell you the number of times I go into a client and they say, "We can't really trust the data, we're not sure about it, the data quality is not there; we've had an implementation for a while, but the numbers are sketchy." There are so many examples of breakdowns in trust when it comes to data that it's rampant across the industry. There are lots of tools out there that can help you elevate that trust, but you need to be asking yourself, at a strategic level: do you trust the data?


Slide 23:


And right now, trust is a defining factor in an organization's success or failure, as more and more organizations talk about data-driven effectiveness, using data to drive business impact. If you can't trust the data and the analytics that you have, you're going to fail. So think about how you build that trust. Furthermore, as machines start to work in parallel with people, there is a crystal-clear need for proactive governance in order to build trust. You have to have that. People aren't going to relinquish control to the machines; they're not going to trust artificial intelligence and machine learning unless they have a high degree of trust in the data behind it. Trust really becomes a prerequisite for using and activating your data.


Slide 24:


So instead of "Do you trust the data?" it needs to become "In data we trust." Trust needs to be instilled from the top down, leading from the senior-most executive in your organization, by setting rigorous strategies and processes to maximize that trust. "Oh, of course we've got a data quality control process in place here." That builds trust in the data. "Here's the reporting. Here are the sources that I use. This is the way that we govern this data. This is the way that we provide access." People begin to trust that. They know that not just anybody has it, that they can't pull data from one system, manipulate it, inject bias into it, and present it somewhere else. You've got the protocols and practices in place to manage trust.


Trust in data is like trust in people: companies build it over time. This isn't something that happens automatically. It definitely takes time to build this up and work towards a level of trust in the data that you have.


Slide 25:


So, “To activate your data, you must first build that trust.”


Slide 26:


So, again, as we think about the executable data strategy, it's a plan for using data with purpose that includes activating your data. And here we've got automation, personalization, real-time processing, and then trust. The questions you should be thinking about include: Where can data trigger automated actions, and what's a good use for that? How do we use data for targeting and personalization? When is the right time to identify customers and inject messaging that they can trust, so that our internal users can see it and have confidence that we got the right message at the right time to those consumers? And then, how do you build that trust in data? I talked about several of the ways you can do it: over time, you need to make sure you've got the processes in place, and the technologies that can help you, to build that trust.
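A hedged sketch of "where can data trigger automated actions": a single hypothetical rule that fires a reminder when a cart has sat abandoned for more than an hour.

```python
from datetime import datetime, timedelta

def should_send_reminder(cart_updated_at: datetime, purchased: bool,
                         now: datetime) -> bool:
    """Trigger a reminder only for carts abandoned for more than an hour."""
    return (not purchased) and (now - cart_updated_at) > timedelta(hours=1)

now = datetime(2018, 10, 18, 12, 0)
print(should_send_reminder(datetime(2018, 10, 18, 10, 0), False, now))   # True
print(should_send_reminder(datetime(2018, 10, 18, 11, 30), False, now))  # False
```

Real activation platforms wrap rules like this in far more machinery, but the core idea is the same: data, plus a condition, triggers an action.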


Slide 27:


So, let's look at a real-world success. The one I'll talk about here is a multinational energy corporation; it's a mini case study that I'll walk you through.


Slide 28:


The problem that this energy corporation had: business stakeholders wanted to track and understand customer interactions across all of their channels and touchpoints, and to personalize, to develop a best-in-class omnichannel customer experience. But here was the challenge in front of them: the systems they could use to deliver that dynamic experience were available, but they weren't configured, nor were they integrated. They were disparate and disconnected.


So, two of our big problems here: they had data silos and there was a data literacy problem. They had to be able to integrate these things together to make people aware of what they had in front of them. And of course they also needed to be able to build trust to be able to deliver this best-in-class experience.


Slide 29:


So, when we looked at the solution scope that we built, there were really three key components.


The first was data capture. They looked at the tagging specifications, they reviewed what they had, and they looked at the consistency of their naming conventions. They also worked towards data compliance and an appraisal of the data they had. They made sure that their tagging implementation was set up across channels, looking at mobile, desktop, and communications. All of these things fall under the category of understanding your data. They really needed to get a good sense of what they had in front of them, what was available, in that data capture phase.


The next phase was the data syncing, processing, and actuation. Here, in this case using the Adobe tool stack, they were provisioning Audience Manager and integrating it with Adobe Analytics and Adobe Campaign. They were working towards integrations, syncing the customer identifier with their backend channels and devices so they could recognize the individual across all of their different touchpoints, and they were integrating additional customer attributes from backend systems. And finally, they did some testing to make sure they could configure the triggers for different tests, so that they could recognize different types of users as they came in. All of this was about architecting the data. This whole process was about making sure they had the integrations, making sure the processes worked together, and making sure they could identify who those customers were.


The third component was the reporting and content delivery. They went through great effort to create and configure report suites, taking a lot of time to set this up and make sure they had data they could distribute to their internal users, so those users could see and understand what was happening. They created test communications in Campaign so they could push them out via email and push notifications. They tested content in Adobe Target so they could make offers. They were doing reporting and visualizations, and they reconciled all of that data. So this was the activation phase: making sure that everything came together, and that they had the tools and technologies they needed to push this out and make good use of the data they had spent so much time understanding and architecting.


Slide 30:


And here's what this looks like. Again, I mentioned that they were using the Adobe Marketing Cloud, but they had lots of different customer data, and they had a data-as-a-service component. They had anonymous real-time data as well as second- and third-party data. And they were communicating via apps, desktop, email, push, SMS, and a number of paid media outlets. For them, the integration came together as they used the suite of Adobe products; they made a concerted effort, and they had a strategy behind this to put everything together.


Slide 31:


And the end results created business impact for them. They were able to do omnichannel customer targeting and personalization. Their media targeting got better, and they saw efficiency gains in the way they went out and acquired new customers. They were doing better A/B testing and conversion rate optimization. In addition, they enhanced communications: the triggers they used and the ways they were able to communicate with their customers were dramatically improved. And finally, the omnichannel customer-level measurement and segmentation got stronger as well. They had a better understanding of the people they knew, and a better understanding of the people they were getting to know. They could identify anonymous users, put them in segments with other like users, and do better marketing that way.


So this really came together for them. It was a strategic approach. They were understanding their data. They were architecting it, and they were activating its use. And the end result was that they achieved business impact.


Slide 32:


Okay, the next real-world success that I'm going to talk about: this is a non-profit American public broadcaster, and again, it's a bit of a mini case study.


Slide 33:


So, the challenge in front of this organization: they had weekly reports and ad-hoc data requests that took way too much time and effort to produce. Their programming analytics was a huge operational lift. It took a great deal of time, and most of the analysts on their team, instead of doing analysis, spent their time cutting data and producing reports. As a result, business decisions tended to be made on intuition rather than data. They fell into that other 50% that wasn't using data.


And for them, data literacy across the organization and the member stations was lacking, so they fell victim to some of those challenges. The task in front of us was to build out a sustainable plan for delivering data, insights, and recommendations to their stakeholders, their producers, and their member stations. We wanted to use multiple digital data sources to perform analysis that answered their key business questions.


Slide 34:


So, how did we go about doing this? Again, we had to start by focusing the vision. This is an example of a whiteboard exercise (we actually recreated the whiteboard). This organization had a great vision, and they had a great mission as well. Every single person at the organization could speak to what they were working towards, what their vision and mission were, but they didn't have a strategy. They didn't have desired outcomes to work towards. So we spent a lot of time talking with them about what mattered to their organization, to discern how to get from their vision to what they were really trying to achieve.


Slide 35:


And it broke down to some simple things. We looked at the audience, sustainability, and their mission. We determined that connecting the audience data and analysis to the local level was very important to them. Additionally, for sustainability, building a scalable solution around their data and analytics was also important. And then defining the impact on their mission was another thing they really wanted to be able to quantify.


Slide 36:


So, we started by building those desired outcomes, and I’ve referenced this a couple times today. Looking at desired outcomes, really we ask three basic questions. Number one, what are we trying to achieve? After that, what high-level business objectives does this support? And then, how will we know if we’re successful? How are we going to measure this?


So for this organization, we identified five distinct desired outcomes: creating efficiency, delivering analysis, improving team dynamics, governing the data, and supporting their local stations. Each one of these desired outcomes supported one of the three business objectives we identified (audience, sustainability, or mission), or sometimes all three. And then for each one, we identified what the success metrics were. These are descriptions, but we had targets, and we actually had metrics behind them, so we could identify what those specific key metrics of success looked like. In this way we came up with a plan for them: how do we think about this data? How do we start to look at our desired outcomes and make sure that we're working with purpose, utilizing that data to get there?
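One lightweight way to keep KPIs tied to their purpose is to store each desired outcome with its metric, target, and current value. The outcome names here mirror the talk; the metrics and targets are hypothetical:

```python
desired_outcomes = [
    {"outcome": "creating efficiency",
     "metric": "analyst hours on manual reporting per week",
     "target": 10, "current": 35, "better": "lower"},
    {"outcome": "delivering analysis",
     "metric": "analyses delivered per month",
     "target": 8, "current": 2, "better": "higher"},
]

def on_track(o: dict) -> bool:
    """A KPI without a target can't answer this question at all."""
    if o["better"] == "lower":
        return o["current"] <= o["target"]
    return o["current"] >= o["target"]

behind = [o["outcome"] for o in desired_outcomes if not on_track(o)]
print(behind)  # ['creating efficiency', 'delivering analysis']
```

The point is simply that a metric plus a target plus a direction gives every KPI a purpose you can check.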


Slide 37:


So, as we went through, we started planning out a roadmap. And again, think about this not as your point A to point Z roadmap, this is a series of desired outcomes that we’re working towards. Instead of being a linear map that went point A to the end, we actually had more of a subway map.


So, the first one, creating efficiency, had a number of tasks assigned to it. The first task, E1 there, was about auditing their data: taking a look at what they had, what the metrics were, what they were using, and how accurate the data was. So we were really trying to build trust in that data, to make sure we understood what they had, and to create those efficiencies. And there were different recommendations in here; the third one, E3, was selecting a BI platform, a way to aggregate data and present it back to their stakeholders.


The next desired outcome was delivering the analysis. Here again, the first task was about evaluating the data they had, to set baselines and targets. And then we went on to automated reporting. There were a number of different recommendations that went in here, but there was some overlap as well.


When we talked about improving the team dynamics, the first task was building the basic skills: building data literacy across the organization. But it was also about, when you look up at T4 there, aligning analysts to support their local stations. So we were creating efficiencies, we were improving team dynamics, and we were actually training their teams to be consultants to their local stations.


When it came to the governance, again, there were a number of things. We talked about building your governance council and being able to manage that council. We also talked about having a playbook for governance to be able to say, ya know, how do we actually practice this? How do we make sure that we’ve got the policies and procedures in place such that this organization is able to run with this on their own?


And then when it came to supporting the local stations, again, the first one was about awareness. Creating that data literacy. And then we also had the playbook education. So we not only delivered the playbook, we made sure they understood it. L4 is about the analyst support. And then we got to benchmarks and targets.


So you see, rather than just creating a straight linear roadmap, we had this series of desired outcomes running concurrently, showing results and actually helping the organization move forward in a systematic way.


Slide 38:


So, as you're kicking back, thinking about your data strategy, I'll leave you with a couple of parting thoughts.


Slide 39:


This is about the executable data strategy. And really, it's that plan for using data with purpose. It includes understanding your data, it includes architecting your data, and it's activating that data. You can look across all of these systems, processes, and methods for doing all this, but also ask yourself the key questions. Think about: how are we going to do this? How can we think strategically? What's our vision of where we want to go, and how can we use this ecosystem of tools and technologies, people and processes, as well as the governance behind it all, to achieve these desired outcomes?


So, thinking about the whole thing collectively, that's where the executable data strategy comes into play. And as I mentioned at the beginning, these aren't monolithic multi-year journeys; we're talking about building a campaign, doing something, executing it, and seeing results. So two to three months, maybe six months: putting together that strategy, having a plan for using your data with purpose, and going through it in a systematic way to see results. That's what we're working towards here.


So, hopefully you learned a little bit more about how to use that data that's everywhere today, and got to know a little bit about how Search Discovery thinks about the executable data strategy.


So, thanks everybody, my name is John Lovett, and with that, I will pass it back to you Brian. 
