Simon Pilkowski, Boehringer Ingelheim – Analytics Is Not a Tool but a Process

January 16, 2020

You bought all the best analytics tools, but what if they aren’t delivering on your expectations?

In this webinar, Simon Pilkowski, Global Lead of Digital Performance Management at Boehringer Ingelheim, will discuss how analytics is not a tool, but a process. Specifically, he will address how to:

  • Define analytics goals with KPI hierarchies
  • Deploy and optimize your analytics implementation
  • Match your analysis to company goals


Welcome everybody, and thank you very much for attending my talk at the Virtual Analytics Summit 2020. The talk is all about how analytics is not a tool but a process. To be open, every digital project usually rests on three pillars: people, process and platform. I will not be emphasizing the people today, but I need to admit I'm a big ambassador of Avinash's 90/10 rule, which states that if you have 100 bucks, you should spend 90 on very intelligent people and 10 on a tool. Those are the only words I will say about people and tools, because it's all about processes here. And it will not be too process-heavy, so please don't be scared and stay on the talk. Hopefully we will also have some questions, and I'm happy to answer all of those.

So just to quickly introduce myself, my name is Simon Pilkowski. I'm the global lead for digital performance management within Boehringer Ingelheim Animal Health. We're a global pharmaceutical company active in both animal health and human health. My role is basically to help business stakeholders throughout the animal health business unit get access to data, understand data, and ideally, obviously, make meaningful decisions based on that data. That's my role across all digital channels at the global headquarters of Boehringer Ingelheim.

So let's kick it off: why do we all need a process to follow in order to deliver outstanding experiences for our customers? First of all, because those experiences need to be informed by data. We want to have deep insights about our customers. And when I have to explain why a process around data is so important, I usually use this picture of a relay race, also in internal presentations, where the baton is our data. We always need to know where the baton is, because we can be as fast as we want, but if we don't have the baton when we cross the finish line, we will not succeed.

And people very often ask me something like this: why can't I just use [ ___ ]? You can replace the brackets with any tool's name. As I stated before, we will be talking about the process here. The problem with a tool is that a tool will not make up for a lack of capabilities in people, nor for a lack of processes. I tend to say a fool with a tool is still a fool, but a fool with a process at least knows what to do next. And I don't want to say that everyone who's not mature in analytics is a fool, but if you have less maturity when it comes to analytics, at least the process will guide you and help you on your particular journey. So let's kick it off and dive a little bit deeper.

10 steps to analytics success — that's the process I want to show you. And very importantly for me, it's not dogmatic, it's food for thought. This is how we're establishing it, but I'm not saying it's correct in all its detail for every organization, because no organization is a copy-paste of another. You have individual stakeholders, individual processes, individual platforms. So hopefully you can pull some information from this talk and see how we're doing it, how I try to live and socialize analytics as a process. Basically it's threefold: we have a planning phase for analytics, an execution phase, and a utilization phase. That's the only process slide you will see today. First, the planning phase — and I need to admit this slide will come up twice, so you will see two slides, but it's twice the same slide. When it comes to planning, we need to define our goals, set up KPI or PI hierarchies, an analytics framework and a tagging plan, and we will move through the individual steps one after the other. From the tagging plan we move into the execution phase, which is more about the technical analytics implementation, quality assurance and performance reporting. And then finally the fun begins: it's really about analysis, insights distribution and insights-to-action conversion. So let's go through the individual steps, and I will elaborate on all of them a little so you can get an understanding of how the process can be brought to life.

First of all, define your goals. I think we all know that if we want to achieve something, we first need to know what we actually want to achieve and what road will bring us to that final destination. And it's not only about defining the goals of your analytics practice; it's also about helping your business stakeholders — and that's actually even more important — to clearly define what they want to achieve.

So I don't know if it's repetitive for everyone at this conference — hopefully it is — but for those of you who are not familiar with it, let's quickly talk about SMART target setting, because this is critical to success if you want to make your goals tangible enough to really act upon the results. First: specific. Your goal needs to be really specific, and I will give you an example afterwards. Second — and this is, I think, the most important piece for an analytics community — it needs to be measurable, because if we can't measure something, we cannot act upon it. Third, it has to be attractive: we need to make it really attractive for the organization, otherwise why are we doing it? Next, it has to be realistic: if we set a target we will never achieve, why are we setting it at all? And finally it needs a time horizon — you have to have a deadline for when you want to achieve it. To give you an example — maybe not a digital marketing example — let's say you have a New Year's resolution to lose 10 pounds. You could say, okay, I want to lose 10 pounds in 2020, and that's how your target is defined. Hmm. I don't know if this is the best way — or rather, I know it's not the best way — to achieve what you want. You need to be a little more specific, realistic, attractive, measurable and timely. So what you should be doing is, for example, saying: I want to lose 10 pounds by the end of 2020 by exercising twice a week and going to the gym after work. That is very specific. You can measure whether you lost the pounds, whether you went to the gym twice a week.

Is it attractive? Yes, because you want to lose the weight. Is it realistic — 10 pounds within one year? I would say yes. And it's very timely. If you want to make it even more tangible, you might also break it down into quarters or half-years to see whether you're still on track with your SMART target. So that's the definition of goals in a nutshell. As an analyst, you very often have to support your business stakeholders in defining their goals.
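To make the quarterly break-down idea concrete, here is a minimal sketch of a SMART-style goal tracker. The class and its names are illustrative, not anything from the talk; it simply spreads a measurable target evenly across check-in periods so you can tell whether you are on track.

```python
from dataclasses import dataclass

@dataclass
class SmartGoal:
    description: str
    target: float   # total amount to achieve (measurable)
    periods: int    # number of check-in periods, e.g. 4 quarters (timely)

    def milestones(self):
        """Spread the target evenly across the check-in periods."""
        step = self.target / self.periods
        return [round(step * (i + 1), 2) for i in range(self.periods)]

    def on_track(self, period, achieved):
        """Has the cumulative result reached this period's milestone?"""
        return achieved >= self.milestones()[period - 1]

goal = SmartGoal("Lose 10 pounds by end of 2020", target=10, periods=4)
print(goal.milestones())                # [2.5, 5.0, 7.5, 10.0]
print(goal.on_track(2, achieved=4.0))   # False: behind the 5.0-pound Q2 milestone
```

The same structure works for any measurable business goal — swap pounds for leads, sessions, or revenue.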

The next stage is KPI hierarchies. What you can see on my screen is a helicopter cockpit, which is a little overwhelming for me, as I'm not a pilot. From my perspective there are too many gauges, too much color. What I want to emphasize here is, first of all, that the K in KPIs — key performance indicators — is in brackets, because not every measure is necessarily a key performance indicator for your organization. If you want to learn more about this, there's an interesting blog post by Tim Wilson titled something like "The K in KPI doesn't stand for 1000." On the other hand, there's also the human side of KPIs, and for us, perception is reality. So sometimes, especially in a large-scale organization, it may even be worthwhile to allow for a few more KPIs — although I'm no fan of having 1000 KPIs, because then which one do you consider the key KPI, the key key performance indicator?

So what I'm usually looking at is KPI hierarchies, or PI hierarchies. Basically you go from the functional, executional level up to the very strategic level. The lower you go on the pyramid, from the C-suite down to the individual functional levels, the more performance indicators you will have, obviously, because individual functions have different performance indicators. But also important here: it's not necessarily only about performance indicators, because, as we said, perception is reality. Something might be really meaningful for my particular function — a KPI for me — but not a KPI for the C-suite. Let me give you an example. Take an eCommerce business, and say you're part of the IT department. One of your KPIs might be the percentage of website uptime. This is something very crucial, there is an SLA for the IT department, and it is a key performance indicator for this function. But for the organization as a whole, most probably not: at the next shareholder meeting, the C-suite won't stand in front of the shareholders and say this has been a very successful fiscal year because the website was up 99% of the time. Instead, uptime feeds into a more meaningful KPI for the business. Another example: you're running an SEO team within the same organization, and a performance indicator for your function might be organic search visibility — how visible you are in organic rankings. But this too will most probably not end up in front of the C-suite, because the C-suite usually looks at financial figures: market share, revenue growth, net profit margin. Nevertheless, all those performance indicators — on a functional level maybe even considered KPIs — feed up into those overarching ones.
And as you can see, the C-level ones are color-coded differently, because all of the individual functions should know what the C-level wants to achieve. So that's the second step: providing everyone with appropriate success measures. Whether you call them performance indicators or key performance indicators is more of a human communication aspect within your organization — whether you want people to feel more valued or not.
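The roll-up described above can be sketched as a simple mapping from C-suite KPIs down to the functional performance indicators that feed them. All names here are illustrative examples from the talk's eCommerce scenario, not an actual KPI catalogue.

```python
# Hypothetical hierarchy: each functional performance indicator feeds
# one overarching C-suite KPI. Names are illustrative only.
KPI_HIERARCHY = {
    "revenue growth": {                 # C-suite KPI
        "IT": ["website uptime %"],     # functional PIs below it
        "SEO": ["organic search visibility"],
        "eCommerce": ["conversion rate", "average order value"],
    },
    "net profit margin": {
        "Marketing": ["cost per acquisition"],
    },
}

def functions_feeding(c_suite_kpi):
    """Which functions contribute performance indicators to a C-suite KPI?"""
    return sorted(KPI_HIERARCHY.get(c_suite_kpi, {}))

print(functions_feeding("revenue growth"))  # ['IT', 'SEO', 'eCommerce']
```

The point of keeping the hierarchy explicit is that every functional team can trace its own indicator up to the C-level goal it supports.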

The next stage is putting everything into a framework, and I don't want to say which framework is the best. I've just put one out as an example, from John London: a brand funnel framework that organizes all your analytical efforts into defined buckets within a defined structure. You can see here, for example, you have a brand funnel, you have the individual buying behaviors of your customers, you have the internal digital strategy, your measurement objectives for each pillar of the customer journey, where the data is coming from — you might also want to add at which frequency you can get the data — and then what the appropriate KPIs for those individual pillars might be. Let's not elaborate on the content. The takeaway from this analytics framework approach is that you as an organization need to decide how you want to structure your analytics practice. Because once you have it set up like this, every time someone approaches you, you can basically open your guidebook and say: no problem, let me look it up — here on page 494 — and I will pull out what you need and provide it to you after your campaign, during your campaign, for your optimization efforts. That's really critical. That's your Holy Bible, I would say.

The next thing is looking into tagging, and that's the final phase of the planning, I would say. A lot of presentations about tagging start with this overview from Chief Martec, the marketing technology landscape. It's from April 2018, so it's not hot off the press, but it's still relevant. Basically, what I want you to understand is that it's a fairly complex ecosystem, and you need to choose wisely which tags you want to leverage, because in the end, if you use too many, they can slow down your website. We all know slow websites have lower conversion rates, and lower conversion rates can mean, in eCommerce terms, lower sales. So we really need to understand which of those marketing technologies we want to leverage on our website. Plus, we should look at what we've done before: what are our goals, which performance indicators do we have, what does our framework say? Because this will also help us choose and pick what should be part of our tagging plan in the future.

And one example of a good tagging plan — I actually took it from an ObservePoint ebook because I like it very much — is this one. If you tag your websites, you want to know, first of all, why you are actually doing it: what's the use case? Then you want to provide some information about when you are firing those tags and where you are firing them. And for certain tags, as you see here, you want to provide even more information, like which variables are being set. In this table you see some example variables for Adobe Analytics, but this would work for Google Analytics, for Matomo, for every conversion pixel you're injecting. One column I added to the ebook's table is the business owner, because this is critical for me. You also need to understand who the business owner of each individual tag is. It doesn't mean they own the code, but they own the request to inject something that will inform them afterwards. Only if you know this can you avoid ending up with a huge garbage collection of tags. If you inject everything stakeholders ask you for through the tag management system — and I assume the majority of you are using one nowadays — you will quickly be overwhelmed by the sheer number of tags on your websites, and, as we already said, too many tags might leave you with a slow website, and slow websites might end up with low conversions. So what you really need to establish is lifecycle management for your tags. And I'm not talking about a tool; it's really a process. Do you revisit the tags on your websites every quarter? Do you approach the business stakeholders and ask: is this still needed? Are we still using the data? If not, can we clean it up?
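A tagging plan like the one described can be kept as structured data rather than a spreadsheet, which makes the quarterly lifecycle review trivial to automate. This is a minimal sketch under my own assumptions — the field names, example tags and owners are hypothetical, not from the ObservePoint ebook.

```python
from dataclasses import dataclass, field

@dataclass
class TagPlanEntry:
    tag_name: str
    use_case: str                       # why the tag exists
    fires_on: str                       # where/when the tag fires
    variables: list = field(default_factory=list)  # e.g. Adobe eVars/props
    business_owner: str = "Unknown"     # who requested the tag and uses its data
    still_needed: bool = True           # updated in the quarterly review

plan = [
    TagPlanEntry("Adobe Analytics pageview", "core traffic measurement",
                 "all pages", ["eVar1: page language"], "Analytics team"),
    TagPlanEntry("Old retargeting pixel", "2018 campaign",
                 "all pages", [], still_needed=False),
]

# Quarterly lifecycle review: flag tags with no owner or no use for cleanup.
to_remove = [t.tag_name for t in plan
             if not t.still_needed or t.business_owner == "Unknown"]
print(to_remove)  # ['Old retargeting pixel']
```

The `business_owner` field is the key addition: a tag nobody owns is a tag nobody will miss, and a candidate for removal before it slows the site down.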

So that's the end of the planning phase. Now let's move into the execution phase, which starts with the technical implementation — a very, very big topic we could talk about for ages, but let's quickly jump in. There are two things I want to highlight here, based on my learnings from large-scale companies. First, technical analytics implementation is a specialty, so let's let specialists do the job, and let's not assume that just because we have a tag management system and don't need to ask IT for everything, we can do everything on our own. Yes, it's convenient: you copy a conversion pixel from Facebook or Google into your tag manager and deploy it. But do you know it works in every browser? Do you know all the conditions that could occur? Do you know the tag loading logic? Most probably not, and that's why I emphasize that this is a specialty — web analytics implementers are the people supposed to do what we're talking about here. The second topic I want to emphasize when it comes to implementation, especially large-scale implementations, is the question of centralization versus decentralization. What I've seen is that decentralization is a big problem. You can have the best documentation, a huge, very well-described blueprint. The problem is that even if you work with a lot of different experts, they all interpret that blueprint and documentation differently. So you end up with different implementations, which is challenging when you look at your data. The recommendation is really: build a center of excellence, put the experts into it, and make them deliver centralized web analytics implementations, because this is the key to high data quality — which is actually the next stage.

I wanted to cite a survey by TMMData and the Digital Analytics Association from 2017: 44% of leaders said that analytics teams spend more than half their time accessing and preparing data. That's a shame, right? That's a lot of time wasted, and it comes down exactly to the topics I mentioned two minutes ago. So why is it so important that we assure quality? I want to explain it a little visually, and let's take Lego as an example — Lego is a very good example for talking about data quality. Usually, when you buy a new package of Lego, you look at the box, there's a picture of this beautiful car, and all the Lego bricks are sorted within the box so that within two to five minutes, depending on the complexity, you can build the car and play with it with your children.

The problem is, every time I build something like this with my son, after playing we throw it all into one huge bag and put it into the corner of the room. And now it's a little more challenging if we want to build this beautiful car again next time. And this is exactly the problem. Looking at it not from a Lego perspective but from a campaigning perspective, it might look like this: here you see a word cloud of UTM medium values from a lot of different websites. What you can see already is that there is no defined set of values for UTM mediums. For those of you who are not familiar with them, UTM mediums are basically tracking parameters for your campaigns, leveraged mainly by Google Analytics, but other tools can leverage them as well. And here you see — yes, there is a little bit of a pattern, certain terminologies tend to be reused, but there is a lot of noise in the data. That's a huge mess when it comes to data quality, and no tool will solve it for you. You really need to have a defined quality assurance process and a defined list of allowed values. Otherwise you might get a beautiful word cloud, but in the end it's not beautiful when it comes to analytics.
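A defined list of allowed values can be enforced with a few lines of code at the point where campaign links are created or where data is ingested. This is a sketch under my own assumptions — the allowed set and the synonym map are examples, not an official taxonomy.

```python
# Hypothetical governance list: the only utm_medium values we accept.
ALLOWED_MEDIUMS = {"email", "cpc", "social", "display", "referral", "organic"}

# Common misspellings and synonyms, mapped onto the allowed set.
NORMALIZE = {"e-mail": "email", "paid-search": "cpc", "banner": "display"}

def clean_utm_medium(raw):
    """Return a valid utm_medium, or None if the value can't be mapped."""
    value = raw.strip().lower()           # "E-Mail " -> "e-mail"
    value = NORMALIZE.get(value, value)   # "e-mail"  -> "email"
    return value if value in ALLOWED_MEDIUMS else None

print(clean_utm_medium("E-Mail "))      # email
print(clean_utm_medium("Banner"))       # display
print(clean_utm_medium("newsletter"))   # None -> needs a governance decision
```

Anything that maps to `None` goes back to the campaign owner instead of into the data, which is exactly the quality assurance process no tool will run for you.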

The next stage — and this is very close to my heart, something I get really passionate about — is performance reporting. Nowadays a lot of people talk about data democratization, and it sounds very reasonable: everyone should be able to access data, play with data, use data, derive insights. But the problem is data literacy. Yes, we as analysts love data, we do our daily business with data, we wake up with data. But how about stakeholders who don't really care about data? You really need to understand your stakeholders and the maturity stage they are at, because if you provide data to your customers — your internal business stakeholders — it might not work out, because they don't understand you. A simple example: I don't know who has already had the pleasure of explaining to someone what a visit is (or, for Google Analytics users, a session), or what a bounce rate is, or what time on page is. Those are really simple measures, but if you still sometimes need to explain them, don't expect people to be capable of deriving a lot of insights from your data on their own.

The next point — a quote I've taken from another presentation, but I liked it so much, and it's cited very often — is that a dashboard is like a joke: if you have to explain it, it's not good. And it's the same with maturity, right? You can have more complex dashboards as the maturity of your stakeholders gets higher, but if the maturity is really low, you should adjust your dashboarding accordingly. The last point when it comes to performance reporting is: automate as much as possible. My rule of thumb is that if I do something twice, I already start thinking about automation. We don't want to be reporting squirrels, we don't want to be data monkeys, right? So if you can automate reporting, go and do it. And this is how I usually explain to my business stakeholders why the performance reporting looks the way it does: I try to create a playground for them where they can access their data, but within certain boundaries, so they don't harm themselves or the business by drawing wrong conclusions from the data. I prepare dimensions, I prepare metrics, and they can play around with all of this. As their maturity increases, they can then move into more sophisticated analysis on their own.

And that's actually the next stage: from reporting to analysis. For an analytics community, I thought, okay, what do you tell them about analysis? And this is what you see on my screen. Doesn't it look beautiful? But it makes no sense, because it doesn't deliver any insight. This was actually the result of me playing around with Tableau, trying to understand the performance of over 400 websites. I won't go through what the bubbles and the colors mean, but in the end I thought: huh, beautiful, but it doesn't make sense. That's the rabbit hole you might fall into sometimes as an analyst. Because — come on, we're analysts, analysis is part of our title, right? We want to do analysis and we're curious. But before you start playing with the data just out of curiosity, you should revisit the process you started with and make sure you will still be answering questions that go beyond what has already been answered. Can you answer more than the reporting already did? Is the data there? And secondly, you should be BFFs with one of your business stakeholders, because if the business doesn't have the question, why should you answer it? Yes, I know, sometimes you also need to spark some ideas to make people raise the question in the first place. But at least bring them on the journey and make them feel accountable — the analysis should be part of their demand, rather than just you doing some fancy stuff as an analyst.

But let's assume you did the analysis and you have a BFF. Now the question is: how do you deliver your insights? Here I just want to emphasize that email is maybe not the best channel. I don't want to know how many reports and emails are never read in companies around the world — imagine how much money you could save by just stopping those efforts. If you have insights you want to distribute within your organization, make them a vital part of your communication efforts: invite people to meetings and talk about the data in a cross-functional setup. You can be the best analyst, but you may not have the subject matter expertise your colleagues have around your business. So really make the data part of the conversation, rather than sending out a report and hoping that someone reads it and actually understands what you're telling them.

But let's assume you invited all of them, you made it happen, everyone likes your data, they understand what you're telling them. Now the question is — and this is a very industry-, company- and people-specific topic — how could you as an analyst actually set targets for yourself? Maybe you won't tell your senior management yet, but how can you ensure that insights lead to action? You are providing insights, and the question is: did an action result from them, or was it just you proposing something? Coming from a digital marketing background, the earlier stages within the process might be micro conversions — but this slide is your macro conversion. If you are changing behavior, if you are driving action, you've been successful. So something you can think about, without even talking to your management, is how to start measuring your success based on insights actually implemented after you recommended them. Because only then are we really driving change as an analytics community. All the things I talked about before — if no action results from the whole process, why did we do it? And over time you could even be challenged by senior management, who would ask: why are we spending so much money if it's not driving change for us?
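Measuring insights-to-action as a conversion rate, as suggested above, takes only a few lines. The numbers below are a hypothetical quarter, not figures from the talk.

```python
# Treat each delivered insight as a micro conversion and each implemented
# recommendation as the macro conversion; the ratio is the analyst's
# success metric.
def insights_to_action_rate(insights_delivered, actions_taken):
    """Share of delivered insights that resulted in an implemented action."""
    if insights_delivered == 0:
        return 0.0
    return actions_taken / insights_delivered

# Hypothetical quarter: 20 insights delivered, 7 led to a concrete change.
rate = insights_to_action_rate(20, 7)
print(f"{rate:.0%}")  # 35%
```

Tracked quarter over quarter, a rising rate is evidence for senior management that the analytics process is driving change rather than producing reports nobody acts on.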

So let's wrap it up and look at the whole process once again, quickly. You want to plan your analytics: you need clearly defined goals, and more importantly, you need to help your business stakeholders define those goals. Set up performance and key performance indicators for the individual hierarchies of your organization. And sometimes, as I said, you should maybe allow for a few more KPIs, simply to keep things workable and to make people feel a little more comfortable that their performance is actually a key performance, even if only on their functional level. Then add some additional dimensions to your goals and KPIs — customer, for example, or tool, or the frequency of the data you're getting — and put it all into a framework that's easily accessible for you and everyone you're working with. Then, slightly moving towards execution: which tags do you want to implement, where and how, and who owns them in the end? Then you move into the execution itself: implement your analytics, do it with experts, and don't decentralize it — that's my elevator pitch for this topic. Quality assurance: we could spend a whole week's presentation on data quality. Data quality is key to success; otherwise we'd all just be collecting bricks rather than playing with the car. Performance reporting: know your customers and know their maturity. And now you're utilizing the data. If you want to analyze data, do it with your business stakeholders. They don't need to sit next to you when you open up Python or Adobe Analytics or Google Analytics, but they should raise the question, or at least you should make them feel accountable for the questions. And once you have answers, distribute them appropriately and sell them to the whole team. Don't just send out a report — if you talk about the data and explain the data, people perceive you as an expert.

And the last thing is insights to action. I said it last because it's basically our macro conversion: if we drive change, we're successful as analysts, as an analytics community. If not, all the process before is basically just a nice-to-have exercise for ourselves.

So thank you very much for staying for this talk; I hope you pulled out some information you can leverage for yourself. As I said before, a fool with a tool is still a fool, so let's not be foolish, and let's really focus on processes too. If you want to reach out to me, I'm very active on LinkedIn, and you can also reach me via email. One caveat: I'm very unresponsive when it comes to sales pitches, so if you send me a sales pitch message or email, I usually tend to ignore it. Thank you very much.
