A Framework for Actionable Insights

June 11, 2020

Join Matt Crupe, Senior Technical Consultant for Adobe, Chris O'Neill, Solutions Engineer for ObservePoint, and Cameron Cowan, Senior Director of Product Marketing for Strala by ObservePoint, to learn the steps to:

  • Standardize data collection for a foundation of accurate insights
  • Validate your martech deployment to collect complete, accurate data
  • Unify data from the end-to-end customer journey for a holistic customer view
  • Attribute success correctly across all marketing efforts
  • Increase revenue with actionable insights


Aunica Vilorio: (00:05)

Hi everyone, and welcome to today's webinar, A Framework for Actionable Insights. We're excited to have you here and we hope you enjoy the event. Throughout the presentation, please feel free to enter any questions you have in the chat box on the right-hand side. We're going to be doing a 10-minute Q&A at the end, so we want to make sure we have all of your questions ready to answer at the end of this presentation. To do a quick introduction of our speakers, we have Matt Crupe, who is a digital marketer with a background in development who uses strategy to tackle complex technical issues. Prior to working at Adobe, he worked for Numeric Analytics, where he overcame the challenges of tracking single-page applications by developing an Adobe Analytics module for the AngularJS framework.

Aunica Vilorio: (00:44)

Crupe has a deep understanding of Adobe products and specializes in third-party API integration and video tracking for Adobe and Google Analytics. Chris O'Neill is a full-stack web developer who has been in the analytics space for the past five years as a Solutions Architect at ObservePoint. Chris helps large enterprises set up automated testing of their analytics scripts with ObservePoint so they can verify custom website tracking is always up and running. Cameron Cowan is the Director of Product Strategy and Marketing at ObservePoint and a veteran of the marketing analytics, digital advertising, and enterprise software industries. He joined the ObservePoint family via the recent Strala acquisition and plays an active role in product management, technical marketing, and GTM execution. Prior to his time at Strala, Cameron spent 13 years working for Adobe through the Omniture acquisition and gained experience in account management, consulting, and technical sales before establishing himself as a leader in product management, technical marketing, and business strategy. His career has included living overseas on multiple occasions and collaborating with marketers and technologists on four continents. So with that, we'll turn the time over to Cameron.

Cameron Cowan: (01:47)

Thanks, Aunica. Happy to be with everyone today, excited to go through this framework and talk a little bit about how both ObservePoint and Adobe are working together to help our customers achieve this goal of getting to actionable insights and really improving your business. For me, this is an awesome opportunity because I started my career back at Omniture and worked up through Adobe after that acquisition. So seeing these two businesses come together is kind of a nice round trip for my entire career, and I've gotten to see things from a couple of different sides. And I'm glad that we'll be working with both Matt and Chris today. Now, as we start off, I wanted to quickly give you an overview of the framework we're going to be discussing. We're going to talk through a number of different things, start at a high level, and really boil it down to some very specific takeaways that hopefully the group can walk away with today.

Cameron Cowan: (02:35)

We're going to talk about planning and building out your measurement framework. We're going to be talking about how we implement that and what the best practices are for getting things up and running for the first time, but also how we learn and iterate on that process to activate that data, to get us to actionable insights—not just insights, but things that we can actually do to improve our businesses. And throughout that process, we're going to be talking about testing and validation, about automation, and how we update that original plan that we established to make sure we're continually nimble on the ground. Now, as I started thinking about the plan cycle, one of the things that I focus on a lot here at ObservePoint is the concept of performance measurement. And we intentionally leave that term high level.

Cameron Cowan: (03:23)

But no matter what type of performance you're measuring, ultimately the reason you do measurement isn't because it's fun or you want the extra work; it's because you want to know what's working, what isn't, and why. And so what we're going to be focusing on a bit today is helping people understand how to take the information gathered through a measurement program and know where you should and shouldn't be allocating your investments. And investments can be dollars. It can also be technology and people and resources, and really anything that helps move you toward your end goals. Ultimately, when you think about attribution on the tail end of measurement, it's the art and science of giving credit where credit is due and knowing how to make those allocations of investments. And so, as we were thinking through this and preparing for the webinar today, I started thinking about, well, what other areas in life—specifically for me—do I see performance measurement?

Cameron Cowan: (04:13)

And the first thing I remember thinking about very specifically was just sports in general. I've actually had a goal throughout the last 10 to 15 years of my life to go to a new stadium or a new venue for professional sports every year when possible. Last year, I went to the new Yankee Stadium. The year before that, I enjoyed a 49ers versus Giants game out in the Bay Area. And part of the reason I love sports is because there is a ton of measurement involved. There are a lot of very specific metrics, and there are also end goals. And yet when you watch a football game or a baseball game or a basketball contest, you don't necessarily care who has the most rebounds or who's gained the most passing yards.

Cameron Cowan: (04:57)

There's still an ultimate goal of winning the game. And for those organizations, the real objective is to win a championship, to be the best in their field. Now that can be measured in a lot of different ways, whether that's on-base percentage or free-throw shooting percentage or the number of rushing yards you get over a season. But ultimately you don't win anything by being, you know, the person with the most stolen bases in a season. You win it by being able to hoist that championship trophy. And so part of what we're thinking about as we move through the rest of our webinar today is how we focus on those end goals—on those true business objectives—and why we're doing this, while using those intermediary and lower-level metrics to get us to where we want to be. Another realm I know I've certainly been captivated by over the last several weeks—and a number of our team members at ObservePoint have really pulled me into this—is everything that's going on around SpaceX and their recent launches and tests.

Cameron Cowan: (05:55)

And when you think about that, the same thing is true. There's a lot of information and a lot of data and a lot of testing that goes into this process. A lot of metrics, both prelaunch and during the process, as well as after the astronauts get to space. A lot of safety checks, a lot of technology updates, things that are streaming in real time, but the ultimate goal isn't just to get that thing off the ground or to make sure a certain system is running the way it should be. The ultimate goal is to get these two gentlemen into space and docked with the International Space Station. Ultimately, SpaceX wanted to become the first private company to send astronauts into space, and that joint venture between the private and public sector was a big part of pushing that forward. And when you think even further about what the ultimate business goal is there—really, in the founding documents of SpaceX, they said they want to start a colony on Mars. And so every one of these milestones and events that lead up, from a success perspective, really are building toward that ultimate goal.

Cameron Cowan: (06:55)

So now, bringing this to a business realm: when I think about the top questions that ObservePoint—and specifically the Strala platform that ObservePoint brought into the business through acquisition earlier this year—helps answer, if I boil it all down, it's all about where and how we invest to grow customers and to improve ROI. It's always that healthy tension between growth and efficiency. And as Peter Drucker was fond of saying, if you can't measure it, you can't improve it. So a big part of putting a performance measurement plan in place is being able to understand, at a high level, how I hit that ultimate business goal, whatever it is for my organization. And yet even with a lot of the great things that have been happening—even going back to when I joined Omniture back in 2005—a lot of technologies have existed and continue to develop in a good way.

Cameron Cowan: (07:46)

And yet we found through research recently done by Strala that less than 11% of brands feel confident in the accuracy of their performance measurement program and what they're doing to understand whether they are hitting their business objectives or not. And so part of the reason we put together this webinar, and want to share some of these principles with you, is to help get you out of that 89% and into the 11%—hopefully growing that so more and more people are feeling the confidence they need to be able to take action on the data they're collecting. Now, when I think about the primary problems or challenges that are hindering us from being more confident and having the trust in our data to take good and meaningful action, it really comes down to a handful of things that I've seen—like I said, not only in my time here at ObservePoint and at Strala, but even going back to my days with Omniture and Adobe. It's that we have siloed data—that often a lot of different channels are doing their own thing and aren't measuring in a unified way.

Cameron Cowan: (08:42)

We don't have granular enough information. We're maybe doing high-level measurement at a channel or a publisher or a campaign level, but we're not actually getting the deeper insights that we need. And certainly there's a challenge with incomplete and inconsistent data—how it's implemented and whether that implementation stays live and grows over time. So those are all things that we want to discuss today, to understand how we can give you the framework you need in order to better deliver and deploy meaningful insights. The last thing I want to cover here is this concept of the historical data life cycle. I've seen this throughout my 15 years in the industry, but also throughout just about every organization I've worked with. It's the idea that in order to get our data unified and collected and get to a quick time to value, we first start with data collection—help me understand all the different places I can and should be collecting that data, from my website and what's happening in my web analytics instance, to my mobile measurement, to even offline interactions—things that I'm storing in an enterprise CRM platform or through a point-of-sale system or through a marketing automation system. All of these different places where data comes in and we understand how our customers are interacting with us.

Cameron Cowan: (10:01)

And then from there we go through usually a very time and labor and cost intensive exercise of cleansing and normalizing and trying to combine this data. So hopefully we can get it all together to visualize and find some sort of insights. And the core thesis of what the Strala by ObservePoint platform comes to the table with is this idea that we need to be able to predefine and to unify that data before it ever comes in—to be able to set data standards on metadata and what is collected and how it's collected, and do all of those different systems that I just mentioned define those things in the same unified way? And I think as we work with our partners over at Adobe, a lot of this starts to really become clear as they're building out their platform and trying to help customers do this exact same thing.

Cameron Cowan: (10:48)

And I think there's some great alignment on how we get rid of a lot of that wasted time and resources spent cleansing, normalizing, and combining data, and get people more immediately to those insights they want and need to move their businesses forward. The last thing I'll mention, then, is that, as I said, there are a lot of different systems involved. I think the average enterprise business that we work with has somewhere around 37 different technologies in their tech stack. A lot of those come from, once again, the Adobe family and how they're bringing those products together. But really it spans channels, it spans interactions. And we want to be able to bring this all together in a single system of record. Now, in order to do that, there have to be some really good standards.

Cameron Cowan: (11:30)

There has to be a clear and defined way to implement and move forward. And so today we're going to focus from here on out very specifically on the web analytics side of things. We recognize that the majority of our audience, both for ObservePoint and for Adobe, is focused and has an investment in this area. And so from here, what I want to do is bring on my friend Matt. Actually, Matt, I'll bring you in, and as you unmute and add to the conversation, one of the other things I wanted to get your perspective on as we transition over is this idea of business objectives. You know, I had talked about this at a high level, from a sports angle and from understanding the space program through SpaceX, but really a lot of the planning before you ever put code on page or engage with a tag management system is understanding what those highest-level business objectives are, and from there building out the requirements that help you understand what will get you to those objectives—really the data elements, what goes into which specific eVars, and how it's collected on what pages. That's all a function that should, and can, trickle down from those business objectives through business requirements. But as you take the reins here, I'd love to hear your perspective on how you build this kind of framework into your web analytics implementation.

Matt Crupe: (12:52)

Thank you, Cameron. Good morning, everyone. I'm Matt Crupe, a senior consultant with Adobe. And so yeah, as Cameron was stating, once you've got an idea of what your business objectives are, I think one of the next crucial steps in planning out your implementation is documenting your individual business requirements. So take a look at what those overall objectives are, and then break those down into more specific requirements for how you can achieve those objectives. Examples of requirements can be things like being able to report and analyze on a customer's journey from point A to the final step of whatever you consider to be a conversion. It could be something as granular and detailed as capturing the time and day of each interaction the customer makes on the website.

Matt Crupe: (14:05)

And so once you start defining these requirements, it's important to then document them and create what we at Adobe typically refer to as the BRD, or business requirements document, and we'll categorize each requirement. So we'll create a set of categories, and each requirement will typically fall under one of those categories and have a specific ID that we reference. Each requirement having an ID makes it easy to tie individual pieces of the implementation back to it later on. As we get further into this discussion, when we start talking about some of the other documents and pieces of a successful implementation, we can associate things back to a specific requirement. It's important to have these requirements documented and defined because it helps us not lose sight of what the overall goal is and how we have mapped things out to achieve that goal. So instead of just jumping into your tag manager or your implementation and starting to tag things, we want to have that holistic view of what our goals are and how each of the things we're doing helps us achieve those goals and satisfy those requirements.
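
To make the idea of categorized, ID'd requirements concrete, here is a minimal sketch of how one BRD entry might be modeled. The field names and the sample requirement are hypothetical illustrations, not an Adobe template.

```typescript
// Hypothetical sketch of a business requirements document (BRD) entry.
// Field names and the sample requirement are illustrative only.
interface BusinessRequirement {
  id: string;          // e.g. "CONV-03" — referenced later by the SDR and tech spec
  category: string;    // grouping such as "Conversion", "Content", "Campaign"
  description: string; // what the business needs to report on or analyze
  objective: string;   // the higher-level business objective it supports
}

const requirements: BusinessRequirement[] = [
  {
    id: "CONV-03",
    category: "Conversion",
    description: "Report on the customer journey from first touch to completed checkout",
    objective: "Increase online revenue",
  },
];
```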

Matt Crupe: (15:38)

And so from there we'll move on to the implement phase. Once we've got all of our requirements defined and documented, we'll move on to actually beginning to implement things. There are typically three steps to a successful implementation. The first step is a complement to your business requirements document, and we call it a solution design reference document. We'll typically pair this up in the same document as our business requirements. The SDR is an even more granular document than your BRD, and it outlines the individual variables that are going to be used to satisfy your business requirements. So this will outline all of the individual eVars, props, and events that are going to be used in your Adobe Analytics implementation. It's also going to outline the individual configuration settings for those variables.

Matt Crupe: (16:58)

And so if you see on the screen now, this is a sample SDR, or solution design reference, and it will let you know, for each individual variable and event, things like what the expiration setting is for this particular variable and the attribution. Is this merchandising—you know, did we have merchandising turned on for this specific eVar—things like that. So this document is going to give you a snapshot and be a quick reference for you to go back and see exactly how each of your variables is configured and which variables you're using to satisfy your requirements. The very left-hand column is a column where we pair each variable up with the specific requirement that it is used to satisfy. So you can quickly go back to your business requirements and see, what is this variable being used for? What does this relate to? And so now you've got clearly documented requirements, as well as a clear document outlining all of the variables that are going to be used to satisfy those requirements.
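
As a rough illustration of the kind of row a solution design reference captures, the sketch below pairs an Adobe Analytics variable with its configuration settings and the requirement ID it satisfies. The column names and values are assumptions for illustration; real SDRs vary by organization.

```typescript
// Hypothetical sketch of a solution design reference (SDR) row for Adobe Analytics.
// Settings shown (expiration, allocation, merchandising) mirror the kinds of
// configuration options discussed above; values are illustrative only.
interface SdrVariable {
  requirementId: string;                       // ties back to the BRD, e.g. "CONV-03"
  variable: string;                            // eVar, prop, or event name, e.g. "eVar12"
  friendlyName: string;                        // report name shown in Adobe Analytics
  expiration: "Hit" | "Visit" | "Purchase";    // when the value stops persisting
  allocation: "Most Recent" | "Original Value" | "Linear";
  merchandising: boolean;
}

const sdr: SdrVariable[] = [
  {
    requirementId: "CONV-03",
    variable: "eVar12",
    friendlyName: "Internal Campaign",
    expiration: "Visit",
    allocation: "Most Recent",
    merchandising: false,
  },
];
```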

Matt Crupe: (18:28)

And then the next step is creating a technical specification. The tech spec builds upon your business requirements and your solution design reference, and it details how every part of your implementation is going to be configured, from your data layer all the way through to Adobe Launch, which is Adobe's next generation of tag management capabilities. This is the tool that is going to be used to deploy Adobe Analytics, Target, and Audience Manager—all of the solutions that are going to be used within your implementation will be deployed through Launch. The tech spec goes through, page by page, and outlines how everything will be configured. So this is something that could be handed off to a development team that could then take it and implement the data layer on the client side as well as configure the tag manager. This is also going to be useful for you as your implementation grows and matures. You can go back and continue to reference this document. Any time you have to troubleshoot anything, this document can come in handy because you can quickly see how things were originally configured and what the expected results are, and it can help you in the validation phase as well.
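
For example, a tech spec will often define a page-level data layer object that developers populate before the tag manager evaluates it. The structure and property names below are a hypothetical sketch, not a prescribed Adobe schema.

```typescript
// Hypothetical page-level data layer, as a tech spec might define it for developers.
// Property names are illustrative; real specs define them page by page.
interface PageDataLayer {
  page: {
    name: string;        // e.g. "home" or "checkout:payment"
    section: string;     // site section used for content reporting
    language: string;
  };
  user: {
    loginState: "guest" | "authenticated";
  };
}

const digitalData: PageDataLayer = {
  page: { name: "checkout:payment", section: "checkout", language: "en-US" },
  user: { loginState: "authenticated" },
};

// Developers would expose this on the page before the tag manager rules evaluate it.
(window as any).digitalData = digitalData;
```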

Matt Crupe: (20:08)

And then the final step in the implement phase is to configure tools. We touched on this a little bit on the last slide with Adobe Launch, but for this webinar we are focusing on your Adobe Analytics, or digital analytics, implementation. Configuring the tools consists of your report suites—going in and creating and configuring your report suites and the individual variables and events that are going to make up each report suite; defining and configuring your marketing channels, which will be used to categorize traffic and give you insight into all the different channels your traffic is coming from; processing rules, which will be used in conjunction with your marketing channels or to set individual variables and trigger events; as well as classifications on your data to help break out and provide more detail on the data coming in through your implementation.

Matt Crupe: (21:21)

And then Adobe Launch. Like I said before, Adobe Launch will be used to configure and deploy pretty much your entire solution. This would be all of our individual solutions like Analytics, Target, Audience Manager, and Campaign, and also our new AEP Web SDK, which is our revolutionary new offering where all of our solutions can be deployed through a single tag instead of having to have separate tags for each of the individual libraries. Launch is where you will conditionally configure rules to collect data based on certain criteria being met, configure the individual data elements that map back to your data layer, as well as any third-party marketing tags that you might be using to supplement your overall analytics implementation. And so from here I'm going to hand it over to Chris to begin talking about the testing and validating portion.
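
As a loose sketch of the pattern Matt describes: a custom-code data element in Launch is essentially a small function that reads a value from the data layer, and a rule condition is a function that returns true when the rule's actions should fire. The snippet below assumes the hypothetical digitalData object sketched earlier and is not a copy of any specific Launch configuration.

```typescript
// Sketch of the kind of logic a tag manager encapsulates.
// A custom-code data element returns a single value read from the data layer.
function pageNameDataElement(): string | undefined {
  return (window as any).digitalData?.page?.name;
}

// A rule condition returns true when the rule's actions (e.g. send an
// Analytics beacon, fire a third-party tag) should run.
function isCheckoutPage(): boolean {
  return (pageNameDataElement() ?? "").startsWith("checkout");
}
```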

Chris O'Neill: (22:31)

Great. Thanks Matt. Matt, I'm actually, I'm going to ask you to come back if you don't mind. As the subject matter expert on implementation with Adobe, I have a few questions for you.

Chris O'Neill: (22:43)

Is that okay? Okay, awesome.

Chris O'Neill: (22:47)

So in this phase, we're going to talk about test and validate. We're going to talk about how we design the testing. We're going to talk about the importance of taxonomy, configuring automation tools, you know, how do we then monitor, expand, and then calibrate, but Matt, you had a really good slide. I'm actually gonna back up here.

Chris O'Neill: (23:10)

When you talked about planning, you talked about business objectives, business requirements, and data elements. At what point in your planning phase do you start to think about testing?

Matt Crupe: (23:26)

So we begin to think about testing typically when we are kind of starting to create the solution design reference, and we're designing the actual implementation, like what variables are going to be used to satisfy these requirements, how are they going to be used? And so then at that point, you begin also kind of thinking about the testing phase and how are you going to test and design your validation plan to ensure that each of these variables and events that you're configuring and using in your implementation are firing correctly and capturing the proper data.

Chris O'Neill: (24:12)

Awesome. Okay. So talking about the implement phase here, you talked about an SDR, you talked about the technical specifications document, and you also talked about a BRD. Which of these three documents would you say is most important to your testing plans—the SDR, the BRD, the tech spec, or some combination of all of them?

Matt Crupe: (24:35)

I'd probably say a combination of all three. The SDR, or solution design reference, is definitely critical because that outlines all of the variables that should be collecting data and all of the events that should be firing. So you'll definitely need to leverage this when testing and creating your test plan, to ensure that you are accounting for each of the variables and ensuring that each one is firing and collecting properly. But the tech spec, I think, is equally important, because you can use your tech spec as a blueprint when creating your ObservePoint Web Journey or App Journey. The tech spec typically shows you, step by step, all of the different rules within a tag manager that should be firing, when they should be firing, the individual data layer variables that you would expect to populate and the type of data that you would expect to see with them, as well as outlining each of the variables and events that should be populating with each of the rules within your tag manager.

Matt Crupe: (25:41)

So I would say the tech spec and SDR are probably the two most critical, and then the BRD is always there for you to go back and continue to reference to make sure that all of your requirements are getting met—that we have accounted for each one of them and that nothing has gotten lost or forgotten about.

Chris O'Neill: (26:20)

Awesome. That was really great. Thank you. Yeah, I think the most important part of test design—and I've seen this with a lot of clients that try to implement a testing practice from scratch—is they try to attack the whole thing at once. Matt mentioned the customer journey, understanding it from start all the way to conversion. I think that's key in allowing us to prioritize what we should be testing first and where we should start to build it out. Matt also had a quote on his slide that said you're never tagging just for tagging's sake. I think we can say that for testing as well: you're never testing just for testing's sake, so we should have the objective in mind. So prioritize with the end goal in mind is how I would say that. The second part I want to talk about is taxonomy. Taxonomy is an interesting concept when it comes to testing. Cameron, I was wondering if you wanted to hop back on and have a quick conversation about taxonomy.

Chris O'Neill: (27:27)

Oh, we've got him back on. Good. So Cameron, when we talk about testing, how would you test taxonomy? Because taxonomy seems like just a documentation thing. So how do taxonomy and testing and validation—how do those two worlds come together?

Cameron Cowan: (27:45)

Yeah, I mean, one of the things that I've seen throughout my career—going back to when I was using version 11.5 of SiteCatalyst back in the Omniture days, even all the way through now—is people use different naming functions, different syntax, to refer to a lot of the same things. A lot of my career has been, for example, in the programmatic ad world. And one person would call it PPC on Google, they would call it SEM on Bing, and on Yahoo they would call it paid search. And all three of those things mean the same thing. And so being able to use a unified taxonomy, a similar syntax, a similar categorization, really adds an ability to move fast on your data. There are often ways to clean that up after the fact, but when you talk about putting it in the frame of testing and validation, we want to make sure from the moment go—from when we create those touchpoint IDs and the links that are getting people to our website—that they're following that standard taxonomy.

Cameron Cowan: (28:43)

And then over time as things shift, we want to make sure we're validating that metadata layer to make sure it's still conforming to the standard that we have in place.
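
One lightweight way to picture the standardization Cameron describes is a single lookup that maps each platform's native label to one agreed-upon channel name before anything reaches reporting. The mapping and function below are purely illustrative assumptions, not part of any product.

```typescript
// Illustrative only: map each ad platform's native naming to one standard channel
// value so "PPC", "SEM", and "paid search" all roll up to the same thing in reporting.
const channelTaxonomy: Record<string, string> = {
  ppc: "Paid Search",
  sem: "Paid Search",
  "paid search": "Paid Search",
  "paid social": "Paid Social",
};

function standardizeChannel(rawLabel: string): string {
  return channelTaxonomy[rawLabel.trim().toLowerCase()] ?? "Unclassified";
}

// Example: standardizeChannel("SEM") and standardizeChannel("PPC") both return "Paid Search".
```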

Chris O'Neill: (28:53)

Yeah. So what I'm hearing is that a lot of people today maybe don't think of testing and validating taxonomy at the data collection level; they think about it after the fact. Once we've collected all our data, we have this fruit salad, then we'll go sort through it and we'll put all the berries in one basket and all the vegetables in the other basket—except maybe you shouldn't have vegetables in fruit salad, I don't know. So talk to me a little bit more about how you would validate the taxonomy at the touchpoint level, and maybe we could talk about Touchpoints a little bit here and how that works.

Cameron Cowan: (29:29)

Yeah. I mean, one of the reasons we built Touchpoints was for that very reason: people were going in after the fact and trying to clean up their data rather than intentionally standardizing and defining it upfront and then simply validating against that standard afterward. In fact, one of the main connections between the Strala platform within ObservePoint and Adobe Analytics is that synchronization of metadata. We can take all of the rich classification information that we're creating beforehand—defining it as your standard taxonomy—and then automatically push that into Adobe Analytics. So instead of maybe just basic channel, campaign, and publisher-level reporting—kind of the aggregate view—you can have a lot of really rich information, not only from channel, but also content: understanding what the titles and descriptions and display URLs and destination URLs of all your ads are, and what your primary calls to action and promotions and offers and assets are. All of that can and should be part of that metadata layer that gets you those deeper insights.

Cameron Cowan: (30:33)

But once again, it can't be an after-the-fact thought of, okay, I need more information, because a lot of times, if you don't have that in place, your tracking isn't granular enough to get you to that level. But if you just put a little bit of extra effort and intention in upfront, you can define that standardization layer, supplement it and enrich it over time, and then sync it back to your analytics platform. So you really have that data at your fingertips whenever you need it.

Chris O'Neill: (30:57)

Awesome. Yeah. Thanks. That makes a ton of sense. And then even with audits and journeys in ObservePoint, you can start to apply rules at the taxonomy level as well. Not only does ObservePoint validate that the tags are on the page and firing correctly, and not only do the audits and journeys validate that the variables and values are being collected—we can really apply taxonomy rules at the value level. We can use regular expressions to make sure that, as a really simple example, the date is formatted correctly across all properties. We can work with taxonomy at such a granular level that we make sure we are using all caps or all lower case, so it doesn't mess up our reporting. I was going to bring Matt back on and ask him if he's ever experienced these pains, but I think I'll let him slide on this one.
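
To make the regular-expression idea concrete, here is a small sketch of the kind of value-level checks Chris describes—date format and lower-case enforcement on campaign parameters. The parameter names and patterns are hypothetical examples, not ObservePoint's rule syntax.

```typescript
// Hypothetical value-level taxonomy checks of the kind an automated rule might enforce.
// Parameter names and patterns are examples, not ObservePoint's rule syntax.
const taxonomyChecks: { name: string; pattern: RegExp }[] = [
  { name: "launch_date", pattern: /^\d{4}-\d{2}-\d{2}$/ }, // dates as YYYY-MM-DD
  { name: "campaign_id", pattern: /^[a-z0-9_]+$/ },        // lower case, no spaces
];

function validateValue(name: string, value: string): boolean {
  const check = taxonomyChecks.find((c) => c.name === name);
  return check ? check.pattern.test(value) : true; // unknown params pass by default
}

// Example: validateValue("launch_date", "2020-06-11") -> true
// Example: validateValue("campaign_id", "Summer Sale") -> false (caps and a space)
```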

Chris O'Neill: (31:49)

So maybe let's move over to monitor and expand. Now, once we have designed our test, we have aligned it with the SDR, we've aligned it with our BRD, we've aligned it with the tech spec, we've considered taxonomy—especially at the touchpoint level, across the customer journey—and we've configured our automation tools. Now we need to think about the cadence, the frequency: how often do we want to monitor? A good rule of thumb is that you can start with any frequency you'd like, whether it's weekly or daily or biweekly. Typically I recommend you sync this with your release cycles, but from there we want to think about calibration. So if we are catching tons of errors and we have biweekly tests running, we should probably up the frequency to weekly or daily. Or maybe we're testing at a biweekly cadence and we don't catch a lot of errors.

Chris O'Neill: (32:43)

Now we can scale back our testing and move it to monthly. And that's how I would calibrate moving forward. I did want to talk about a few screenshots I have here. When we talk about testing, there are a couple of different layers of testing. We talked about the touchpoints—there's a screenshot in the previous slide. You can see here the journeys; these are the flow-based tests, typical regression tests, and they closely mimic the customer journey. So these could be forms, these could be searches, these could be checkouts. We want to make sure those customer flows, from customer acquisition all the way to conversion, are covered—those are obviously top priority to test. The next type of test we talk about is an audit. These are good for testing general tag governance practices.

Chris O'Neill: (33:31)

You know, what tags do we have on our site? What pages are they on? What pages are they not on? What variables and values are they collecting? We can start to apply taxonomy here. And then the last step—and this is probably a more mature approach to testing—once we have our basic testing in place, we've calibrated, and we have a good frequency, then we want to start to go in depth. You can see here on the left-hand side we have a decision tree of sorts. You can see all the tags that are firing. This is what we call tag hierarchy, and it shows how our tags are firing on the site: not just what tags are firing, but what is initiating those tags, what's being fired inside of our tag management system, and what's being fired outside of it. And a really hot topic right now is, you know, do we have control?

Chris O'Neill: (34:21)

And are we testing for rogue or piggyback tags that are showing up, that are being initiated through other vendors? Typically ad tech vendors will initiate tags as part of their functionality, but do we have automated testing and monitoring around that so that we can control and understand what tags they're initiating, what data is being collected, and where it's being sent? We won't go into this topic on this webinar, but this testing really starts to dive into the topics of GDPR, CCPA, and PII. And these are things that are very important to test for, to make sure that we are in compliance and being responsible to our clients while we try to improve their customer experience with all this tracking. I think we're going to go ahead and go to the next slide, and I am going to pass it back over to Cameron.

Cameron Cowan: (35:16)

Thanks, Chris. So, yeah. You know, when I think about this entire framework—and once again, going back to some of my early years in web analytics—I think there's this natural tendency, especially for people that are managing these projects, to want to just jump in and start tracking whatever they can. But really, having that framework around the entire process means being able to define upfront not just what I should store and which variables and which data I need to be storing in those variables, but also what the overall structure around that is. What business questions am I trying to answer based on that data? And even before that, what are the business objectives that those questions help me get to in the first place? I think that framework, for me, is the most important takeaway from this entire process.

Cameron Cowan: (36:05)

Understanding that you're validating not just against your tech spec or against your solution design, but you're actually validating against your entire taxonomy and metadata layer—you're validating, ultimately, against those business requirements. And is what I'm measuring today on my website still answering the questions I needed to answer in order to get to the objectives I'm trying to get to? And I think this is an iterative process, which is also something we forget. You know, John Pestana, the co-founder of ObservePoint here, was actually also a co-founder of Omniture. And I think he and Josh James, in the early days of Omniture, even before the acquisition by Adobe, recognized that to have good measurement, and especially good web analytics, you have to have a solid foundation that's thought out and planned out. But they also recognized that over time, that process starts to devolve.

Cameron Cowan: (37:00)

And so updating your implementation and updating the way you're thinking about your entire performance measurement program, whatever that involves, is really also very important. I keep coming back to the concept of entropy as I continue my work here at ObservePoint—that things fall apart, and even the best, most pristine, well-thought-out implementation of Adobe Analytics that I've ever seen starts to erode over the course of two, three, four years. And so it's not just about defining, standardizing, implementing, and then going through validation and testing cycles; it's also about revisiting every one of these things over time. I want to bring Matt back in for a few moments here and just ask you, Matt: as you've worked with the customers you consult with, how have you seen this process of entropy take hold? They may have had a fantastic implementation upfront, but over time that starts to decay. What kind of advice do you give to people as you start to see those things happen?

Matt Crupe: (38:01)

Yeah. So this is definitely something that we see quite a bit; it's a common occurrence. It's just like owning a home or a vehicle: if you don't maintain it and continually address issues, or just do the proper routine maintenance to ensure that things are going to continue running smoothly, they're going to, like you said, fall apart. So over time, you'll have ad hoc or one-off requests—something needs to get tagged right away, we've got a new campaign that's going out, we need to update the implementation real quick to collect this one extra data point, things like that. And all too often, on the technical side, we'll jump in and just make the necessary updates, and then we'll test and make sure that everything is getting captured correctly and nothing is broken.

Matt Crupe: (39:10)

But then we forget to go back and update those business requirements and update the SDR to reflect any new variables that have been implemented. And so over time your documents get out of date and out of sync; they no longer reflect the current state of your implementation. And other exterior factors can come into play as well—the developers make some sort of update on their side that ends up breaking something within your digital marketing implementation. And so it's extremely important to continue to revisit these individual steps to ensure that if and when those things do happen, one, you're catching them in a timely manner before too much data loss or impact on data has occurred, and, two, to make sure that everything is constantly running smoothly. If you don't continue to come back and update things like this, then you tend to lose sight of your overall business objectives as well, and your implementation ends up suffering.

Cameron Cowan: (40:30)

Yeah. And I think, as you alluded to, business objectives change. No business stays the same year over year. And so you want to continually have that be a living, breathing process. Now, I want you to stay with me as we go through these last couple of talking points here. You know, this gets us to good data—whether you've implemented it right upfront or you've come in and inherited somebody else's implementation and tried to update it to reflect your business, hopefully that gets you to that rich level of data you need. It's standardized data you can trust. But the title that we're speaking toward today is actionable insights. So I've got all these awesome insights for both channel and content performance, I trust the data, and I'm finding information that really drives me toward: okay, I know which content is effective on my website, and I know which channels to deploy it through. Let's talk really quickly about activation. This is the part that I love about the entire story that Adobe has brought together with the Experience Cloud—not just a foundation of data and content, but also then doing something with that data and content. Talk to us a little bit about how Adobe is enabling your customers to take really good data and make it actionable through the systems that you've pulled together.

Matt Crupe: (41:38)

Definitely. So a couple of things. You know, I think when we approached this webinar, we were approaching it from just a web or digital analytics perspective, and that's what we focused on in the implement phase as well. And so as your business goals and requirements continue to evolve and change, we start to think about, okay, what are the next steps that we can take to help mature the implementation and bring it to that next level? And so some additional Adobe solutions would then get put in place. The Experience Cloud ID Service should be foundational to any implementation, no matter how basic or advanced, because this helps us create evolving visitor profiles to help deliver those people-based experiences.

Matt Crupe: (42:44)

Instead of focusing on identifying individual devices, we're able to target and deliver these experiences to the same audience across multiple devices—for the same user that may be browsing on their tablet, their desktop, or their mobile device. Adobe Analytics is one of the foundational solutions that's going to help collect all of the data you're going to use to gain insights. Adobe Target will be used to personalize customer experiences and help you maximize revenue on your web and mobile sites, social media, and other digital channels. Audience Manager can be implemented to help build unique audience profiles, so you can then identify your most valuable segments and use them across any of your digital channels to help improve ad spend and things like that.

Cameron Cowan: (43:54)

Awesome. Yeah, I think that as I started thinking about what all the different core pillars are of what we do—not only here at ObservePoint, but really as a unified technology ecosystem—what are we trying to help people do? A lot of what we talk about is around ensuring data quality, making sure you can trust the data so there's action to be taken. I think what you just described was really optimizing all of those customer experiences, whether they happen through the advertising ecosystem, through your marketing automation and customer outreach, or through the web or app experiences—being able to use that data to optimize those customer experiences. For me, ultimately it all comes back to those business objectives. You know, we're trying to drive growth and ROI. We're trying to help our customers and brands really nail their business objectives, whatever those business objectives are—growing top-line revenue, growing bottom-line profitability.

Cameron Cowan: (44:45)

And so as I started to rethink the title of this webinar in general, I went through different iterations. Yes, we want to provide a framework for actionable insights, but really we want to provide a framework for actions. You don't just want to stop at the insights. As you said, you want to be able to take those actions through all the various channels in which people experience interactions with your brands, and optimize them and make them better and more meaningful, so you can build those long-term relationships. And ultimately, for me, it's not even about taking the actions—those are a means to the end. Action just for action's sake isn't going to get us to where we want to be. So from my perspective, and to end off the webinar, we really want this to be a framework for achieving those business objectives, bringing us back to where we started—no matter what type of organization you are, whether you're a football team trying to win a championship, or you're a private space company trying to send people to Mars, or you're any number of digital or hybrid brands that are just trying to deliver great experiences to your customers.

Cameron Cowan: (45:47)

This can be done through building that foundation: knowing what your end objectives are at the beginning, planning that into the process as you create a measurement program, being able to implement and follow best practices on how to go about that process, and then continually iterating on that as you learn and gain insights—but also, as things deteriorate and change and are fluid, being able to run testing and automation throughout the process so you can make sure things are always in alignment, going back to those core business objectives. I'll leave you all with one final thought, and that's this: the best time to plant a tree was 20 years ago; the second best time is now. I think a lot of us aren't the ones that designed the program that's in place, but we are responsible for getting it to where it needs to be to align to those business objectives. So even if you weren't part of the original planning, you're part of the planning today; even if you weren't part of the original implementation, you're part of making sure that implementation stays best in class and best in breed. And I know that as we follow these types of frameworks, as we put in the rails and safeguards of testing and automation and validation, we can truly get to actionable insights that will lead to meaningful actions that will ultimately help drive our business objectives, because they're delivering excellent experiences to our customers.

Cameron Cowan: (47:05)

Alright, Aunica I think we're pretty close to where we were hoping for on time. Do we have time for Q and A, or are we going to just do this offline with those that ask questions? I don't know that I can hear you. Are you on mute still?

Aunica Vilorio: (47:23)

Sorry. Yep. I am sorry. I think we might have time for one or two questions. There's one question that we should address, which is from Ashami Desai—I'm sorry if I butchered your name. It's: how do SDR requirements change to reflect the AEP Web SDK framework? Matt, Cameron, or Chris?

Matt Crupe: (47:43)

Yeah, I can take that one. So with the AEP Web SDK, I don't think the SDR portion should change too much. The SDR is typically very focused on your analytics—the analytics variables that are getting collected and how each one of those is configured—and this shouldn't change too much with the AEP Web SDK. What the AEP Web SDK is, for anybody that's not familiar, is our new way of deploying the individual Adobe solutions. Previously, if you were using Analytics in your implementation, you would deploy the AppMeasurement library either through Launch, where we would typically just use an extension, or manually. And then you would have another extension installed for Target, another one potentially for Audience Manager.

Matt Crupe: (48:50)

So each solution would have its own extension, and if you're doing a more manual implementation, each one of those would be a separate JavaScript file that gets loaded within your application. With the AEP Web SDK, it's a single JavaScript file that is used to deploy any combination of our solutions. So if you're just using Analytics, you can use it to deploy just Analytics, but the value really comes in the multi-solution implementations, because now we have a single tag and a single request going out, as opposed to multiples—a tag and a request each for Analytics and Target and Audience Manager. So it really streamlines the implementation and is a big improvement from a performance perspective as well. Coming back to the original question: the SDR is typically specific to your analytics implementation, which really shouldn't change too much with the Web SDK.
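
A rough sketch of the single-library pattern Matt describes is below, using the Web SDK's command style with one configure call and one sendEvent call carrying XDM data. Option names such as datastreamId and orgId are written from memory and should be verified against Adobe's documentation; the IDs are placeholders.

```typescript
// Rough sketch of the single-library pattern: one SDK instance and one request
// carrying data for Analytics, Target, and Audience Manager together.
// Option names (datastreamId, orgId) are assumptions—verify against Adobe's docs.
declare function alloy(command: string, options?: Record<string, unknown>): Promise<unknown>;

async function trackPageView(): Promise<void> {
  await alloy("configure", {
    datastreamId: "YOUR-DATASTREAM-ID", // placeholder
    orgId: "YOUR-ORG-ID@AdobeOrg",      // placeholder
  });

  // A single sendEvent call replaces separate Analytics, Target, and
  // Audience Manager beacons; data is expressed in XDM format.
  await alloy("sendEvent", {
    xdm: {
      web: { webPageDetails: { name: "checkout:payment" } },
    },
  });
}
```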

Aunica Vilorio: (50:02)

Awesome. Thanks, Matt. And then the second question that we'll go to before we end is: what are some of the benefits you've seen from identifying business requirements before you implement? Cameron, do you want to address that one?

Cameron Cowan: (50:15)

Yeah. And Matt addressed this in his discussion, but really, the business requirements are the connective tissue between the business objectives and what actually gets implemented. The requirements, in my mind, are a series of questions that you want to be able to answer.

Cameron Cowan: (50:28)

And if the underlying data doesn't help you answer those questions, then you probably need to rethink the actual deployment of your implementation. I think a lot of people jump straight to getting code on page, or let's just get the basics up and running. They think, okay, I'm going to set the base code and then I'll update the implementation over time. But really, having a well-thought-out series of business requirements—those questions that help you understand whether I am or am not hitting the business objectives that are the whole reason I'm making this investment in the first place—that really brings everything together. I don't know, Matt, would you add anything to that?

Matt Crupe: (51:02)

Yeah, I couldn't agree more, Cameron. I think you've done a good job summarizing the importance there. I would just echo that statement: if you just jump into, like you said, getting code on the page and trying to start tagging things, you're going to end up with a messy implementation. Your variables aren't going to be used efficiently—you could end up using more than you really need to—you'll end up not satisfying requirements as efficiently as you could, and you'll end up missing requirements. You'll end up with gaps in your implementation and your data because you're not coming back and you don't have that documented, holistic view of why you're doing all this to begin with.

Cameron Cowan: (51:57)

Yep. And one other thing I would add to that: if you understand what the business objectives and requirements are, you may find that what you need to answer those questions actually lies outside the scope of the implementation. There may be data from other systems, like a CRM platform or marketing automation system, that you need to bring in in order to fully answer those questions. Part of it is answered through what you've implemented in web analytics, but getting to the real business objective may require integrating other systems and data sets. And you wouldn't know that unless you've taken the time to think through that process and those requirements.

Matt Crupe: (52:31)

Yeah. That's a great point.

Aunica Vilorio: (52:43)

Thank you guys all for joining—everyone that's in attendance as well as our three presenters, we really appreciate your time and all of your insights that you provided, and we will be sending out the recording tomorrow. So look forward to that and let us know if you have any other questions, but thanks again, guys. I appreciate your help.

Cameron Cowan: (53:01)

Thanks everyone. Bye. Thank you.

 
