Agility and Component-ability in Analytics in 2018

November 6, 2017

Slide 1:

It’s great to be here. And let’s get right into it. This is a presentation largely about the practice of analytics and what works for me at Yamaha. Hopefully you’ll get some insights or “aha!” moments that will help you in your practice as well.

Slide 2:

As I said, I’ve been a marketing analyst at Yamaha Music USA for the last three years. In that capacity, I have basically been a one-person wrecking crew. I was born in Pennsylvania and moved to California about 15 years ago. That is the aforementioned precocious eight-year-old son, and everything about him you can see right there on the screen. I want to thank everybody for attending and learning together as an organization and as an industry. Not a lot of other industries have this built-in learning component, and that’s a big key. It’s one of the big, important benefits that we have. I’m going to get into that in a little bit as well.

Slide 3:

A year and a half ago, Stéphane Hamel, probably one of the best analysts in our entire industry, wrote a manifesto for radical analytics. This really blew my mind. In particular, I found that he linked to the 12 principles behind the Agile Manifesto. This gave me some serious pause and reflection on what is important to work on as an analyst and what’s fluffy. But of course, those principles are for software, and we’re not software developers. So, I thought I’d just go into the article and rewrite some of the 12 principles listed on their page. For the first one, “our highest priority is to satisfy the customer by delivering valuable software,” we could probably just change the word “software” to “insights,” and so on.

Slide 4:

Not a lot of other changes. There you see the manifesto header right there. Some obvious ones stand out. Number seven I found interesting because, as mentioned in the intro, what I do (pretty much 90 percent of analytics work) is the bread and butter of what we do: tagging properties, data management, data presentation, recommending insights, and integrating with the rest of the organization. In space travel, when something is referred to as nominal, it means that everything is moving along: the rocket is going up in the sky, or proceeding in a particular direction, and all systems are functioning normally. I thought that was a very interesting analogy for what we’re trying to do as analysts. If we’re running nominally, we should be able to launch the rocket, orbit the planet, go to the moon, and come back home safely.

The other one that I found most interesting was number 11. That one says the best architecture, requirements, and designs emerge from self-organizing teams. This is where I want to focus today because that one sentence—the best architecture, requirements, and designs emerge from self-organizing teams—that’s a lot of promise and potential and pitfalls and a whole bunch of possibilities that require a little bit more in-depth investigation. What I’m going to focus on is adopting modular analytics architectures and how that works for us in our particular industry.

Slide 5:

An architecture, like in software development, is the product and the process combined. A product architecture is the product design, its specification, and all of its components. A process architecture is the set of activities behind all of that. Then you have knowledge architecture, which puts them all together. It’s all designed to be one functional unit made up of many little components, very similar to the masonry foundation we see here, which is the basis for building a lot of houses, especially on the east coast of the United States, but pretty much anywhere in North America.

Slide 6:

I wanted to investigate further what our products are as analysts. Moe Kiss, whose sister Michele is also presenting somewhere around here today, had a really good stack, if you will, of the things we do as analysts. We tag properties, and we make sure they are tagged perfectly. We make sure that the tag and the pixel are firing at all times, in the right location, and transmitting the data that eventually makes its way into a well-managed data set. Then we focus on gathering that data, cleaning it, organizing it, and making it readily available when others in the organization need it. And we make sure the data is well-tested, so that we are not putting garbage in and getting garbage-out analysis.

Then of course, we present data, and that could include making a PowerPoint deck or building a dashboard of KPIs for stakeholders. Or maybe it’s a speech, or maybe it’s a well-written email: a convincing argument that we need to be a data-driven organization. There are lots of ways to present data, and I’ll get into that a little bit more, but there are different parts that make up the product of data presentation. Then there are recommendations and insights: how we actually build those insights, how we communicate and transmit an insight, and how it’s received by the other party. That’s a critical part of our jobs as analysts.

And finally, integrating with the rest of the organization. That means with our own teams and with other departments’ teams: finance, IT, accounting, product development, and so on and so forth.

Slide 7:

And then we have our processes as analysts. Sometimes they look like this, or they feel like this. I’m going to dive into what our processes may be as analysts.

Slide 8:

Property tagging is using the tools as well as possible. We are all generally using a tagging tool to manage all of our pixels that are firing and so on and so forth. But what I’ve found is that property tagging also includes “playing nice with the rest of our tech stack.” Every analyst needs to know what their tech stack is. The whole thing. I’ve seen tag interference take place because one slice of the stack doesn’t quite work with another. This is Microsoft’s tech stack, just as an example.
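To make “playing nice with the rest of the tech stack” concrete, here is a minimal sketch of a tag-presence audit (it is not from the talk; the URLs and snippet strings are hypothetical placeholders). It fetches each page and checks that the expected tag or pixel references appear in the served HTML; a fuller audit would also watch the network requests the tag manager actually fires.

```python
# Minimal tag-presence audit: fetch each page and check that the
# expected tag/pixel snippets appear in the served HTML.
# URLs and snippet strings are hypothetical placeholders.
import requests

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/products/",
]

EXPECTED_SNIPPETS = {
    "tag_manager": "googletagmanager.com/gtm.js",  # tag management container
    "analytics_pixel": "analytics_pixel.js",       # hypothetical analytics pixel
}

def audit_page(url: str) -> dict:
    """Return which expected snippets are present in the page source."""
    html = requests.get(url, timeout=10).text
    return {name: snippet in html for name, snippet in EXPECTED_SNIPPETS.items()}

if __name__ == "__main__":
    for url in PAGES:
        results = audit_page(url)
        missing = [name for name, found in results.items() if not found]
        status = "OK" if not missing else "MISSING: " + ", ".join(missing)
        print(f"{url}: {status}")
```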

Slide 9:

This is Cisco’s tech stack.

Slide 10:

I hate to tell you, but this is Yamaha’s tech stack. We don’t quite have the pretty graphic that demonstrates how it all works together. In investigating and getting ready for this presentation, I’ve found that many practitioners are in this place where we’re not quite at the beautiful-graphic stage, but we’re investigating it: “What does that actually work with?” and “Why do we need this extra tool that we’re paying $35.95 a month for?” and so on and so forth. This is very typical when you’re first starting out, and I have to acknowledge that some companies are still at this stage of defining what their tech stack is.

Slide 11:

The next one is data management. I read something from Barbara Dinter, a professor of business intelligence systems at Chemnitz University of Technology in Germany. If we build data sets A, B, and C, and then we functionally gather them, clean them, make them available, integrate them, and everything else, Professor Dinter says that big data, analytics, innovation, and management converge, and that there are many challenging research questions. Not a lot of those research questions have actually been solved quite yet. There’s lots of opportunity for experimentation and seeing what works, but there is not one set tool for how to manage our data.

Professor Dinter also recommends, in some cases, using open data sets and open innovation across the organization; you can foster insights and get better ideas that way. But that may also be something that not everybody in the organization is quite keen on doing. Those internal realities have to be considered as well.
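As a purely hypothetical illustration of that gather-clean-make-available loop, here is a small pandas sketch that merges two source data sets, drops obvious garbage rows, and publishes one cleaned file the rest of the organization can use. The file names and column names are invented for the example.

```python
# Hypothetical gather/clean/publish step for two source data sets.
# File names and column names are placeholders, not any company's real data.
import pandas as pd

def build_clean_dataset(web_path: str, crm_path: str, out_path: str) -> pd.DataFrame:
    web = pd.read_csv(web_path)  # e.g. a web analytics export
    crm = pd.read_csv(crm_path)  # e.g. a CRM export

    # Gather: join the sources on a shared key.
    merged = web.merge(crm, on="customer_id", how="left")

    # Clean: drop duplicates and impossible values ("garbage in").
    merged = merged.drop_duplicates()
    merged = merged[merged["revenue"] >= 0]
    merged["campaign"] = merged["campaign"].fillna("unknown")

    # Make available: publish one well-defined file for other teams.
    merged.to_csv(out_path, index=False)
    return merged
```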

Slide 12:

Then data presentation. That’s dashboarding and data visualization. That’s Miss Pica over in the other room right now, who’s presenting at the same time I am. Of all the models I’ve seen, her P.I.C.A. model (purpose, insights, context, and aesthetics) is one of the most modular models of data visualization. After you’re done here, by all means check that out. It’s the best model I’ve seen so far for data viz.
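As a small, hypothetical nod to the “purpose” and “insight” parts of that model, here is a matplotlib sketch in which the chart title states the finding rather than just naming the metric. The figures are made up for illustration.

```python
# Hypothetical example: lead the chart with the insight, not the metric name.
import matplotlib.pyplot as plt

months = ["Jul", "Aug", "Sep", "Oct"]
signups = [1200, 1350, 1900, 2400]  # made-up figures

fig, ax = plt.subplots()
ax.plot(months, signups, marker="o")

# Purpose and insight up front: the title answers "so what?" for the stakeholder.
ax.set_title("Newsletter signups doubled after the September campaign")
ax.set_xlabel("Month")
ax.set_ylabel("Signups")

fig.tight_layout()
fig.savefig("signups_trend.png")
```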

Slide 13:

And then there are insights. Actual analytics “insights” can sound like the worst of MarTech speak, like a lot of hipster garbage that you might hear now and then. Of course, that has to be avoided. If you go back to Hamel, though, an insight is always proposing something. Offer A/B alternatives, or multivariate alternatives, or ranges, or menus of choices. Understanding the role that we play as analysts kind of smashes that hype back down.

Slide 14:

And it actually focuses more on the two graphic representations that I show here. We’re just trying to move from data all the way across to wisdom. The graphic at the top is pretty well known out there if you want further explanation. Data management and building insights is a big, complex system, often like a storm coming in off of the ocean: why is this happening, and why does this storm happen in this location and not in that one? There are a whole bunch of processes that are important to investigate and recognize and understand.

Slide 15:

And finally, integration with the rest of the organization, especially with the finance and IT departments. That requires some serious integration with the IT team. It’s more than just making friends and influencing people; it’s also having an advocate in the organization who will rely on best practices for data and data-driven outcomes that actually benefit the entire organization, not just the marketing department or the sales department.

Slide 16:

From those products and those processes, we then start to build a knowledge architecture. A knowledge architecture decomposes product and process designs into functional components. Knowledge architecture is also about understanding how those components function and what we’re actually doing in the property tagging, or the reporting, or the building of insights. It also defines the interface between the product and the process, and it assures the component-ability of both sides.

In the case of a product architecture, there are multiple ways to tag properties, and I use multiple tools in order to do that. There are certainly multiple ways to manage data. There are certainly multiple ways to report data, especially within the organization: some people like PowerPoint slides, and some like a double-spaced report. There are multiple methods of doing what we do, and it’s better to acknowledge them for what they are than to automatically assume that there is just one way of doing things with your product architecture.

Then there’s your process architecture, which works the same way: product being practical knowledge, and process being how it actually gets done. I’ve looked at my organization and seen that how the two interface is where the modularity actually takes place. Modularity is mixing and matching these products with these processes within a particular architecture, often referred to as “plug-and-play.” You get this modularity from knowing how (in our case, the practical knowledge of the tool or of the data management and how it actually takes place) and then knowing why: the theories behind why the systems work, why the processes all work. In a lot of cases, we as analysts are presented with pie-in-the-sky, one-size-fits-all, everything-will-work-just-fine solutions. That’s not necessarily so.
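One hypothetical way to picture that plug-and-play modularity in code (again, not something from the talk) is a pair of small interfaces that components implement, so any data source can be mixed and matched with any presentation method. The class and file names here are invented for the sketch.

```python
# Sketch of "plug-and-play" modularity: components share small interfaces,
# so any data source can be combined with any presentation method.
import csv
from typing import Protocol


class DataSource(Protocol):
    def fetch(self) -> list[dict]: ...


class Presenter(Protocol):
    def present(self, rows: list[dict]) -> str: ...


class CsvSource:
    """Reads rows from a CSV file (one possible data-management component)."""
    def __init__(self, path: str):
        self.path = path

    def fetch(self) -> list[dict]:
        with open(self.path, newline="") as f:
            return list(csv.DictReader(f))


class EmailSummary:
    """Turns rows into a short written summary (one possible presentation component)."""
    def present(self, rows: list[dict]) -> str:
        return f"This week's report covers {len(rows)} records."


def run_report(source: DataSource, presenter: Presenter) -> str:
    # Any source can be plugged into any presenter through the shared interfaces.
    return presenter.present(source.fetch())
```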

What I have found is that there are some elements of the data management process that can also be used in the process of building insights. They are all interconnected, and not only are they interconnected, but the practice of analytics has constraints. I hate admitting it, but sometimes the red lines between the goal boxes don’t necessarily work. That’s understanding for certain why systems work and why they don’t. Where you have the green lines in this particular diagram, those are the tremendous flexibilities that you get from something that actually works.

Knowing both sides of what we do and how we do it gives us systemic flexibility in making sense of the problems that we’re trying to solve. And that’s all we’re really trying to do as decent analysts: we are trying to figure out business problems. Strip all of the other fluff away, and we’re in this to solve problems. Because here’s what’s happening within organizations at the practitioner level: we get a couple of quick wins, a couple of business problems that are actually understood and actually solved. What invariably happens is that management recognizes this and continues to increase the load of demands from the rest of the organization.

So, what I’ve attempted to learn and put into practice is that each component of what we do in tagging, or each component of what we do in insight building, or each component of communicating with people within the organization and with outside vendors, has different methods for different outcomes at different times, all the time. And that burden continues to grow.

Slide 17:

The easiest way to manage those increased demands in my practice comes down to about five different things. The first one is often referred to as “logical incrementalism.” That is not the same as path-building. The Great Wall of China was built one block at a time: when you’re building a wall, you know the wall is going to be 20 feet wide and eight feet tall, and you know the exact number of blocks you have to build it with. Logical incrementalism, by contrast, has elements of uncertainty to it; there is environmental uncertainty. We don’t know with accuracy and precision what will happen once the campaign is totally analyzed. We do build general goals for ourselves to give us that flexibility. We also build flexibility through experimentation. As time continues to march on, there are emerging strategies and tactics that are working all the time.

You would never build a path this way, you would never build a wall this way. But if you use logical incrementalism, you’re able to look at all the different components that you have in your practice and then put them into use in a very—well, obviously in a very logical way—but also in a very explainable way so that any other person in the organization can readily understand what you’re doing.

Number two: embrace diversity of thought and experience. A great framework from Dr. Tom Davenport, well known in the analytics world, says that there are four general skills you need as an analyst. You have analytical skills: you have to know how to do the math. You have to understand the business. You’re building relationships, relationships of different strengths and experiences, with other people inside and outside of the organization. And you’re building communication skills. You’re learning that some people are readers and some are listeners, and readers process information in a different way than listeners do. You can then present the same argument in different formats and with different ideas, thoughts, and experiences to solve the particular business problem.

Step three would be to prioritize by business value. This is where our friends in software have made great strides, because there are fantastic tools out there, and these processes are very well known and have been put into practice for a good 25 years or so. This is technology project management 101. How long is it going to take? How much is it going to cost? What’s going to be the return on investment? Knowing each of the components of our practice, and knowing where we make the most impact, is probably the most quantitative way of building a knowledge architecture.
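A hypothetical way to make that prioritization concrete is to score each candidate project by estimated value against estimated cost and rank by the ratio. The projects and figures below are invented purely for illustration.

```python
# Hypothetical prioritization by business value: rank candidate analytics
# projects by estimated return per unit of cost. All figures are invented.
projects = [
    {"name": "Checkout funnel analysis",  "est_value": 50_000, "est_cost": 8_000,  "weeks": 2},
    {"name": "Email re-tagging cleanup",  "est_value": 15_000, "est_cost": 2_000,  "weeks": 1},
    {"name": "Attribution model rebuild", "est_value": 90_000, "est_cost": 40_000, "weeks": 8},
]

for p in projects:
    # Simple ROI: (value - cost) / cost.
    p["roi"] = (p["est_value"] - p["est_cost"]) / p["est_cost"]

for p in sorted(projects, key=lambda p: p["roi"], reverse=True):
    print(f'{p["name"]}: ROI {p["roi"]:.1f}x, about {p["weeks"]} weeks')
```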

Number four: satisfy everyone’s curiosities. This is asking the right questions. This is solving the prioritized problems. This is knowing that we truly cannot know for certain. We’re getting better at it all the time, and the promise and the pitfall of artificial intelligence and machine learning is that we still can’t compute all of the problems that we’re trying to solve. As analysts, if we’re curious about what those business problems truly are, and we leave room in the analytics practice for that, there is anecdotal evidence, of course, that it will work. We’re starting to see a little more research on analytics teams being encouraged to think outside the box and really get to understand particular business issues.

And finally, what I’ve learned is that you win as an analyst if something is clearer now than it was before. If the numbers you’re reporting are bad, you win, because now everybody knows that. If the sales numbers are great based on your presentation, you win, because everybody knows that. I highly recommend building clarity-based goals into your particular practice as an analyst: building them into performance reviews and helping others understand the role that clarity plays in the business problems we’re trying to solve. The clearer you can be, the more likely you are to win.

That ends my presentation. Thank you all for joining today and for listening to me. And I’ll take any questions that you have at this time.
