Lee Isensee, Search Discovery - Hey Siri, What Are App Analytics?

November 22, 2016

Slide 1:

Today’s session is: hey Siri, what are app analytics?

Slide 2:

Just a quick summary of what we’re going to be learning today: we’re going to quickly cover how mobile analytics developed into what it is today, the key differences between web and apps, what you should be measuring in your applications, and what the metrics for success are. You also definitely want to make sure you understand what questions should be asked to be successful within your organization as it relates to applications.

Slide 3:

A little bit about Search Discovery: we’re a digital intelligence company based out of Atlanta with employees all over the globe. One of our points of pride is that we help our clients use data to drive better decision making. We’ve done this for many years and we’re super excited about the clients that we work with, many of whom are likely attending today’s conference. Just a quick, fun background on myself: I’m originally from Boston, I even have a dog named Fenway, and the newest member of our family, Mini-me, happens to turn seven months old today. So we’re super excited about that.

A little experience background: I’m originally from a company called Unica, which made campaign management and analytics solutions. In about 2010, Unica was acquired by IBM and is now part of their larger digital campaign and marketing platforms. Prior to Search Discovery, I was with a company called Localytics. So over the past ten-odd years I’ve focused primarily on the mobile and web analytics market, focusing on how to leverage that data to be a more efficient analyst and marketer.

Slide 4:

Stepping into the history of how we got to where we are today. Just a little background on mobile, and web itself from the mobile perspective.

Slide 5:

Let’s start with WAP. At its peak, which was about 2004 to 2006, WAP was doing tens of billions of page impressions, or as we previously called them, hits, in the analytics world. These were the days of Motorola Razrs and Samsung BlackJacks; Nokia was still your top-tier phone manufacturer. Today if you ask someone about WAP, they either give you a blank stare or assume that you’re talking about the rapper Fetty Wap, who was about 12 when WAP was most popular. Trying to do analytics for WAP was pretty challenging, if not straight-up brutal. It required a lot of server-side code; there were no SDKs, just a lot of custom log files or pre-processing of data.

The audience for mobile at that point was pretty minuscule. The most difficult part was that just navigating the internet required using numbers or arrows on your device, if it had keys and arrows, to go to different links. So if I wanted to, in this case, go to a mobile photo link, I had to hit the number one, or hit number two for mobile video. I can only imagine what that video looked like on an old device. A lot of analytics was page view based. People were just excited to have internet on their phones, just as we were when the internet was first introduced on the desktop.

Slide 6:

Next up is what I call the “Nanoweb.” As the desktop user experience became more enhanced, the expectations for mobile devices grew in step. The desktop web browser company Opera created a new product called Opera Mini that allowed most of what you were able to do on the desktop to happen on a tiny little phone. Most of those devices had no pinch-to-zoom; that wasn’t really introduced until the iPhone came along. Instead there was a lot of squint-and-wait. On a slow internet connection, just figuring out what you might want to look at required scrolling with an arrow to a specific spot, pressing, letting it zoom in, using arrows again to get to the specific link you wanted, and waiting for the next page to load. While these were mostly functioning browsers, they didn’t possess the same technical features as desktop browsers.

One of the most noticeable differences from an analytics perspective was the fact that the JavaScript rendering was either non-existent or abysmal at best. The analytics implementation didn’t require much extra work, but a lot of emphasis went into ensuring libraries were as tiny as possible in the off-chance that the device did support JavaScript. With the introduction of the iPhone, using the internet on your phone went from tech person parlor trick with the Opera Mini to my mom and dad browsing their favorite websites on their device. What was once a small audience became nightly news about being able to have internet everywhere.

One of the downfalls of this experience was the loss of Flash. Many folks were already familiar with the cheesy Flash games that we knew and loved at the time, as well as, in some cases, the best way to crash your browser. It was the mainstay of web content and most people’s first foray into apps. But this type of mobile engagement really began the death of Flash. Concepts that we learned from Flash will actually become very applicable later on, as we’ll see.

Slide 7:

So the next step would be MWeb. We’re still seeing MWeb today, but really it was driven by users having had enough of “that pinch to try to read my content.” They were tired of having to deal with the tiny text and all the navigational hiccups. The performance was questionable and slow at best, and users had these new big screens; they wanted to use them. Despite providing a better experience for the user, many sites sacrificed their content in exchange for that experience. In many cases, users were left with top-level content and went hunting for the full site link at the bottom of each page or menu. Or they had to know that they could change the URL from M dot whatever to www dot whatever. As soon as the user made it back to the full site, they were put right back into the experience that had frustrated them previously: a lot of content and a tiny screen.

This was really when companies started to pay attention to mobile as a fact of life, and they wanted to make sure that their marketing and analytics followed suit. Luckily during this time, the web browsers continued to improve, JavaScript engines were speeding up, and companies were starting to ask, “If we’re already spending extra money on mobile users, should we have a dedicated mobile site?” With this concept of a whole new site just for mobile, the analytics implementations suddenly doubled in time and effort. Now you were tagging the original full desktop site, while still considering how it impacted mobile users, and then having to tag the mobile site, the M site, as well.

Slide 8:

Which brings us to responsive design. Many of us are having to deal with that today. So if MWeb was driven by users having had enough of poor navigation, responsive was driven by developers and companies finally having had enough of managing two different code bases. It was becoming too much. Browsers were finally advancing, and access to information about devices and layouts was much improved over what it had previously been. And you were able to provide 100 percent of the content to your users, unlike the “half net” of MWeb or Nanoweb. Regardless of whatever device the user was on, they were going to have a great experience, for the most part. Analytics implementations suddenly decreased in size: you had a single code base to work with and less risk in tagging for analytics, but it was still a web page, and you still ran into the difficulties of dealing with mobile devices and similar technical limitations.

Slide 9:

Which moves us into native applications. With the popularity of the app store, it seemed everybody wanted to have their brand in the app store; they had to have an app. In the early cases, having an app in the app store was nothing more than just taking your web content and transposing it from MWeb or responsive to a native app. This time, though, it was going to require all new programming. No more HTML, no more JavaScript. Now you had to begin with Objective-C and Java, completely different languages.

To give you an idea, Java is to JavaScript as car is to carpet. Just because they happen to share a few letters, they have nothing to do with each other. The differences between web and app are too great to list here, but this change in experience had a couple of core differences for digital marketing and analytics that needed to be addressed. Analytics implementation in native applications was no longer about JavaScript; it required SDKs. This time technical marketers couldn’t just go view source on the page to figure out what tags were correct or incorrect. It required them to do additional work and coordinate with development to be able to add and fix code in the app.

Slide 10:

Which brings us to the next evolution, which is apps everywhere. Whether it’s a standalone device or an embedded device, I can walk up to my TV right now and post something on Twitter or Facebook or look at my news feed. I can obviously watch Netflix. My TV no longer just shows TV shows or movies; I can engage with it. Obviously we have the advancement of voice applications, things like Alexa and Siri, and being able to talk to your TV through the Apple TV remote. And even my car: I can step into my car, sit down, turn a knob, and change how my driving experience is going to happen. My car is driven by applications. Once again, these are primarily driven out of SDKs, but they have unique experiences for the user and input methods that we couldn’t even think about a couple of years ago.


Slide 11:

So while we’ve covered the history, let’s talk about a couple differences between web and apps. Chances are a good number of folks here today are primarily on the web side of things, so maybe this will give you a sense of what it’s like being on the other side of the aisle.

Slide 12:

What are apps and how did we have to learn to deal with them? In digital analytics back in 2006, when it was so-called web analytics, when someone asked you to tag a Flash video, or any Flash content, it was exciting and new. All of a sudden, though, when you figured out what was really involved, it became rather frustrating. What happened to my concept of pages? What do you mean things can layer on top of each other? Why does a user need to go back and forth between the parts of the Flash app? These were closed-source web files, and you had to use a new language called ActionScript to make calls out to what you used to work with in JavaScript. It was like fitting a square peg in a round hole from an implementation standpoint. Non-page-view events became your new best friend, and time had a whole new meaning. No longer were you measuring the start and end of a session as time between pages; it was about the very heartbeat of the user experience. How were they consuming video? Were they playing, pausing, scrubbing? How do I measure that? It really was a change of mindset for many analysts.

Slide 13:

How are they different? Some of these might look like semantic differences, but really it’s web versus app, and you look at how you can understand the user and where or how they are engaging with each. On the web we have our traditional structure of a visitor, who has a visit, which has one or more page views, and could have one or more events. In the app world, we have a user, an individual whom we are easily able to identify. We also have sessions. While sessions and visits might sound the same, we’re starting to see the analytics vendors move from the concept of the visit to the session as their descriptor. Then one of the key differences, probably the key difference, is how we separate the engagement of the user. As I mentioned, you can have one or more page views per visit, and within a page view you might have multiple events. In the app world, we completely separate actions or events from the state that the user is in.

If you imagine Facebook as an example, when you launch the application, you’re immediately presented with the news feed. Sure we can probably tag that as a view, but you’re scrolling, and you’re scrolling, and you’re scrolling. When does that next page view happen? In reality, you’re in the state of a news feed and you have multiple actions you can take. I can like, I can comment, and I can share. It’s much more relevant to understand the engagement of the user based upon their actions or events, than it is necessarily what page or perhaps even state that they’re in.
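The state-versus-action model described above can be sketched in a few lines. This is a hypothetical illustration of the concept, not any vendor's actual SDK; the class and method names here are made up for the example:

```python
# Hypothetical sketch of the app model described above: the user sits in
# a "state" (a screen, such as the news feed) and fires discrete actions
# against it, rather than generating page views. All names here are
# illustrative, not a real analytics SDK's API.

class AppTracker:
    def __init__(self):
        self.current_state = None
        self.events = []

    def track_state(self, state_name):
        """Record that the user entered a new state (screen)."""
        self.current_state = state_name
        self.events.append(("state", state_name))

    def track_action(self, action_name):
        """Record an action taken within whatever state the user is in."""
        self.events.append(("action", action_name, self.current_state))

tracker = AppTracker()
tracker.track_state("news_feed")   # one state, entered once...
tracker.track_action("like")       # ...with many actions inside it,
tracker.track_action("comment")    # no matter how long the user
tracker.track_action("share")      # scrolls that feed
```

Notice that the three actions all hang off a single state; there is nothing resembling a second page view, which is exactly the shift in mindset the Facebook example illustrates.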

Slide 14:

To drill into this a little further, we definitely want to understand what are the technical components to this. So we know the web has JavaScript and Objective-C or Java for apps, but we get into identification. On the web we have cookies for the most part and sure, users can authenticate, but for the most part, every user is going to get a cookie. And that’s how we know who the user is. On mobile devices, we have an advertiser ID. This is inherent to the device. You launch an application and that application knows an ID about your device. And for many applications, identification goes beyond even that ad ID. For me, most of the applications I’m using have some form of social authentication. I get value using the application if I authenticate with Facebook or Google or Twitter. Or if it’s a game, at least even game center on my iPhone.

Now the developer knows not only who I am on that device, but they can track me across multiple devices, something that on the web side of things we’ve been trying to get to for years and years. Another key difference here is session duration. On the desktop we know we have a 30-minute timeout before the session ends, and that’s been around for years. But now folks are using multiple browsers, which runs into the cookie issue, and they have multiple tabs, which can distract them from the content they’re viewing on your site. On a mobile device, you have one app open at a time. For someone to be away from that application for 30 minutes, or to even be in the application for 30 minutes, is very unlikely. If I’m viewing the app and I jump out to view an email and come back, my train of thought hasn’t been lost; my session doesn’t need to end. But if I’m in the app, I view email, I respond to a bunch of emails, maybe I go for a walk for a minute or two, then come back to the app, I may have lost track of what I was doing, so we’ll go ahead and end the session there. And as for the differences between see and touch: there’s a direct overlap between touch and see on the web, whereas on apps, they’re completely different.
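That timeout logic is simple enough to sketch. The 30-minute figure is the classic desktop web default; the app timeout below is purely illustrative, since mobile SDK defaults vary from seconds to a few minutes depending on the vendor:

```python
# A rough sketch of the session-timeout difference described above.
# WEB_TIMEOUT_SECONDS is the classic desktop web default; the app
# value is illustrative (mobile SDK defaults vary widely by vendor).

WEB_TIMEOUT_SECONDS = 30 * 60   # 30-minute desktop web convention
APP_TIMEOUT_SECONDS = 2 * 60    # illustrative short app timeout

def is_new_session(seconds_since_last_activity, timeout_seconds):
    """Return True when the gap since the last hit should start a new session."""
    return seconds_since_last_activity > timeout_seconds

# Jumping out to answer one email (90 seconds) keeps the app session alive:
print(is_new_session(90, APP_TIMEOUT_SECONDS))    # False
# A walk around the block (10 minutes) ends the app session:
print(is_new_session(600, APP_TIMEOUT_SECONDS))   # True
# On the web, that same 10-minute gap is still the same visit:
print(is_new_session(600, WEB_TIMEOUT_SECONDS))   # False
```

The point is not the exact numbers but the asymmetry: the same gap in activity means very different things in a browser tab versus a foregrounded app.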

Slide 15:

This is an ObservePoint conference, so we definitely want to talk about analytics testing and validation. Something to keep in mind: app validation and testing is not as easy as on the web. On the web, we know we can go to view source and see exactly what’s going on; we can look at a console, mess with a couple of variables, make a couple of JavaScript calls, and figure out what’s happening. On the app side, you have a closed-source application. You’re going to have to become friends with your developers again. They’re going to need to build debug versions of the app or give you special builds. You’re definitely going to want to use a vendor application to do testing, if available.

At Search Discovery, we do a lot of work with Adobe, and one of the most valuable applications that our team uses when we work with clients on mobile apps is the Bloodhound app. We can easily connect the device and proxy the traffic to see the data just as Adobe would. There are more technical solutions like Charles, or even using Xcode or Android Studio; they will get you the information, but they’re not going to give it to you in the user-friendly way that an analytics provider would.

Just a couple of tips on testing. At Search Discovery, we work with numerous clients and we have a couple of painful stories, or painful situations, that we’ve had to live through. On using devices, not simulators: this is a great example, where we were working with a client and their minimum app version was iOS 8. And it’s great to set a minimum so that folks using super old versions won’t have horrible experiences. But when development did their initial analytics testing, they tested against iOS 8, when in reality 90 percent of the users were on iOS 9 or later. When we did the test on what most of the users were running, on physical devices running those OS versions, it wasn’t working. It just wasn’t the same. So you definitely want to make sure that you’re testing on what the most relevant user base is running, and on physical devices. And of course, you have to assume that at some time or another your users are not going to be on the internet, so you definitely want to cover that base.

Slide 16:

A couple things to measure, and we’ll go through these quickly because there is a lot of content and you can all read.

Slide 17:

If we group these by how an app store categorizes apps, we’ll first look at lifestyle and travel. Lifestyle and travel comprises, for example, AAA, Philips Hue, or Zillow. There are a lot of events that you could be tracking. Now obviously you want to know about opens, spend credits, and checkout initiated, but the four that you definitely want to keep an eye on, because they best represent how users are engaging with your applications, are registration, sharing content, inviting other users to the app, and purchasing memberships.

Slide 18:

If we move on to business and finance, that would be your Slacks, your Autodesk applications, your Chase banking app. Once again, open and login are important, you want to understand if people are adding their payment information. Are they sharing documents or are they commenting or communicating with others on Slack? So definitely a couple key components because they’re going to best represent if someone cares about your application and is finding value out of it.

Slide 19:

And finally, we have education and games. For example, we’ve got Duolingo, Minecraft, and Pokémon Go. These are pretty straightforward. Are people achieving the levels you want? Are they getting trapped in specific portions of the application? Should you be providing more tutorials to get them through those levels, or perhaps providing additional credits?

Slide 20:

Then we have our metrics for success. If we’ve tagged our applications, we want to make sure we are doing the ABCs of marketing.

Slide 21:

We want to make sure our acquisition is on point, our behavior is exactly how we want to understand it based upon those events, and then what our conversion points are. If we look at lifestyle and travel: how many days before that first purchase? Or for behavior, are we seeing the appropriate return on ad spend? If we’re spending 100 thousand dollars to acquire those new users, are they actually making purchases in the app or completing conversion events that drive value? And then conversion itself: are we getting those recurring subscriptions? Are specific campaigns from our acquisition efforts more effective or less effective than we wanted?
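The return-on-ad-spend check above is simple arithmetic worth making explicit. The 100-thousand-dollar spend comes from the talk; the revenue figure below is purely illustrative:

```python
# Return on ad spend (ROAS): revenue generated per dollar of ad spend.
# The spend figure matches the talk's example; the revenue number is
# illustrative, not from the talk.

def roas(revenue, ad_spend):
    """Revenue attributed to a campaign divided by what the campaign cost."""
    return revenue / ad_spend

spend = 100_000
revenue = 150_000  # illustrative: in-app purchases from acquired users
print(roas(revenue, spend))  # 1.5, i.e. $1.50 back per $1.00 spent
```

Whether 1.5 is good or terrible depends on your margins and your company's expectations, which is exactly the question the speaker is urging you to ask.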

Slide 22:

When we get into media and entertainment, it’s pretty straightforward; it’s primarily ad driven. If someone’s streaming audio or viewing TV shows or movies, are different advertisements, perhaps cross-promotions, working? Are they effective in driving the users? Are there specific retention points? Do we know that if someone watches 20 minutes of a 30-minute show they’re truly engaged, or does it only take two to three minutes for them to become fully brand and app loyal? And then conversion: is our ad delivery actually driving revenue or value, or is it getting in the way of our users enjoying content? Do we need pre-roll, or should we be looking at paid subscriptions?

Slide 23:

This is perhaps the easiest to associate with: our commerce and retail app metrics. We all know we definitely want to drive more purchases, and we definitely want to make sure that we’re getting new users from the appropriate channels.

Slide 24:

A couple last things to consider as we work through our applications and try to drive change in our business.

Slide 25:

So here’s a metric for you: 20 percent. That’s the percentage of users who only open an app once. We want to make sure that our users are enticed by our application immediately and that we understand what they’re doing in those first couple of minutes so that we can target them as effectively as possible and make them returning users.

Slide 26:

On the point of returning: 60 percent. That’s the likelihood that an app user who doesn’t return within seven days will never return again. You have to be able to use your analytics data to re-engage those users. Simply because they downloaded the app and opened it once doesn’t mean they’re going to come back. You have to be creative about the actions and segments of users who do return, and be able to leverage technology like in-app messaging and push notifications to drive them back. Assuming that your old technology, such as email, is going to be effective at driving them back to the application might not best represent how they want to interact with your brand.
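Measuring that seven-day return is straightforward once sessions are tracked per user. This is a minimal sketch with made-up data; timestamps are expressed as days since each user's first open:

```python
# A minimal way to measure the seven-day return described above: for
# each user, did any session occur within seven days after the first
# open? The user data here is illustrative.

def returned_within_seven_days(session_days):
    """session_days: sorted session times in days since first open."""
    first = session_days[0]
    return any(0 < d - first <= 7 for d in session_days[1:])

users = {
    "a": [0, 2, 9],   # came back on day 2 -> retained
    "b": [0],         # opened once, never returned
    "c": [0, 12],     # returned, but not within seven days
}
retained = sum(returned_within_seven_days(s) for s in users.values())
print(f"{retained}/{len(users)} users returned within seven days")
```

The users who fall into the "b" and "c" buckets are exactly the segments the speaker suggests targeting with in-app messaging and push before the seven-day window closes.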

Slide 27:

And next: 80 percent. 80 percent of the time users spend in applications goes to just their top three apps. What can you do to be one of those three apps? Are you going to be one of the apps where they spend two minutes? Maybe that’s okay for your application, but you definitely want to understand how your business relates and what the expectation is for your users.

Slide 28:

So here are seven questions for you to be asking as you engage with your business around what to do in applications.

Number one: can I integrate app activities across channels? We know from the web world, we definitely want to integrate our CRM data with our web usage. Here we want to make sure that when we look at our users, are we looking at them from the web and app world or are we isolating those into separate channels?

Second: how much does it cost me to acquire my users? Am I flooding Facebook and Pandora with marketing spend and getting nothing in return? If I’m spending 10 thousand dollars and I’m only seeing one thousand dollars in return, is that what your company’s expectations were?

Third: which users have the highest lifetime value over time? This is great going back to how we identify users, because we have a much more consistent way of knowing the who, rather than just a cookie.

Fourth: can I engage with my users in a timely fashion and a highly personalized way? If I know they’ve taken specific actions, if I know where they are, if I know what campaign they came from, I can use that information to deliver the right in-app message or push message.

Slide 29:

Five: which patterns are my best and worst users exhibiting that I can influence? Do we know that our worst users might be using the app but are extremely frustrated, only finding the functional value rather than the content itself? Can we provide specific engagement tactics to understand what is most successful, or least successful, in our application to drive product change and marketing change?

Six: while the market does seem slim for A/B testing in mobile, are we able to use those actions and events that users are taking in the acquisition channels to do some A/B testing? Make it data-driven testing rather than testing for the sake of testing.

And finally, seven: which key triggers and user actions are driving conversion? Do we know those behaviors?

Slide 30:

So just to wrap up, we do want to talk about a couple things that we covered today. How did mobile analytics develop into what it is today? What are the key differences between web and apps? If you remember, there are some semantic sounding changes, but in reality, there are pretty fundamental differences. What should you be measuring? What are the events that are most relevant to your category of application? And what are the metrics for your success? Just because you’re tracking the right things, are you asking the right questions that can help drive your business?

Slide 31:

With that, we do have an opportunity for you to ask questions. Within the webcast, there’s a Q and A section; feel free to ask them there. Additionally, in the pavilion or sponsor section, we will have a booth, so feel free to ask questions there as well. We’ll have plenty of folks from Search Discovery there who are happy to answer any questions. This has been a great event so far. I know some of the earlier keynotes by Rusty Warner and Susan Vertrees were fantastic; a lot of folks were excited about those. There are also a lot of great keynotes later. I know Adam Greco’s presentation is highly anticipated. And of course, Brent Dykes, who a lot of folks know, is always a great presenter. I’m super excited to be able to catch those. With that, I want to thank you for joining the “Hey Siri, What Are App Analytics?” session. Thank you, and I look forward to your questions.
