Michele Kiss - Pairing Analytics with Qualitative Methods to Understand the WHY

November 22, 2016

Slide 1:

I’m really glad to be here at the Analytics Summit. We’ve had some really great speakers today, so hopefully I can keep up the great trend. I’m going to be talking about some qualitative methods you can use to augment your analysis.

Slide 2:

A little bit about me: I’m actually a displaced Australian, currently located in California. I have a couple of fur-babies and a regular human baby that all enjoy their interwebbing as well. I like to think they take after me.

Slide 3:

I’m a Senior Partner at Analytics Demystified. Most of you in the industry probably know some of my partners: Adam Greco, Tim Wilson, etc. And I know some of them have spoken today too.

Slide 4:

What I’m here to talk about is that sometimes, in our really quantified world, we forget a little bit about the why. What lies behind the numbers?

Slide 5:

If you think about a typical analytics report, what it’s really telling you is what happened. Site traffic went up or down, or traffic sources shifted, conversion rate declined, etc.

Slide 6:

And those things do and should lead a good analyst to start thinking: “Why is that taking place?” Sometimes we may try to answer that using our data, but other times we’re going to need to use different methods.

Slide 7:

We ask a lot of questions to understand user behavior. It starts with a simple, “What are people doing?” For that, we might be leveraging web analytics, or social, or mobile analytics. We may use AB and multivariate testing to understand what performs better. And we also want to understand why users do and don’t do certain things. That’s where some of the methods we’re going to talk about today come in: surveys, user testing, session replay, etc.

Slide 8:

I like to look at it as playing a little armchair psychologist to understand the why.

Slide 9:

The plural of anecdote is not data. I’m not saying replace your great digital analytics data with a couple “for instances,” but really what we’re talking about is harnessing multiple methods to get a more complete picture.

Slide 10:

After all, when we’re talking about human behavior, it’s an art and a science combined. These different methods are really going to help you do a better job of piecing together the why.

Slide 11:

What I’m really talking about is how we can use the power of data to understand how big a problem it is, as well as commentary to understand how people feel about it. That’s really what’s going to allow you to tell the difference between a problem that affects a really small percentage of your users but is making them really angry, and one that affects a lot of users but is actually no big deal.

Slide 12:

If you were presenting information, which of these do you think would be more persuasive?

Slide 13:

Same information. 18 percent progress from step one to step two of our signup flow.

Slide 14:

But if we were to present it in this way, with qualitative information layered in, we would see some of the things that are making people angry about what we might be asking on step one. That helps us put that 18 percent in context.

Slide 15:

These methods are pervasive in other fields.

Slide 16:

If you think about the medical field, and the scientific community.

Slide 17:

Yes, they use a ton of quantitative data, but they actually use qualitative methods as well. For example, gathering a patient’s reported history to pair alongside the results of an empirical test.

Slide 18:

Journalists will do this by combining statistics together with quotes.

Slide 19:

The legal field will take ballistics, blood spatter, or DNA as evidence. They have that data, but at the same time, the beginning and the end of each trial is reserved for weaving that qualitative narrative.

Slide 20:

Let’s say you’re sold.

Slide 21:

How can you better sell your findings to others? And how can you use these methods?

Slide 22:

Some of the methods we’re talking about include surveys.

Slide 23:

One possible use of surveys would be to have a survey running alongside your AB test. It doesn’t have to be a specific question about what you’re testing; it can be something much more general. It might be as simple as: what are you hoping to find on this website? Looking at the answers for the control versus the redesign may help us understand why the redesign performed better.
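As a rough sketch of what that comparison could look like once the responses are exported (the file and column names here are hypothetical, and your survey tool’s export will differ):

```python
# Minimal sketch: compare open-ended survey answers between an AB test's
# control and redesign. "survey_responses.csv" and its columns ("variant",
# "answer") are hypothetical stand-ins for your survey tool's export.
from collections import Counter

import pandas as pd

responses = pd.read_csv("survey_responses.csv")

# Tally the most common words per variant as a rough view of what each
# group was hoping to find on the site.
for variant, group in responses.groupby("variant"):
    words = " ".join(group["answer"].dropna().str.lower()).split()
    themes = Counter(word for word in words if len(word) > 3)  # skip short filler words
    print(variant, themes.most_common(10))
```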

Slide 24:

Exit surveys. You can use them to get qualitative feedback as your users leave the site, and perhaps marry that with bounce rate or exit rate data to understand why they were leaving. Most of us have been exposed to these; you kind of have to be living under a rock to never have seen an exit survey.

Slide 25:

Post-conversion surveys. If you’re trying to understand: why do my customers convert and why don’t they? Talking to those who do convert can be a really valuable way to find out that information. What pushed them over the edge to purchase?

Slide 26:

On the flip-side, fall-out surveys. Why didn’t those people purchase? What stopped them from taking that step? Use that data together with your conversion funnel data to identify opportunities for optimization.

Slide 27:

You can also monitor satisfaction scores over time. It might be used for analysis, for example, if you’ve been doing a long-term rebranding exercise, to understand how satisfaction scores changed during that time. Or this may be an additional layer that you add into your ongoing performance reporting just to understand the data that you’re seeing.

Slide 28:

If it’s qualitative feedback that you’re interested in, why not give your users the opportunity to give it to you whenever they want to? There are lots of tools that make it possible for users to give feedback when they choose, rather than you having to bother them with a popup.

Slide 29:

It really comes down to: you can guess all day at why somebody did what they did, but sometimes it’s easier to ask.

Slide 30:

An example of the kinds of things you can ask: Avinash’s “Three Questions” is an option. What is the purpose of your visit? Did you accomplish what you came for? And if not, why not? And you’d be surprised at the valuable insight you can get just from those three simple questions.

Slide 31:

Some other options: would you recommend this to a friend and why? Why not? Did anything frustrate you on your visit today? And if so, getting them to explain why. Or what did you like best or least about your visit?

Slide 32:

Keep in mind, we’re not trying to do this. Nobody wants to fill in a long survey. The goal isn’t to get yet another set of data; the goal is really to get the voice of the customer. I would recommend sticking to a minimal number of questions: you’ll still get that key bit of information, and you’ll probably also get more people completing the survey.

Slide 33:

Your goal, remember, is to get qualitative information. I do recommend leaning towards more open-ended questions that give your customers the chance to speak their mind.

Slide 34:

Another method that analysts can leverage is user testing.

Slide 35:

User testing runs the gamut from formal user testing to informal, remote user testing, and it encompasses a number of different methods. We’re going to talk about a few of those today.

Slide 36:

One possible method is focus groups. In a focus group, researchers gather a group of people, ask questions, and use the group discussion and interactions to understand perspectives on the research question. I got to take part in a focus group when Adobe first acquired Omniture, because they were looking to understand the view of the different brands, the view of the acquisition, and what users like myself thought about it. Keep in mind: we’re not saying to make the focus group your source of truth. It may not be a representative sample, and people may not be as willing to share 100 percent openly as they would in an anonymous survey, but it’s a really valuable tool to complement your digital analytics data.

Slide 37:

Another method would be observation and task completion. This might be what you think of when you typically think of user testing. You can watch users interact with your product, give them a task or a number of tasks, and see what steps they go through to complete them. Think of it as watching your conversion data live. Good studies also keep the experience well balanced, so you don’t have, for example, learning or order effects skewing the results. These can be remote, they can be in person; there’s a wide range of ways this can look.

Slide 38:

If you’ve never gotten to sit in on this type of thing, I would recommend checking out usertesting.com. They have a video that shows you a live remote user test. You can hear the user narrate what he’s doing, and explain what he’s interested in, what he found, and what surprised him.

Slide 39:

An example of how this can really complement analytics data: take a tire company. This tire company has a website where all of their tires are advertised. You search, and it tells you: these are the options for your car. Then you can click to request an appointment, or you can click an add button that takes you to something that looks a little bit like a cart. From an analytics perspective, this would probably look like a pretty successful conversion: somebody searched for a tire, found the tire, and clicked the add button. But some observation and user testing of this experience revealed that people were actually really frustrated by it. They were asking questions like: Where do I put my credit card in? Why can’t I buy this? This ecommerce-looking thing was actually a quote; what you needed to do was print the quote out and go into the tire company in person. Analytics would’ve said this was a successful visit, because the online conversion happened. As much as could happen online, happened online.

Slide 40:

But when you take it and get that extra layer of detail, you find out that the best thing they could do is make it much clearer what this experience was: changing the button to “Get a quote,” and changing the actual quote page to be clearer about what it was representing and what the next steps were.

Slide 41:

Another method, which can be really useful when you’re analyzing content, is card sorting: allowing users to put content into different buckets. They can define the buckets, or you can define them. Let’s say you have some really great content on your site, and you don’t understand why people aren’t reading it, or why it gets such low traffic. Card sorting might reveal that it’s actually not in a logical place for where your users expect to find it. That can help with information architecture and actually help drive your analytics numbers.

Slide 42:

Another option is actual prototype testing. AB testing is great and awesome and allows you to test your entire population, but prototype testing allows you to test concepts before they’ve been fully developed. That might be an early working version, or it might even be something like sketches or wireframes. I participated in some of this prototype testing with a company called Whistle; they make a doggie activity monitor. They were redesigning their app, so they showed screenshots and would ask me, “Okay, what would you do here if this was the next thing you were looking to do in the app?” My feedback and the feedback of others helped them improve their design and find the right calls to action, based on what was more intuitive for the user.

Slide 43:

The real value comes from pairing those together with your digital analytics data. I had a client that was looking to build a lifestyle app, and they recruited a group of beta testers. The beta testers used the app for four weeks and received surveys during that time period. A combination of Google Analytics within the app and their survey data allowed them to understand what people actually did and how that differed from what they said they did in the self-reported survey information. They could understand how people used the app, whether they would continue to use it, and, more generally, how it affected perception of the brand. The coupled data was really powerful, because features that users claimed were important to them actually weren’t used. This allowed them to do some follow-up research and additional user testing to find out whether this was an information architecture problem, or whether people’s behavior simply didn’t reflect what they reported as important.
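A minimal sketch of that kind of pairing, assuming you can export both datasets to CSV; all file and column names here are hypothetical:

```python
# Minimal sketch: pair self-reported feature importance (survey) with actual
# in-app usage (analytics export). File and column names are hypothetical.
import pandas as pd

survey = pd.read_csv("survey_importance.csv")  # columns: feature, pct_saying_important
usage = pd.read_csv("feature_usage.csv")       # columns: feature, pct_users_who_used

paired = survey.merge(usage, on="feature")

# Features users claim matter but rarely touch are candidates for follow-up
# research: is it an information architecture problem, or a say/do gap?
gap = paired[
    (paired["pct_saying_important"] > 50) & (paired["pct_users_who_used"] < 10)
]
print(gap.sort_values("pct_saying_important", ascending=False))
```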

Slide 44:

You might be thinking, “How can I do more of this? How can I get involved?” Obviously, you don’t have the time, and you’re not going to become a fully dedicated UX expert.

Slide 45:

But I would recommend getting involved if you have a UX department in your organization, or maybe an agency that does this. See if you can sit in from time to time and watch some of these user tests take place, or recommend ideas for user testing.

Slide 46:

You have the option to do some remote user testing on your own. That doesn’t mean you have to take over user testing for the entire organization, but having access to a remote user testing tool can help you dig into AB tests or generate ideas for analysis.

Slide 47:

Most of the time, the response that I hear is, “We don’t have funding for that kind of thing. There’s no way I can budget, even if it’s a small amount.”

Slide 48:

One of the things that I do recommend is just trying some DIY user testing. That might mean recruiting your family or your friends for help. Ask them to perform tasks on your site or your app, and take notes or use screen-sharing software to record what they do and analyze it later. If you have a user experience team, I would recommend asking for their help in setting up the tasks and the questions in the best way. And keep in mind the holidays are coming up, so I figure you can let this distract you from any awkward family gatherings.

Slide 49:

You can also have a little bit of fun with it.

Slide 50:

If you’ve never seen drunk user testing, this is an actual thing and it can be incredibly entertaining.

Slide 51:

Because, like I said, holidays are coming up. Just saying.

Slide 52:

Don’t forget, if you are going to be doing some user testing or DIY user testing, it’s not just about getting a bunch of random people. It does matter who you’re talking to and hearing from. If you were concerned about a loyalty app, you’d want to talk to loyal users. If you wanted to know about the initial sign-up process, you’d want first-time users. Just as segmentation is important in analytics, reaching the right segment is important here too. If you design an app that’s intended for retirees, your seventeen-year-old cousin is probably not going to be the best person to give you feedback about usability.

Slide 53:

Another option that’s sort of in-between the two is session replay. I like to think of this as what it might look like if your digital analytics and user testing had a baby. There are lots of vendors in this space, and they’ll either offer the ability to replay and watch individual user sessions, which is incredibly time consuming, or they’ll roll that information up into heat maps. It can be really interesting and helpful to understand how people move through a form, how they view a page, and what they attend to. It goes beyond tables and charts of data to a more live-action view of how people engage with your site, and why you might be seeing the data that you are.

Slide 54:

Another great source of qualitative data is actually your customer service data. Pretty much every company has some kind of customer feedback process. It might be phone calls, it might be an email address, it might be a comment form. Reach out to the team that’s responsible for that, and see if you can review the information that they’re getting. Or if they can give you a heads up when they start to see more of a certain type of comment.

Slide 55:

Another great option, one you can jump into at any point, is social listening. I’m not talking about building a huge social listening program or investing in extensive tools, but just going through and eavesdropping on comments on your brand’s Facebook page, or reading through mentions of your brand on Twitter. This is especially useful if you’ve done something big, like changing your pricing structure or completely changing your app or website. You may find a lot of comments that would help you understand the redesign data that you’re looking at.

Slide 56:

Another common thing in the user experience world is personas. Pretty much every business has a persona of, “These are the types of customers that we have.” One way you can tie quantitative and qualitative together is to work with your user experience team to create analytics segments based on those personas. Probably not everything in the persona is going to be something you can measure in analytics, but the goal is to get as close as you can. For example, take the persona of Claire. She’s health conscious, she’s into fitness and nutrition, and she’s busy, so she likes things that make her life easier.

Slide 57:

We might build a segment in analytics that says, “Her online hours are likely to be after the kids are in bed. She may be more likely to visit nutrition and recipe content. She might log in so that she can save recipes, or she may send them to email, or pin them on Pinterest.” You’re building a view in analytics of what Claire does. This can also be a great way to bring in your UX team and get them interested in analytics, because you’re framing analytics in a way they understand and are really familiar with.
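As a sketch of what a “Claire” segment might look like if you pulled session-level data into Python (the fields are hypothetical; in practice you’d likely define this in your analytics tool’s segment builder):

```python
# Minimal sketch: translate the "Claire" persona into an analytics segment.
# The sessions export and its columns are hypothetical; "logged_in" is
# assumed to be a boolean column.
import pandas as pd

sessions = pd.read_csv("sessions.csv")  # columns: hour, section, logged_in, shared_to

claire = sessions[
    (sessions["hour"] >= 20)                                  # after the kids are in bed
    & (sessions["section"].isin(["nutrition", "recipes"]))    # her likely content
    & (
        sessions["logged_in"]                                 # logs in to save recipes...
        | sessions["shared_to"].isin(["email", "pinterest"])  # ...or sends/pins them
    )
]

print(f"'Claire' sessions: {len(claire)} of {len(sessions)}")
```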

Slide 58:

One last note on trying to understand the user experience: if you’re going to go through the steps and try to put yourself in the shoes of your user, you need to do a good job of that. I have seen lots of instances where analysts will try to understand, “Why does our mobile form perform so badly?” Then they kick open a desktop browser window, masquerade as iOS, and use their keyboard and mouse. That’s really not going to show you the actual problem of big thumbs and little form fields. Staying true to the user experience will better help you gauge it.

Slide 59:

So now, when would you use these types of methods?

Slide 60:

A good time is when you’re starting an analysis, or even when you’re trying to come up with a problem within your organization that’s worthy of a custom analysis. That analysis can then help you figure out how big a problem it is. Then bring in optimization to try and solve it.

Slide 61:

Qualitative measures can definitely be super helpful while you’re running AB tests, because while obviously we want to know what wins, it’s also really important to understand why it wins. Even if a test variation doesn’t perform better, for us to really learn from testing, we need to understand why.

Slide 62:

You may be looking for testing opportunities. Maybe you’ve been running an optimization program for a few years; at the start, ideas were free-flowing, coming from everywhere, and now everybody’s gotten a little stumped. Think about using this qualitative information to inform your next test.

Slide 63:

A redesign. User feedback is critical if you’re undergoing a redesign. You might have ideas, but maybe your users have something that would work better. So use those ideas at the beginning, middle, and end, all the way through that process.

Slide 64:

Keep in mind, with user testing, you’re not limited to testing your own site.

Slide 65:

You’re able to actually test against your competition. You could test their website, or test a similar feature on their site versus your site, and gain some information about your own performance.

Slide 66:

Maybe you’re thinking, “Okay, I’ve done some of this.”

Slide 67:

How do I integrate this into my analysis?

Slide 68:

One example is to use quotes. Quotes can help bring a number to life. While a number may already seem alarming on its own, actually hearing the voice of the customer may help action be taken more quickly.

Slide 69:

You can use word clouds to share themes if people don’t want to read through a lot of customer feedback, but you want to get a little bit of that qualitative insight in there.
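A minimal sketch of generating one, assuming the third-party wordcloud package (pip install wordcloud) and a hypothetical feedback export:

```python
# Minimal sketch: roll free-text feedback up into a word cloud so the themes
# come through without anyone reading every comment. Assumes the third-party
# "wordcloud" package; "feedback.csv" and its "comment" column are hypothetical.
import pandas as pd
from wordcloud import WordCloud

feedback = pd.read_csv("feedback.csv")

cloud = WordCloud(width=800, height=400, background_color="white").generate(
    " ".join(feedback["comment"].dropna())
)
cloud.to_file("feedback_themes.png")
```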

Slide 70:

There are different ways you can chart it. You might be looking at whether the feedback is positive or negative, and maybe taking into account the value of that customer.

Slide 71:

This can also be a way of focusing your efforts, to say, “We’re really just going to present our executives with the negative feedback from our higher-value customers.”

Slide 72:

You can also group the feedback. Understanding what the purpose of somebody’s visit was, and then comparing it to the traffic to that content, may help unearth problems in architecture. Maybe people aren’t finding the content they say is the purpose of their visit, because of poor information architecture.
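One rough way to quantify that mismatch, again with hypothetical file and column names:

```python
# Minimal sketch: compare how often people *say* a content area is the purpose
# of their visit with that area's actual share of traffic. File and column
# names are hypothetical.
import pandas as pd

purpose = pd.read_csv("survey_purpose.csv")  # columns: content_area, pct_of_respondents
traffic = pd.read_csv("pageviews.csv")       # columns: content_area, pct_of_pageviews

merged = purpose.merge(traffic, on="content_area")
merged["demand_vs_traffic"] = merged["pct_of_respondents"] / merged["pct_of_pageviews"]

# Ratios well above 1 flag content people want but apparently aren't finding:
# a possible information architecture problem.
print(merged.sort_values("demand_vs_traffic", ascending=False))
```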

Slide 73:

Your plan of attack.

Slide 74:

A couple of things that you can go out and do. One is to try running a survey alongside your next AB test. Over the holidays, do a simple user test on family and friends for your next analysis topic. And, generally, explore what tools are available to you, and start to build relationships with your UX and customer service teams.

Slide 75:

Keep in mind, I’m not saying qualitative data is better, or that it’s worse than quantitative data.

Slide 76:

It’s just different, and it’s going to give you a different picture. Hopefully, a more well-rounded picture.

Slide 77:

We always need to be mindful of the limitations of our data, so keep in mind that there’s an entire theory around how our behavior changes when we’re being observed. Certain methods, like user testing, may be subject to this. But by weaving the qualitative together with the quantitative, you can mitigate some of those issues and give yourself a better analysis.

Slide 78:

The moral of the story here is that the more data you can use to understand the bigger picture and not be limited by just one tool, the better your analysis is going to be at uncovering the motives behind your users’ actions. And really that’s what analytics is all about.

Slide 79:

Thank you guys so much for joining me. Like I said, my name is Michele, and there are lots of ways to reach me. You can ask a question in the Q&A window, you can always tweet me questions after the fact, or you can post them in the #measure Slack. Join at measure.chat if you’re not a current member.
