It’s now generally accepted that the better the experience on a website or app, the more revenue that website or app will generate. But how can brands put a specific number on this relationship to shed light on exactly how much they stand to gain by improving experiences?
This presentation with Decibel CMO Shane Phair will outline how three global brands conclusively answered this question. Shane will explain how, by harnessing the latest experience analytics technologies, these leading companies quantified digital customer experiences to establish an explicit link with revenue performance, revealing how even small improvements translate into tens of millions in additional online revenue.
By attending this presentation, you will learn:
- The methods used to correlate digital customer experiences with conversion and revenue
- How small improvements to the digital experience can translate into tens of millions in additional online revenue
- Recommendations for how to improve customer experiences and drive conversion
Shane Phair is CMO at Decibel, the digital experience analytics solution. With over ten years' leadership experience in marketing and sales, Shane previously launched the CM Group brand and rebranded several portfolio organizations, including Campaign Monitor, Emma, and Delivra. He holds an MBA from Northwestern University's Kellogg School of Management.
Hello everyone, my name is Shane Phair. Today, I'm going to talk to you about how linking digital customer experience with revenue can predict millions for your brand. Here's just a little background about who I am, and if you haven't heard of us, what we do here at Decibel. So Decibel helps brands increase online sales and loyalty by automatically finding the issues on their websites and apps, which frustrate and repel customers.
As for me, I've been in leadership roles in digital marketing for over 10 years. As Decibel's CMO, I lead global marketing and communications, guiding our brand through international expansion. Prior to Decibel, I launched the CM Group family of brands, including several portfolio organizations: Campaign Monitor, Emma, and Delivra. So, as I mentioned, here at Decibel we enable brands to understand, improve and react to user experiences on their websites and apps so they can radically increase loyalty and sales. You can see a few of our claims to fame on the slide here, but essentially we're about measuring the quality of customer experience online and reporting that back to teams in easy-to-understand, compelling ways.
We provide that insight to some of the biggest brands in the world. We are the only experience analytics technology built specifically for enterprise. So now you know a bit about who I am and what we do here at Decibel, today I want to talk a bit about the digital experience you offer through your website or app and how correlating it to revenue can be a very powerful way to find and prioritize opportunities for improvement. We've all known anecdotally for years that better online customer experiences translate to more revenue. When customers have a good experience online, they spend more and tend to be more loyal. So why do online customer experience initiatives still struggle to get sign-off? Usually it's because the relationship between experience and revenue isn't explicitly clear. There's often no tangible, universal number for digital teams to reference and say, "Hey, look, if we improve digital experiences by X, then we'll generate Y in additional benefit." This can lead to a certain fuzziness or uneasiness around projects that are solely geared towards improving digital customer experience. It seems like a big investment of time and resource, and it's potentially risky when there's no clear return.
Historically, this struggle has been rooted in teams not having insight into the quality of digital experiences at scale. Web analytics reveal what happens on websites and apps at a large scale, but they don't tell you why it happened. They're not designed to analyze and report on the quality of the actual user experience. Voice-of-customer tools and NPS scores give customers an opportunity to tell you what their experience was like or why they behaved the way they did, but this is at a small scale and is sometimes unreliable. If teams can't even reliably quantify how good experiences currently are on their website or app, they have no hope of correlating those experiences with revenue opportunity. So here at Decibel, we set out to fill this knowledge gap by building technology that quantified and revealed the quality of digital experiences at scale.
We did this by pioneering a brand new metric, the Digital Experience Score, or DXS. DXS scores every single user experience that occurs on a website or app from zero to 10. And it does so automatically, across all audience segments, without having to ask customers a single question. Here's how it works. First up, as soon as it is deployed on a website or app, Decibel captures every moment of interaction within the user session, from mouse movements, hovers and touches to scrolling, device rotations and pinches, all the while associating these events with the underlying context of the content being viewed. This raw experience data is processed into smart experience metrics based on categories including distance, velocity, movement, focus and hesitation.
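To make that processing step concrete, here is a minimal sketch of how raw pointer samples could be reduced to distance, velocity and hesitation metrics. The event format, thresholds and metric definitions are illustrative assumptions, not Decibel's actual implementation.

```python
from math import hypot

def experience_metrics(events):
    """Reduce (timestamp_s, x, y) pointer samples to simple experience metrics.

    Hypothetical sketch: distance is total pointer travel in pixels,
    velocity is distance over session duration, and hesitation counts
    time spent with (near-)zero movement between samples.
    """
    distance = 0.0
    hesitation = 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(events, events[1:]):
        step = hypot(x1 - x0, y1 - y0)
        distance += step
        if step < 2.0:  # barely moved between samples -> counts as hesitation
            hesitation += t1 - t0
    duration = events[-1][0] - events[0][0] if len(events) > 1 else 0.0
    velocity = distance / duration if duration else 0.0
    return {"distance": distance, "velocity": velocity, "hesitation": hesitation}

# Four samples over 0.3 seconds, with one near-motionless interval
sample = [(0.0, 0, 0), (0.1, 30, 40), (0.2, 30, 41), (0.3, 90, 121)]
print(experience_metrics(sample))
```

A real pipeline would of course normalize for viewport size and content context; the point is simply that raw interaction events can be rolled up into quantitative per-session metrics.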
Moving a level up, the enriched data feeds our behavior detection algorithms, where Decibel automatically categorizes specific user behaviors that in turn indicate the visitor's state of mind. For example, these behaviors include multi-clicks, where a user rapidly clicks or taps on a typically broken link or button; this is a sign of frustration. We also have the bird's nest behavior, where the user erratically moves their mouse around the page, unable to find what they're looking for; again, this indicates frustration. Then you also have more positive behaviors like scroll engagement, where users scroll down the page in a smooth, regular pattern, signifying the consumption of content and thus a good sign of engagement, as well as reading behavior, where users actually follow the on-screen content with their mouse. We have many more behaviors of this kind.
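As a sketch of what one of these detectors might look like, here is a minimal multi-click check: flag a burst of rapid clicks on the same element as a frustration signal. The window and threshold values are illustrative assumptions, not Decibel's production parameters.

```python
def detect_multi_clicks(click_times, window=1.0, threshold=3):
    """Return True if `threshold` or more clicks on the same element
    fall inside any `window`-second span (a common frustration signal).

    click_times: sorted click timestamps in seconds.
    """
    for i in range(len(click_times) - threshold + 1):
        # If the k-th click after this one landed within the window,
        # the user was hammering the element.
        if click_times[i + threshold - 1] - click_times[i] <= window:
            return True
    return False

print(detect_multi_clicks([0.0, 0.2, 0.5, 4.0]))  # three clicks within 0.5s -> frustration
print(detect_multi_clicks([0.0, 2.0, 4.0]))       # evenly spaced clicks -> normal
```

Other behaviors (bird's nest, scroll engagement, reading) would follow the same pattern: a rule or model over the per-session event stream that emits a labeled behavior.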
Finally, with DXS there's no longer a need to maintain custom KPIs or rely on a single subjective score, like NPS, to track the quality of experience. Instead you have a score that's powered by machine learning, offers a quantifiable measurement of every customer experience across websites and apps, and provides a KPI by which digital teams can benchmark and measure improvements or declines in the customer experience. And the best thing is we reverse engineer all this machine learning and visualize all the calculations to reveal exactly what contributed to the DXS, be it poorly designed navigation or a frustratingly slow experience. We provide teams with a checklist of actions and experience issues they can then use to investigate and improve their DXS.
With DXS we have our first piece of the puzzle. We're able to quantify digital experiences across websites and apps at scale and assign them a quality score between zero and 10 that everyone can immediately understand. So how can we move on to the second stage: tying the score that denotes the quality of the experience offered by a website or app to an actual revenue number? For this, our data scientists have performed a number of validation studies with some of the world's biggest companies to confirm the correlation between DXS, which again is the quality of a user's experience, and the revenue goals associated with those experiences.
We've done these validation studies with multiple brands across multiple industries, and here are the results from TUI, for example, the world's biggest travel company. After analyzing millions of user sessions, our data scientists found that just a one-point increase in the average site-wide DXS would equate to a $30 million increase in annual revenue. Equipping TUI with the tools to not only quantify experiences but also link them to revenue enabled them to make a lot of headway internally. When it came to signing off on initiatives designed to improve customer experiences, they could say: look, here's where the quality of customer experience currently sits. Here's our checklist of actions that will improve our DXS. And if we improve our DXS by one point site-wide, we know we'll potentially increase revenue by an additional $30 million per year. This kind of precision was incredibly valuable for the TUI team when it came to improving their websites and apps.
Next, we did this validation study with one of the largest hotel chains in the world, and there was even more opportunity to increase revenue by improving experiences. Our data scientists found that just a 0.5-point increase in DXS equated to a 45% increase in the probability of a user converting, which in turn could drive an additional $20 million in monthly revenue.
The validation study I'd like to go through in a bit more detail today is this one with British retail giant River Island. Our data scientists found that improving DXS by one point correlated with an increase in conversion probability of 43%, which in turn could potentially drive an estimated $4.2 million in additional monthly revenue. And so now I'm going to take you through the steps of the study to show you how we statistically linked digital experience and revenue, and why doing so was so powerful for the River Island team. It not only gave them a clear correlation between digital experience and revenue, but also showed them what the experience issues were in their checkout process, with clear recommendations for rectifying them to improve conversion and revenue.
So the goal of the project with River Island was to understand if DXS directly correlated with achieving certain goals, which represent a conversion funnel on the River Island website. The stages of that conversion funnel were threefold: add to shopping cart, review and pay, and confirm order. In addition, we explored how browsing on different devices, whether desktop, tablet or mobile, affects the digital experience and a customer's likelihood to convert.
Here's a quick summary of the study's methodology. Our data scientists analyzed around 1 million unique sessions over a 10-day period. Only sessions with four or more page views were considered as part of the study. Most sessions were on mobile devices, followed by desktop and tablet. And out of all sessions, around 22% of visitors initiated a purchase by adding items to their shopping cart. Here's the key analysis of what our data scientists found in relation to the Digital Experience Score, its individual experience pillars, and conversion goals.
The results of the study proved that the Digital Experience Score increases as customers progress through the checkout funnel. In other words, the better the experience a customer is having, as measured by DXS, the more likely they are to continue their journey. Customers who confirmed their order showed the highest DXS, while the ones who only added to cart showed the lowest DXS values. It wasn't just the overall Digital Experience Score that showed correlation with conversion goals; the individual pillars of digital experience did as well. Here, we can see the Engagement Score also increases as customers progress through the funnel. Customers who confirmed orders showed the highest engagement; conversely, customers who did not initiate a purchase showed the lowest. For the Technical Experience Score, the biggest difference in distribution lies at high values. On average, customers who reached the last two steps of the checkout funnel show a slightly lower average technical score compared to customers who did not start the checkout. This was mainly influenced by high page rendering and loading times on River Island's checkout, a key experience issue surfaced by the study which we'll discuss shortly. Most forms are present during the checkout process, which results in a broad variation in the Form Experience Score. The Form Experience Score increases as customers progress from reviewing their order and paying to confirming the order.
Now let's move on to the analysis of the relationship between DXS, the Engagement Score, and conversion probability. The results of the study proved a positive correlation between River Island's DXS and the achievement of conversion goals. The proportion of sessions where customers added to their shopping carts, reviewed and confirmed their order grows as DXS increases. There was also a strong correlation between adding to the shopping cart and the Engagement Score. As engagement grows, the probability of purchasing increases. In terms of the conversion probability for the River Island website: when the quality of the experience, denoted by DXS, is around 5 out of 10, the likelihood of a user converting stands at 0.8%. An increase of the DXS to around 7 results in a 2.5x higher probability to convert, at 2.8%. When the DXS is around 10, there is a 6.5% probability to convert. Conversely, sessions with a DXS of less than 4 align with near-zero conversion probability.
In other words, if the quality of the experience is beneath a 4 out of 10 as scored by DXS, it is extremely unlikely a user will convert on River Island's website. With these numbers in mind, if we take into consideration River Island's average order value of $55 and traffic of 6.7 million sessions per month, an improvement of the DXS by one point would increase the conversion probability by 43% on average, from 2.8% to 4%. This means an estimated monthly increase of $4.2 million in revenue. So that right there is the power of linking experience to revenue. The River Island team now has a clear number in mind when establishing the value of improving digital experience on their websites and apps.
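The arithmetic behind that estimate can be checked back-of-the-envelope. The input figures below are the ones quoted in the study; the formula itself is a simplification, and rounding in the quoted conversion rates explains why it lands slightly above the study's $4.2 million figure.

```python
# Back-of-the-envelope version of the River Island revenue estimate.
# Figures are the ones quoted in the study; the formula is a simple sketch.
sessions_per_month = 6_700_000
avg_order_value = 55.0      # dollars
conv_before = 0.028         # conversion probability at the current average DXS
conv_after = 0.040          # conversion probability after a one-point DXS improvement

# Relative lift in conversion probability (~43%, as reported)
lift = (conv_after - conv_before) / conv_before

# Extra orders per month times average order value
extra_revenue = sessions_per_month * (conv_after - conv_before) * avg_order_value

print(f"conversion lift: {lift:.0%}")
print(f"extra monthly revenue: ${extra_revenue:,.0f}")
```

Running this gives a lift of 43% and roughly $4.4 million per month, consistent with the ~$4.2 million the study reports once the underlying unrounded rates are used.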
As I mentioned earlier in the study, we also looked at how DXS and conversion played out across different devices. The analysis revealed that the percentage of customers who reached the "confirm order" step is higher on desktop than mobile or tablet. However, the proportion of visitors who added to their shopping carts and proceeded to payment remains very similar across all devices. The similarity of the DXS trend for "add to cart" and "review and pay" across all devices suggests that customers have similar experiences when browsing and initiating the checkout, regardless of the device type. However, there is a higher percentage of sessions which result in a purchase on desktop, especially when the DXS is higher, and users on mobile devices are less likely to complete the checkout process, even though that segment comprises the most traffic.
Finally, out of our correlation analysis we were able to pinpoint the exact experience issues that were causing a reduction in DXS throughout the River Island site. Customers who dropped off after adding to their shopping cart showed a higher percentage of engagement-related issues on product listing pages and product detail pages. Engagement experience issues included the following: low scroll reach, low focus time, low interaction time, low engagement with text, and low engagement with images. What's more, customers who did not proceed to "confirm order" showed a significantly higher percentage of form abandonment on the checkout pages, often due to technical experience issues; more than 90% of customers who dropped off at this step did not finish filling in a form.
Having established these experience issues, we then went on to provide recommendations for what River Island could do next. You can see these recommendations on the slide now. The key takeaway is that by crunching all of these user sessions with DXS, then reverse engineering them to see exactly what drove poor DXS scores, we were able to pinpoint exactly how River Island can improve website experience issues and demonstrate, thanks to the correlation analysis, how this would move the needle in terms of revenue.
So to recap, the study we embarked upon with River Island to link digital customer experience and revenue proved a positive correlation between DXS and the achievement of conversion goals. Results confirmed that DXS increases as customers progress through the checkout funnel, with the highest DXS values aligned to the segment of customers who converted. There is also a strong correlation between customers adding to their shopping cart and their Engagement Scores: as engagement grows, the probability of purchasing increases. Improving River Island's DXS by one point results in a 43% increase in the probability of conversion, which in turn could drive an additional $4.2 million in monthly revenue. As Robert Brown, Head of Digital Practice at River Island, said, "As a business focused on delivering fantastic online experiences, we looked at DXS to provide an objective measure of those experiences. We were delighted to see its power validated through this study." You can find the full write-ups of not just this River Island study but also the earlier validation studies I mentioned on Decibel.com.
So, why do customer experience initiatives on websites and apps still struggle to get sign-off? Well, usually, as we've said, it's because this tie to revenue isn't clear. Up until now, it hasn't been something you can put a metric on, a number on; it's been based on belief rather than empirical evidence. And so it's been critical to move beyond anecdotal insights to concrete, empirical, mathematical conclusions. Decibel is closing this knowledge gap and allowing teams to put actual numbers behind their predictions. This massively helps them achieve sign-off on user experience projects, but it also gives them a clear pathway to improving experiences. They know that by optimizing DXS, following the guidance of Decibel's technology, they can easily optimize conversion and revenue.
As a quick summary of what I've spoken about today and the process we looked at in detail with River Island, here's the link between digital customer experience and revenue. We start by quantifying the quality of digital experiences with DXS. This is, of course, a crucial step, as it puts an actual number, an actual metric, to digital experience that can be used in later analysis. We then correlate that analysis to conversion behavior: what's the current average Digital Experience Score, what's the current average conversion rate, and what's the relationship between the two? Based on this, you can then simulate what an improvement to the average DXS would do to conversion. So if DXS improved by one, how much would your conversion rate improve? We then take that percentage conversion increase and multiply it by your average order value and traffic. This is how you get your potential revenue number, which becomes a vital tool internally to get sign-off on experience initiatives. Finally, we reverse engineer DXS for a clear path to improving digital experiences to get more conversion and revenue. The beauty of Decibel's technology and DXS is that the same analysis that gives you an ROI value also gives you clear next steps for how you can achieve that ROI.
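The steps above can be chained into a single calculation. This is a hypothetical sketch: the linear "lift per DXS point" relationship is a simplifying assumption for illustration (a real study would fit the actual DXS-to-conversion curve), and all parameter values in the example are made up.

```python
def projected_monthly_revenue_gain(conv_rate, conv_lift_per_dxs_point,
                                   dxs_improvement, avg_order_value,
                                   monthly_sessions):
    """Chain the summary steps: simulate a DXS improvement, turn it into a
    conversion-rate lift, then multiply by order value and traffic.

    Assumes a linear relative lift per DXS point, which is an illustrative
    simplification rather than a fitted model.
    """
    new_conv_rate = conv_rate * (1 + conv_lift_per_dxs_point * dxs_improvement)
    extra_conversions = monthly_sessions * (new_conv_rate - conv_rate)
    return extra_conversions * avg_order_value

# Hypothetical inputs: 2% conversion rate, 40% relative lift per DXS point,
# a one-point improvement, a $50 average order, 1M sessions/month.
gain = projected_monthly_revenue_gain(0.02, 0.40, 1.0, 50.0, 1_000_000)
print(f"projected monthly revenue gain: ${gain:,.0f}")
```

The useful property is that each input maps to one of the steps in the summary, so the same calculation that justifies the investment also tells you which lever (score, lift, order value, traffic) matters most.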
This brings me to the end of the presentation. Thank you very much for your time, and thank you to ObservePoint for having me and Decibel here today. As mentioned, the full write-ups of the studies featured in this presentation about linking digital experience to revenue, as well as more information about the Digital Experience Score, can be found on Decibel.com.
For now, I think we have time to open up for some questions. I can see a couple come through. "Does the Digital Experience Score provide a score for every session?" The short answer is yes, it does. Obviously, when you have millions of sessions a month, an individual score for every single session might be a bit overwhelming for analysis, so that's why the score also rolls up into audience segments, pages, funnels, things like that, and even an average score for the entire website or app. But the short answer is yes, we do provide a score for every session.
Next one is, "how can I convince my bosses we need this kind of technology?" Good question. As a CMO, I can say I always judge technology by its potential impact on revenue, its impact on the bottom line. So will it save us money? Will it save us time? Will it generate more revenue? I would suggest leading with this kind of business case when suggesting a technology, focusing on its potential return on investment rather than just how it would make your day to day easier. That's important as well, but when making the business case I would really focus on the tie to revenue. And I think with Decibel specifically, DXS and everything we do is tied to the notion of matching experience issues with revenue; that's what we're all about. I think the business case with this type of technology is very clear. We know that a great experience drives sales and drives customer loyalty; that much is scientifically evident.
Next one, "any advice for linking digital experience and revenue when you don't have access to something like Decibel?" Good question again. I think the most important thing is finding a reliable way to quantify experiences across your site. Once you have a measure of the current quality of experiences, you'll be able to tie it to conversion rate and work out how changes in experience would impact the revenue number. Of course, Decibel makes life easier by doing all of that for you with the Digital Experience Score, but without it, I would say the key is to find some other way to quantify experiences and link that quantification to revenue.
I think that's all the questions we have time for, so for any more details about anything we've mentioned today, head to Decibel.com or, of course, stop by our booth here today. Thanks again for joining, and thanks again to ObservePoint for having me. Appreciate it.