2020 Digital Analytics and Governance Report

January 22, 2020

In today’s business climate, digital analytics is more important than ever. To stay competitive and deliver amazing experiences to your customers, you need to be a data-driven business.

But what does a data-driven business look like? What are the current roles and team configurations in the digital analytics space? What analytics solutions are companies using to improve their use of data? How much do companies trust their current digital analytics practices and what can they do to improve those practices?

ObservePoint set out to answer these questions and more in a recent survey, which drew more than 700 responses from industry professionals in a wide variety of digital analytics-related roles.

Continue reading below to discover the findings.

Who We Surveyed

Respondents had various roles related to digital analytics. Most respondents self-identified as digital marketers, making up 26% of the total. Digital marketers were followed by digital analysts at 20% of respondents. The remaining 54% had various other analytics-related roles or did not specify their roles. Below are respondents’ job titles and a breakdown of roles.

[Chart: What is your current job?]

Roles and Teams

In addition to their individual roles, the survey respondents were asked to indicate their seniority level and team name to help clarify the audience behind the survey data.

For those who responded, self-identified digital marketers were slightly more likely to be managers/supervisors while digital analysts were slightly more likely to be individual contributors.

Team names followed the same trend as individual roles: the majority of our respondents were members of either an analytics or a marketing team.

[Chart: What is the name of your immediate team?]

Company Profile

While our survey pool included respondents from companies of all sizes, nearly 40% of our respondents were from large companies with more than 2,500 employees.

The industries of our respondents varied greatly but were most concentrated in industries related to communication technology, consumer products, and marketing.

Misaligned Perceptions of Analytics Data Accuracy

Misalignment between different teams concerning digital analytics data accuracy can cause ongoing issues in an organization. When teams perceive data accuracy differently, united decision-making can become difficult.

The survey’s results offered a glimpse into these differing perceptions about data accuracy, as shown in the following graphs.

Overall, most respondents rated their data as roughly 70% accurate. While 100% accuracy is impossible to achieve, a median of 71-80% accuracy is a relatively low standard to clear and suggests there is substantial room for improvement.

However, comparing the responses of digital analysts and digital marketers reveals an interesting story.

The graph of digital analysts (who are likely much closer to a company’s data) has more variation, suggesting that analysts are more skeptical of their company’s data accuracy. The digital marketer graph, on the other hand, is more closely grouped and higher on the scale, indicating a firmer and more consolidated view of data accuracy.

The difference in results seems to indicate misalignment regarding data accuracy and practices. Since analysts are closer to data and therefore likely have a better grasp on data accuracy, one potential explanation of their more pessimistic view is that they see data inaccuracies that end users of data further down the line (like marketers) aren’t aware of.

Whether real or perceived, these inaccuracies could affect the analyst’s ability to confidently make strategic recommendations to other teams based on data.

How Testing Frequency Influences Perceived Data Accuracy

One of the objectives of the survey was to see how accurate companies considered their data to be based on how often they tested the analytics tracking of their critical user experiences. The data suggests that the more often companies test their analytics tracking, the more they trust their data.

While increasing how often a company tests data accuracy seems like a straightforward way to increase trust in data, increasing the frequency takes time and resources, especially when attempted manually. As a result, companies should carefully consider the resource trade-off associated with more frequent testing of their analytics implementation. A key component of deciding how frequently to test is whether to use automation to free up time and resources while still maintaining more accurate data.

[Chart: How the frequency of analytics verification affects perceived data quality]

In addition to examining perceptions of data accuracy in relation to how often companies verify their analytics tracking, the survey also enabled an investigation of perceived data accuracy based on how many hours per week companies spend validating data collection or cleansing data. A quick visual inspection of the data suggests that, at least up to a point, the more time companies spend verifying their analytics tracking, the more they trust their data.

The accompanying graphs show a steady trend of increased trust in data as hours per week spent validating or cleansing data went up. This supports the previous analysis: more accurate data requires more time and resources for testing, a trade-off that should weigh heavily in decisions about how to improve data accuracy.

How Automated Testing Affects Perceived Data Accuracy and Time Spent Testing

Automated testing for analytics data involves using software to run automated scans on your analytics implementation to check for errors, such as missing tags, missing variable data, incorrect formatting of variable data, and more. These tests help you ensure the integrity of your implementation by locating tagging errors and validating your data to ensure accurate data collection. In the survey, we referred to automated testing as “running scheduled, automated scans” and “triggering scans programmatically after each release.”
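To make the idea concrete, a scheduled scan can be as simple as fetching each critical page and checking it for the tag signatures you expect. The sketch below is a minimal, hypothetical illustration, not ObservePoint's scan engine: the page URLs and signature patterns are placeholders, and because it only inspects server-rendered HTML, tags injected at runtime by a tag manager would require a headless browser instead.

```python
# Minimal sketch of an automated tag-presence scan. Illustrative only:
# the page URLs and tag signatures below are hypothetical placeholders.
import re
import urllib.request

# Critical pages whose analytics tracking should be verified on a schedule.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/checkout",
]

# Rough signatures for tags expected on every page.
EXPECTED_TAGS = {
    "Google Analytics": re.compile(r"googletagmanager\.com|google-analytics\.com"),
    "Adobe Analytics": re.compile(r"AppMeasurement|s_code\.js"),
}

def scan(url):
    """Fetch a page and return the names of any expected tags not found."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    return [name for name, pattern in EXPECTED_TAGS.items() if not pattern.search(html)]

if __name__ == "__main__":
    for page in PAGES:
        missing = scan(page)
        print(f"{page} -> " + ("OK" if not missing else "MISSING: " + ", ".join(missing)))
```

Run on a cron schedule or triggered from a CI pipeline after each release, even a basic check like this can catch a missing tag before it silently costs days of data.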

An interesting insight from the survey data showed that there was not a noticeable difference in perceived data accuracy for the respondents who use automated testing. This insight could lead to the premature belief that automated testing may not impact data accuracy, but there’s likely more to the story.

When digging a bit more into the data, it is apparent that those who engage in some degree of automated testing actually spend more time testing or cleansing data (displayed in graphs below). This insight could indicate that the perceived data accuracy (displayed in graphs above) didn’t change with the use of automation because those who use test automation have greater knowledge of their true data accuracy and may be noticing that their data problems are more extensive than previously expected.

Analytics Maturity Testing

As websites grow, their associated analytics implementations become increasingly complex, which makes obtaining accurate customer insights a challenging endeavor. The above analyses addressed how the time and frequency companies dedicate to testing affect perceptions of data quality. But how exactly are companies carrying out that testing?

Analytics testing is done manually or through the use of automated testing solutions. Manual testing requires human effort to inspect and correct analytics errors by combing through code on individual web pages, which is cumbersome, time-consuming, and prone to human error. On the other hand, using automated testing solutions helps users test their analytics solutions with the help of software, which frees up time and resources that would have been spent manually checking analytics code.

In terms of maturity, 75% of survey respondents indicated their analytics testing is somewhat mature (45%) to mature (30%), which means most companies in our data pool are working to shift their analytics testing from manual processes to more systematic ones.

A company with an immature analytics testing strategy likely tests their analytics implementation inconsistently and does so manually by combing through pages of code with human effort. A company with a fully mature analytics strategy uses automation to test their analytics implementations on a release schedule. Ideally, a fully mature analytics testing strategy frees up manpower and resources so team members can focus on moving their company forward, instead of worrying about data accuracy. Or, alternatively, an automated solution allows the company to address more data accuracy issues than before, as suggested by the previous analysis.

Validating Data Collection for User Experiences

There’s a lot of buzz in the analytics industry about becoming more insights-driven to deliver more effective customer experiences. As a result, part of the survey focused on finding out what companies deem to be the most critical user experiences on their websites.

While critical user experiences vary depending on the nature of individual businesses, the data suggests the majority of critical user experiences are, unsurprisingly, those directly tied to revenue, such as making a direct purchase of a product or contacting a company. While prioritizing revenue-based experiences may seem an obvious pursuit, this focus on revenue is a clear indicator of where firms spend their time and resources.

In addition to finding out which user experiences were most heavily prioritized, the survey also asked about the robustness of web analytics tracking for those user experiences and how that analytics tracking is verified. The following charts provide insight into our respondents’ methods for maintaining analytics tracking on their most important user experiences.

Testing in a Staging Environment

Another objective of the survey was to gain insights about whether companies test their data collection in a staging environment. A staging environment exists on a private server and is where you prepare your site or app to be seen by the public—it is essentially a testing ground for your website before you make changes go live.
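As a simple illustration of how such a gate might look, the hypothetical sketch below runs the same kind of tag check against a staging host before a release is promoted. The host names, paths, and the tag signature are all placeholder assumptions.

```python
# Hypothetical release gate: verify tags on staging before deploying.
# Host names, paths, and the tag signature are placeholder assumptions.
import urllib.request

STAGING_BASE = "https://staging.example.com"
CRITICAL_PATHS = ["/", "/checkout", "/contact"]
TAG_SIGNATURE = "googletagmanager.com"  # loader host for the expected tag

def has_tag(url):
    """Return True if the expected tag signature appears in the page HTML."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    return TAG_SIGNATURE in html

if __name__ == "__main__":
    failures = [p for p in CRITICAL_PATHS if not has_tag(STAGING_BASE + p)]
    if failures:
        print("Blocking release; tag missing on staging pages:", ", ".join(failures))
    else:
        print("Staging tags verified; safe to promote to production.")
```

Wiring a check like this into a deployment pipeline means a broken tag fails the build on staging instead of corrupting production data.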

Companies who use a staging environment to test their analytics tech are more likely to catch analytics errors before those errors are deployed to a production environment, where those errors can negatively impact customers and their own data.

The survey results indicated almost 74% of companies who responded use a staging environment for testing, and for those companies who don’t use a staging environment (26%), we found the primary reasons to be as follows:

  • They have never considered deploying and testing tags in a staging environment (45%).
  • They don’t have a staging environment (35%).

The 26% of respondents who don't test their analytics implementation in a staging environment are at greater risk of negatively impacting their customers or their data by publishing analytics errors to their live website.

Furthermore, the fact that 26% of respondents don't use a staging environment, combined with the fact that 45% of those respondents have never even considered using one, shows how much room these companies have to improve their digital analytics.

Staging Environments and Perceived Analytics Testing Maturity

The use of staging environments appeared to have a strong relationship with how mature companies considered their analytics testing. Those who used staging environments considered their analytics testing to be much more mature than the companies who didn’t use a staging environment.

Using a staging environment for analytics testing requires additional time and resources, but companies that make that investment feel more mature in their analytics processes.

Staging Environments and Perceived Data Accuracy

The use of staging environments also appears to be connected with how accurate companies consider their analytics data. The data suggests that companies who use staging environments to test digital analytics are much more likely to consider their analytics data to be more accurate.

Again, this data underscores the trade-off between data accuracy and the time and resources required to achieve it.

Digital Analytics Solutions and Tech

Digital analytics solutions abound, and a key focus of the survey was to find out which analytics solutions respondents primarily use. Google Analytics (57%) and Adobe Analytics (35%) are the most common analytics solutions in our survey demographic.


When analyzing analytics solutions by company size, the survey results indicate that smaller companies tend to use Google, while larger companies gravitate toward Adobe. One plausible reading of the graph below is that larger companies want a more robust solution allowing for more customization and choose Adobe as a result.

Digital Analytics Solutions and Testing Maturity

Another goal of the survey was to see how specific analytics solutions relate to a company's perceived analytics testing maturity. While it may seem obvious, the graphs below show that companies using a more robust (and more expensive) analytics solution consider themselves to have more mature analytics testing, which leads us to believe that companies who invest more in their analytics solutions also prioritize their analytics testing more than others. An alternative interpretation is that companies associate cost with quality: having purchased a premium solution, they perceive their data as more accurate.

Tag Management Systems

In terms of tag management systems (TMS), the survey results indicate that Google Tag Manager (GTM) is the most popular TMS, used by nearly half of our respondents. GTM (49%) is followed by Adobe Launch (18%), Adobe DTM (17%), Tealium (9%), and various others (7%).


Some respondents indicated they were using multiple TMSs on their site, which can be problematic. Utilizing multiple TMSs on a site is inefficient, because you have more to manage and keep up with than you would with a single TMS. Additionally, using multiple TMSs can increase your site load times. Each TMS has its own way of loading tags, and as a result, combining two TMSs requires more bandwidth than a single TMS. One TMS loading 10 tags is more efficient than two TMSs each loading 5.
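One way to audit for this is to look for multiple TMS loader scripts on a page. The sketch below is a rough, hypothetical check: the loader-host signatures reflect commonly seen CDN hosts but should be treated as assumptions and adjusted to your own setup.

```python
# Hypothetical check for pages loading more than one tag management system.
# The loader-host signatures are assumptions; verify them for your own TMSs.
import urllib.request

TMS_SIGNATURES = {
    "Google Tag Manager": "googletagmanager.com/gtm.js",
    "Adobe Launch/DTM": "assets.adobedtm.com",
    "Tealium iQ": "tags.tiqcdn.com",
}

def detect_tms(url):
    """Return the names of every TMS whose loader appears in the page HTML."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    return [name for name, sig in TMS_SIGNATURES.items() if sig in html]

if __name__ == "__main__":
    found = detect_tms("https://www.example.com/")
    if len(found) > 1:
        print("Multiple TMSs detected:", ", ".join(found))
    else:
        print("TMS detected:", found[0] if found else "none")
```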

Migrating from Adobe DTM to Adobe Launch

For those still using DTM

With the sunsetting of Adobe DTM approaching, a portion of the survey focused on companies' plans to migrate from Adobe DTM to Adobe Launch. 67% of respondents using Adobe DTM said they would migrate to Adobe Launch, 31% said they weren't sure about the details of their migration, and the remaining 2% said they would migrate to various other solutions.

For those companies planning to migrate from DTM to Launch, the survey indicates the majority (67%) plan on spending roughly 1-6 months making the switch. If you haven't started your migration yet, this 1-6 month projection is a useful benchmark, but be sure to factor in the time and resources you will need and adjust the projection accordingly.

For those using Launch

Those estimates appear largely accurate: 69% of the respondents already using Adobe Launch reported a migration time of 1-6 months, while the remaining 31% reported 6 months or more. The companies who took 6 months or more likely had limited resources to dedicate to the switch, though this is only speculation.

Looking Forward to 2020

The world of digital analytics and analytics testing is primed for growth in 2020, especially since only 5% of companies indicated their analytics testing is “very mature.” As a result, we estimate many companies will gain an edge on their competition by optimizing their digital analytics strategy.

Additionally, a key theme emerged from the survey: the trade-off between time and resources on one hand and data accuracy and analytics testing maturity on the other. Obtaining more accurate data and implementing a more mature analytics testing strategy will undoubtedly take more time and resources than companies are currently investing.

The time and resources necessary to improve data accuracy will need to go toward using more robust testing technology and increasing a company’s ability to test their analytics implementation. Increasing this analytics testing ability will likely require companies to more fully adopt automated testing solutions and the use of staging environments.

Luckily, the use of automation can help companies free up a portion of the time and resources associated with pursuing more accurate data (although purchasing or creating an automated analytics testing solution will have its own trade-offs in terms of time and resources). The saved time and resources associated with using automation will enable analytics stakeholders to focus more heavily on furthering higher priority business initiatives while still maintaining accurate data.

Overall, the future looks bright for companies embracing a more robust analytics strategy in 2020, but companies need to be mindful of the aforementioned trade-offs as they work to improve customer experiences through more accurate data.

About ObservePoint

ObservePoint helps analytics teams focus on delivering amazing experiences to their customers and moving their businesses forward by bringing order, automation, and insight to the chaos threatening web and app technologies. To see how ObservePoint’s automated testing solution can help you secure accurate data with your analytics implementation, schedule a sample web analytics audit.
