Pitfalls When Measuring The Success Of Analytics Programs

Bill Franks
May 3, 2021

There are many factors that go into making an enterprise analytics and data science program a success. At IIA, the application of our Analytics Maturity Assessment methodology to hundreds of companies over the past several years has allowed us to identify some important and intriguing patterns. Recently, my colleague David Alles wrote about some of the challenges faced when advancing in maturity. Here, I’ll walk through a few of the patterns IIA has identified that can appear counter-intuitive at first but make perfect sense upon reflection. 

Capability Without Awareness & Adoption = Failure!

Many analytics and data science organizations have made the mistake of focusing purely on the technical aspects of progress. What is often neglected is the ongoing communication and internal marketing of that progress to stakeholders throughout the enterprise. Those involved in a major project will be acutely aware of its potential and progress, but without expanding that awareness to the eventual users and beneficiaries, progress will not be subjectively perceived and recognized.

It would be terrific if successfully implementing world-class analytical platforms, tools, and talent led directly to success. Unfortunately, such efforts are necessary but not sufficient. Success depends not only on objective increases in capability, but also on subjective awareness of, and excitement about, that progress. Further, real success requires the business community to actually adopt and drive value from the improved capabilities. Unless value is actually created, the capability enhancements are not a success, but an expensive, theoretical opportunity waiting to be seized. Bottom line: capabilities do not make success, results do! Don’t fall into what David Alles calls the “checkbox” trap.

This can be immensely frustrating for an analytics organization. The team diligently focuses on making technical progress, but that progress isn’t recognized. IIA’s surveys have uncovered this pattern many times. However, if you’re not helping stakeholders to see and understand the progress being made, how can they be expected to recognize it, let alone make use of it? It is absolutely critical to develop and implement a program that frequently communicates internally:

1)    What is planned for the future?

2)    What progress is being made currently?

3)    What are the newest capabilities stakeholders have access to today?

In an ideal world, stakeholders would seek out and keep up with progress on their own. In reality, it won’t work that way.

Technical Progress Necessarily Precedes True Success

Some of the analytics and data science organizations IIA has worked with have been frustrated that, despite making true, objective progress in the capabilities they delivered, their survey scores did not reflect that progress. One important point to keep in mind in these situations is that objective capability enhancements necessarily precede both subjective recognition of that progress and objectively realized results. This is because without the enhancements in place, stakeholders can’t take advantage of them. This causes a lag between when a new capability becomes available and the adoption and usage of that capability. A further lag then exists between adoption and usage and the end goal of value generation.

It is important to anticipate and plan for this lag so as not to get discouraged. Turning back to the communication plans discussed previously, it is critical to prepare stakeholders for the delay between capabilities and results. Be sure stakeholders are aware of the progress being made, as well as when they will actually be able to see and make use of that progress. Unless you proactively manage people’s expectations to keep them realistic, you can be certain they will become unrealistic. Once unrealistic expectations take hold, it can be very difficult to recover from the disappointment that follows.

A Raised Awareness Level Can Hurt In The Short Term

Another counter-intuitive scenario is one where an analytics organization does a good job of raising awareness within the stakeholder community of what is possible and what capabilities are coming. Ironically, stakeholder sentiment can drop in the short term as a result of this new level of transparency. This is because while the intent of the communication is to focus people on all of the positive change coming, the reality is that a lot of eyes are also being opened to what has been missing all along. Once stakeholders better understand what capabilities they don’t have access to today, their subjective sentiment can turn more negative because they now know what they are missing!

In the long run, of course, raising stakeholder expectations and then meeting those expectations with greater capabilities will be good for everyone, and stakeholder sentiment will rise. However, it can be quite demoralizing to see short-term drops in sentiment as a result of what was meant to be a very positive and motivating view of the future. The way to soften the blow of this counter-intuitive pattern is to explain up front that such a pattern has been seen in other organizations – and why. Prepare stakeholders to recognize when they enter this cycle.

Navigating The Short-Term Bumps

Evolving an enterprise analytics and data science program requires tracking your progress objectively with a formal survey that includes stakeholder sentiment along with technical progress. At the same time, as IIA has seen with our Analytics Maturity Assessment work, those surveys can yield results that might appear both counter-intuitive and negative without taking into account the larger context of progress within which the surveys are collected.

To sum up, there are three recommendations that will help your team avoid becoming dejected and/or taking a public beating if objective progress and subjective sentiment are not initially fully correlated:

1)    Never forget that technical progress will not matter if stakeholders are not made aware of the progress and enabled to take advantage of the new capabilities.

2)    Remember that it takes time for a new capability to move from availability, to adoption, to realization of value. Being “done” rolling out a new capability is not the end of the story.

3)    As you make stakeholders aware of the exciting new capabilities coming, you’re also making them more aware of what they are missing today. This can cause their sentiment to become more negative in the short term. Don’t let this be a surprise.

By planning for both the objective and subjective aspects of progress, you’ll be much more likely to succeed in your efforts to improve your organization’s analytics and data science capabilities and the impact those efforts have on the enterprise. The best long-term outcomes will occur if you successfully raise expectations while meeting or exceeding those raised expectations. Just remember that in the short term, things can be perceived as worse even as the groundwork is being laid for a much better future.

Originally published by the International Institute for Analytics
