As we leverage vast quantities and categories of data from disparate sources for business value, including predictive analytics, it’s clear we’re overdue for a focus on data confidence.
Let’s talk about an “enterprise data practice efficiency model” (EDPEM). Data maturity models popularized by Gartner, IBM, Deloitte, and a host of others are useful, typically categorizing the enterprise into one of four states based on a number of indicators in a single domain. The EDPEM instead assesses two domains.
Why two domains? This is a “whole brain” approach. If either domain is deficient, the enterprise cannot extract optimal business value from its data investment. Worse yet, the data will fuel costly wrong decisions by the C-suite. Many C-suite executives recognize that their data utilization is poor, yet they cannot tell you specifically why, or how to prioritize initiatives toward improvement. The EDPEM provides such a prioritized roadmap.
So, here are “the left and right brains of data practices”:
1. INTEGRITY (quantitative): a per-dataset score reflecting how accurate, complete, and well-governed the data is. Integrity problems are systemic, and they are fixed technically or by a change in governance.
2. UTILIZATION (qualitative): a summative score based on surveys of all the stakeholders who use the data for front-line or high-stakes decision making. For instance, does each stakeholder trust the data? Does the stakeholder know how to apply “data set x” toward a decision? Do they know the proper policy for correcting inaccurate data they discover? Or do they even know the dataset exists?
Ultimately, if your enterprise has a data problem, it’s either low integrity (which is systemic and can be fixed technically or by a change in governance) or low utilization (which can only be fixed by training and by changing your “data culture”). And the assessment points directly at each dataset: how strong its integrity is, and how it is or is not being utilized. So a natural roadmap for improvement emerges, based on your strategic priorities and the nature of the interventions needed.
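To make the idea concrete, here is a minimal sketch of how such an assessment might be tabulated. Everything here is an illustrative assumption, not part of the model itself: the dataset names, the scores, the 0.7 threshold, and the `recommend_intervention` helper are all hypothetical.

```python
# Hypothetical EDPEM-style tabulation: each dataset carries an integrity
# score (quantitative, from technical checks) and a utilization score
# (qualitative, a summative score from stakeholder surveys).
# All names, scores, and thresholds below are illustrative assumptions.

THRESHOLD = 0.7  # assumed cut-off separating "high" from "low"

def recommend_intervention(integrity: float, utilization: float) -> str:
    """Map a dataset's two scores to the kind of intervention suggested:
    technical/governance fixes for low integrity, training and culture
    change for low utilization."""
    if integrity < THRESHOLD and utilization < THRESHOLD:
        return "fix integrity (technical/governance), then retrain users"
    if integrity < THRESHOLD:
        return "technical or governance remediation"
    if utilization < THRESHOLD:
        return "training and data-culture change"
    return "healthy: monitor"

# Fabricated example assessment results: (integrity, utilization).
datasets = {
    "customer_master": (0.9, 0.4),   # trusted data, poorly used
    "sales_forecast": (0.5, 0.8),    # heavily used, shaky integrity
}

# A simple prioritization: address the weakest dataset first.
roadmap = sorted(datasets.items(), key=lambda kv: min(kv[1]))
for name, (integ, util) in roadmap:
    print(f"{name}: {recommend_intervention(integ, util)}")
```

The point of the sketch is the shape of the output, not the numbers: every dataset lands in one of four quadrants, and each quadrant implies a different kind of fix, which is exactly what turns the assessment into a prioritized roadmap.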
We now have a line of sight to new opportunities to leverage data. Let’s make sure it’s smart data, and that we use it wisely.