Every Company Needs a GPS for Its Data Assets: What It Is and Why It's Needed
Data is a proxy for the operations of a company.
There are four major challenges facing companies today — regulation, transformation, governance and optimization (see diagram below for details). Their data ecosystems are becoming more complex — with the introduction of new and improved data warehouses and data lakes, data pipelines, and business intelligence and analytic capabilities in the cloud.
The reality is that lines of business, technology, data and people are siloed, forcing associates to navigate an invisible landscape of fragmented tribal knowledge to discover, understand and utilize data, and to ensure robust enterprise Governance, Risk and Compliance (GRC).
Companies need a GPS for their data assets to provide a self-service capability to associates to access critical tribal knowledge related to business, technology and data assets and interrogate it to generate actionable insights. The conceptual model of the GPS is shown below.
The “GPS for Data Assets” glues together metadata from enterprise architecture, data privacy, master and reference data, data observability and data quality, and any other metadata tools, providing a holistic view of a company’s data ecosystem with sophisticated search capabilities, linked to its business and technology assets.
A picture is worth a thousand words.
Two sample views of the GPS for Data Assets are shown below (Credit Solidatus):
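To make the gluing idea concrete, here is a minimal, hypothetical sketch of such a catalog: it merges metadata records exported by several tools into one entry per asset, links assets to business and technology assets, and supports simple search across everything. This is not any specific product (such as Solidatus); all class, method and tool names are illustrative assumptions.

```python
from dataclasses import dataclass, field


@dataclass
class AssetEntry:
    """One catalog entry per data asset, glued from many metadata tools."""
    name: str
    metadata: dict = field(default_factory=dict)  # tool name -> attributes
    links: set = field(default_factory=set)       # related business/tech assets


class DataAssetGPS:
    def __init__(self):
        self.catalog = {}  # asset name -> AssetEntry

    def ingest(self, tool, records):
        """Glue one tool's metadata export (asset name -> attributes) into the catalog."""
        for name, attrs in records.items():
            entry = self.catalog.setdefault(name, AssetEntry(name))
            entry.metadata[tool] = attrs

    def link(self, asset, other):
        """Record a relationship to a business or technology asset."""
        self.catalog.setdefault(asset, AssetEntry(asset)).links.add(other)

    def search(self, term):
        """Return names of assets whose name or metadata mentions the term."""
        term = term.lower()
        return sorted(
            e.name for e in self.catalog.values()
            if term in e.name.lower()
            or any(term in str(v).lower()
                   for attrs in e.metadata.values() for v in attrs.values())
        )


# Example: glue exports from two hypothetical tools, then search across them
gps = DataAssetGPS()
gps.ingest("privacy_tool", {"customers": {"classification": "PII"}})
gps.ingest("dq_tool", {"customers": {"score": 0.92}, "orders": {"score": 0.88}})
gps.link("customers", "CRM platform")
hits = gps.search("pii")  # finds "customers" via its privacy metadata
```

The point of the sketch is the merge key: every tool's export is folded into the same per-asset entry, so a single search spans privacy, quality and any other metadata silo.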
The Business Case for the “GPS for Data Assets”
Companies should democratize data and shift people from thinking about individual data sources toward the concept of an integrated data-space: the GPS for Data Assets. The data-space presents a holistic view of metadata and data, providing the context people need to be data-informed.
To help our clients develop a business case for the “GPS for Data Assets”, AlyData has developed the opportunity quadrant shown below, which highlights four areas companies should consider for cost savings, risk mitigation and value creation: data discovery, data quality, dark data and data-to-insight overhead.
Most companies are losing millions due to the significant cost and efficiency impacts of the following (the numbers used here are from industry sources):
- Data Discovery: 50% of an associate’s time is spent “discovering and understanding” data. Asking a colleague about data is easy, but it is highly inefficient at scale, and it is hard to retrain yourself to search for data on your own.
- Bad Data: Poor-quality data costs roughly 6% of a company’s operating expense. A Gartner survey of reference customers for the 2020 edition of the “Magic Quadrant for Data Quality Solutions” found that organizations estimate the average cost of poor data quality at $12.8 million per year. Most organizations merely react to data quality problems as they arise rather than proactively monitoring and addressing them.
- Data-to-Insight Overhead: Data analysts, engineers, report writers and data scientists typically spend 80% of their time discovering, analyzing, integrating and preparing data: the data-to-insight overhead.
- Dark Data: Dark data is data that is captured, processed and stored but never used. Industry estimates suggest that 80% of data within companies is dark.
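The proactive stance mentioned under Bad Data above can be illustrated with a minimal sketch: validate each incoming batch against simple rules (here, null-rate thresholds) before it lands, rather than reacting after downstream reports break. The field names and thresholds are made up for illustration, and real data-quality tools apply far richer rule sets.

```python
def null_rate(rows, field):
    """Fraction of rows where `field` is missing or None."""
    return sum(1 for r in rows if r.get(field) is None) / len(rows)


def check_batch(rows, rules):
    """Proactively validate a batch; return rule violations (empty means it passes)."""
    violations = []
    for field, max_nulls in rules.items():
        rate = null_rate(rows, field)
        if rate > max_nulls:
            violations.append(f"{field}: null rate {rate:.0%} exceeds {max_nulls:.0%}")
    return violations


# Example batch with two missing emails out of three rows
batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": None},
]
issues = check_batch(batch, {"email": 0.10, "id": 0.0})  # email rule is violated
```

Running checks like these at ingestion time is what turns data quality from a reactive cleanup cost into a gate that bad data never passes.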
We estimated that the annual cost savings for a client with 4,000 associates and an annual operating expense of $40 million would exceed US$8 million across data discovery, bad data and data-to-insight overhead. This is a very conservative estimate and doesn’t include savings from exposing dark data to derive insights or from mitigating data privacy and security risks (conservatively more than US$3 million).
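As a back-of-the-envelope check, an estimate of this shape can be reproduced by pairing each cost driver's industry figure with an assumed recovery rate. The exact model behind the figure above isn't published here, so the labor shares and recovery fractions below are illustrative assumptions, not AlyData's actual parameters.

```python
OPEX = 40_000_000  # the client's annual operating expense, from the example above

# Each entry: (share of opex affected, fraction a GPS could recover).
# Affected shares echo the industry figures cited earlier; the labor-share
# multipliers and recovery fractions are illustrative assumptions.
components = {
    "data_discovery":  (0.50 * 0.60, 0.40),  # 50% of time; labor assumed ~60% of opex
    "bad_data":        (0.06,        0.60),  # bad data at ~6% of opex
    "data_to_insight": (0.80 * 0.10, 0.60),  # 80% of analyst time; analysts ~10% of opex
}

savings = sum(OPEX * share * recovery for share, recovery in components.values())
print(f"Estimated annual savings: ${savings:,.0f}")  # over $8 million
```

With these assumptions the three components contribute $4.8M, $1.44M and $1.92M respectively, landing just above the US$8 million figure; tightening any recovery fraction makes the estimate more conservative still.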
A big picture of the business, data and technology ecosystem, and of the relationships between them, is key to breaking down silos, conducting impact analysis, planning for business continuity, and gaining actionable insights into business, data and technical risks.
Companies should democratize their data assets by providing associates a GPS for Data Assets that glues together metadata and associated data residing in enterprise silos. Associate productivity will increase significantly because associates will be able to visualize a multitude of contextual and temporal truths. In addition, this solution provides actionable intelligence for impact analysis, business continuity planning, business, data and technology risk management, and governance and regulatory compliance, and it moves companies toward becoming proactive and agile.
About the Author
Jay is the Founder and Managing Partner of AlyData, a consulting firm specializing in all things DATA. He and his team help CXOs drive outsized business outcomes from their data assets through AlyData's CDO Advisory, Data Governance, Data Risk Management, and Insights consulting services. They have developed proprietary frameworks, methodologies and utilities for data risk management, data governance, and data quality, which enable them to deliver high-quality solutions faster.
- Data Discovery: https://www.itproportal.com/features/study-reveals-how-much-time-is-wasted-on-unsuccessful-or-repeated-data-tasks/
- Data Quality: Gartner, “How to Stop Data Quality Undermining Your Business”, Jun 22, 2020 (https://www.gartner.com/smarterwithgartner/how-to-stop-data-quality-undermining-your-business/)
- Dark Data: https://www.splunk.com/en_us/blog/leadership/dark-data-has-huge-potential-but-not-if-we-keep-ignoring-it.html
- Dark Data: https://blog.datumize.com/evolution-dark-data
Originally published at https://www.alydata.com on April 26, 2021.