“Our performance data is pretty clean, but it doesn’t help anybody make a decision,” says one corporate controller at a global pharmaceutical company. The Gartner 2019 Data Management Model Survey shows that 73% of finance functions favor a centralized, tightly governed source for data. But while that serves the governance mandate, it may not serve the business well.
“Business leaders largely agree that data from finance are often out-of-date, inconsistent or incomplete,” says Clement Christensen, Senior Principal Advisor at Gartner. “Finance must think more broadly about making data decision-ready.”
“Sufficient versions of the truth” are often enough
Organizations now create and use a vast and growing amount of transactional, operational and other enterprise data. Finance often favors a single corporate-approved view, housed in a central repository that is optimized to load the data into reports and business intelligence (BI) tools.
This accurate and reconciled “single source of truth” can be especially useful to organizations that require precision — or struggle to aggregate or compare data across business units. But this view isn’t always useful for decision making, especially if the data is provided without context — or is communicated in a way that the business can’t easily absorb.
Gartner research shows that finance needs a more pragmatic and flexible way to deliver comprehensive data quickly enough to drive decisions, but it may have to compromise on data accuracy to achieve decision-readiness. To provide “sufficient versions of the truth,” the key for finance is to make informed trade-offs between the cost of bad data and the effort needed for additional data governance.
The business benefits when finance gets this right. The 27% of organizations that pursue a “sufficient versions of the truth” strategy are 41% more likely to generate decision-ready data, and twice as likely to improve the quality of decision making and business outcomes.
You still need frameworks for data quality
To improve decision-readiness, you may need to prepare data differently. “Catalog it, tag it, explain it, but let others govern it,” says Christensen.
Under this strategy, finance has to:
- Support distributed data owners, providing guidance on which data to govern and the extent of governance
- Store data in multiple repositories with universal data catalogs and clarify acceptable inconsistencies across datasets
- Curate existing and new forms of performance data in reports and BI tools according to value driver maps
Some fear that data governance will devolve into chaos without a single source, but data quality and consistency remain critical even with sufficient versions. A framework is a good way to drive informed decisions about the trade-offs. Use it to gauge which data quality issues are most critical and assess the cost of poor data in terms of business factors, such as lost time and income, expenses and reputational damage.
Frameworks also help to make data governance more scalable and focus data quality improvements where they are needed most. Mutually agreed data quality frameworks also create shared trust that multiple versions of the truth are good enough for decision making, even if the data isn’t perfect.
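One way to picture such a framework is as a simple cost-ranking exercise. The sketch below is purely illustrative (not a Gartner artifact); every issue name, dollar figure, and the flat weighting scheme are hypothetical assumptions, chosen only to show how estimated business cost can direct governance effort to where it matters most.

```python
# Illustrative sketch: rank data quality issues by estimated business
# cost so governance effort goes where the payoff is largest.
# All names, figures, and weights below are hypothetical.
from dataclasses import dataclass

@dataclass
class QualityIssue:
    name: str
    lost_income: float        # estimated annual revenue impact
    extra_expense: float      # rework, manual reconciliation, etc.
    reputational_risk: float  # 0 (none) to 1 (severe)

    def cost_score(self) -> float:
        # Count hard-dollar impact directly; scale reputational risk
        # by an assumed exposure figure (here, $1M) for comparability.
        return self.lost_income + self.extra_expense + self.reputational_risk * 1_000_000

issues = [
    QualityIssue("Stale product margins", 250_000, 40_000, 0.1),
    QualityIssue("Duplicate customer records", 80_000, 120_000, 0.3),
    QualityIssue("Inconsistent region codes", 30_000, 15_000, 0.0),
]

# Govern the costliest issues first; treat the cheap ones as
# "acceptable inconsistencies" across datasets.
for issue in sorted(issues, key=QualityIssue.cost_score, reverse=True):
    print(f"{issue.name}: ${issue.cost_score():,.0f}")
```

The point of the exercise is not the exact numbers but the shared, explicit basis for deciding which data gets tight governance and which does not.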
If you still need a single source of truth
If you still require a single source of truth, consider using a universal semantic layer (USL). It gives data analysts an autonomous, natural-language way to clarify data, and it can help finance leaders resolve data ambiguity and improve decision making.
A USL is essentially a virtual application programming interface (API) that connects all of an organization’s analytical tools to its data sources and data governance systems. Once installed, finance and other business analysts can use it to locate and work with their own data.
The USL creates a single focal point for business analysts and users working across different systems. It lets analysts attach their own context and definitions at the source of the data, avoiding complex, technical metadata definitions. This makes the data clearer and more digestible to business users, and it can also provide a more comprehensive, interconnected view of an organization’s data and information assets.
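To make the idea concrete, here is a minimal toy sketch of what such a layer does conceptually: map business-friendly terms to governed definitions and physical locations, so any tool queries by meaning rather than by table and column names. Every term, source path, and field name below is a hypothetical assumption for illustration, not a real USL product API.

```python
# Illustrative sketch (hypothetical names throughout): a toy semantic
# layer mapping business terms to governed definitions and physical
# data sources, shared by every analytical tool that queries it.

CATALOG = {
    "net revenue": {
        "source": "warehouse.finance.gl_postings",
        "column": "amt_net",
        "definition": "Gross revenue minus returns and discounts",
        "owner": "finance",
    },
    "active customers": {
        "source": "crm.accounts",
        "column": "status",
        "definition": "Accounts with a purchase in the last 12 months",
        "owner": "sales ops",
    },
}

def resolve(term: str) -> dict:
    """Look up a business term and return its governed definition and
    physical location, regardless of which BI tool is asking."""
    entry = CATALOG.get(term.lower())
    if entry is None:
        raise KeyError(f"'{term}' is not defined in the semantic layer")
    return entry

print(resolve("Net Revenue")["source"])  # warehouse.finance.gl_postings
```

Because every tool resolves terms through the same catalog, "net revenue" means the same thing in every report, which is the single-focal-point property the USL provides.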