January 18, 2018
Contributor: Susan Moore
Poor quality data weakens an organization's competitive standing and undermines critical business objectives.
When you receive an email addressed to someone else, how does it affect your perception of the organization that sent it? If customers don’t trust the information held about them, overall confidence in the organization can be quickly eroded.
Poor data quality also hits organizations where it hurts: the average annual financial cost of poor quality data reached $15 million in 2017, according to Gartner's Data Quality Market Survey.
Mei Yang Selvage, research director at Gartner, says that beyond the financial hit, poor data quality practices undermine digital initiatives, weaken competitive standing and sow customer distrust.
“On the other hand, innovative organizations like Airbnb and Amazon are using good quality data to allow them to know who their customers are, where they are and what they like,” says Selvage. “Good quality data empowers business insights and starts new business models in every industry. It allows enterprises to generate revenue by trading data as a valuable asset.”
The good news is that you can strengthen your data quality practices by taking the following four steps.
Step 1: Measure the financial impact of poor quality data

Nearly 60% of organizations don’t measure the annual financial cost of poor quality data, according to the Gartner survey. “Failing to measure this impact results in reactive responses to data quality issues, missed business growth opportunities, increased risks and lower ROI,” Selvage says.
“Leading information-driven organizations proactively measure the value of their information assets, as well as the cost of poor quality data and the value of good quality data,” Selvage says. “Most importantly, they link these directly to key business performance metrics.”
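To make the idea of linking data quality cost to business metrics concrete, here is a minimal sketch of a toy cost model. The inputs (defect rate, per-record rework cost, revenue per record, churn uplift) and the simple linear formula are illustrative assumptions, not a Gartner methodology; a real measurement would be calibrated against an organization's own performance metrics.

```python
# Toy model: estimate the annual cost of poor quality data.
# All parameter names and figures below are hypothetical assumptions.

def annual_cost_of_poor_data(records, defect_rate, rework_cost,
                             revenue_per_record, churn_uplift):
    """Estimate annual cost as correction effort plus revenue at risk."""
    defective = records * defect_rate           # records with a quality issue
    correction_cost = defective * rework_cost   # cost to fix those records
    lost_revenue = defective * churn_uplift * revenue_per_record  # churn impact
    return correction_cost + lost_revenue

# Example: 2M records, 5% defective, $8 to fix each record,
# $600 annual revenue per record, 2% extra churn among affected customers.
cost = annual_cost_of_poor_data(2_000_000, 0.05, 8.0, 600.0, 0.02)
print(f"Estimated annual cost of poor quality data: ${cost:,.0f}")  # $2,000,000
```

Tying each term of a model like this to a business performance metric (rework effort, churn, regulatory exposure) is what turns an abstract quality complaint into a business case.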
Step 2: Establish essential data quality roles

Many organizations lack the most essential data quality roles or don't know where to place them for optimal impact, which weakens the effectiveness of their data quality programs.
To overcome this, help the chief data officer (CDO) meet data quality objectives by establishing essential data quality roles, starting with data stewards, and aligning them with the office of the CDO.
Key roles, such as data stewards or data quality analysts, are shifting from IT to either the business or a hybrid business-IT function. This shift indicates greater information management maturity and a growing recognition that data quality requires cross-organizational collaboration.
Step 3: Optimize the cost of data quality tools

Annual spend on on-premises data quality tools remains high, with an average of $208,000 and a median of $150,000, a cost that prevents more pervasive adoption.
“There are several techniques that can be used to optimize the cost of data quality tools. One of the most important techniques is adopting a bimodal and adaptive sourcing approach,” Selvage says. “This improves sourcing flexibility and drives high business value, while maintaining fit-for-purpose governance.”
Step 4: Estimate deployment time accurately

Many organizations overestimate the time needed to deploy data quality tools, sometimes by as much as double. Overestimation may seem harmless, but it creates unnecessary barriers to starting a data quality program and generates distrust between the business and IT.
It can also push businesses toward tools they perceive as faster to deploy but that are potentially less effective, and it can breed distrust between data quality execution teams and management.
Recommended resources for Gartner clients*:
How to Overcome the Top Four Data Quality Practice Challenges by Mei Yang Selvage.
*Note that some documents may not be available to all Gartner clients.