When you receive an email addressed to someone else, how does it affect your perception of the organization that sent it? If customers don’t trust the information held about them, overall confidence in the organization can be quickly eroded.
Poor data quality is also hitting organizations where it hurts: financially. Respondents to Gartner’s 2017 Data Quality Market Survey reported an average annual cost of $15 million from poor-quality data.
Mei Yang Selvage, research director at Gartner, says that beyond the financial hit, poor data quality practices undermine digital initiatives, weaken competitive standing and sow customer distrust.
“On the other hand, innovative organizations like Airbnb and Amazon are using good quality data to know who their customers are, where they are and what they like,” says Selvage. “Good quality data empowers business insights and enables new business models in every industry. It allows enterprises to generate revenue by trading data as a valuable asset.”
4 steps to overcome data quality challenges
The good news is that you can strengthen your data quality practices by taking the following four steps.
1. Measure value
Nearly 60% of organizations don’t measure the annual financial cost of poor quality data, according to the Gartner survey. “Failing to measure this impact results in reactive responses to data quality issues, missed business growth opportunities, increased risks and lower ROI,” Selvage says.
“Leading information-driven organizations proactively measure the value of their information assets, as well as the cost of poor quality data and the value of good quality data,” Selvage says. “Most importantly, they link these directly to key business performance metrics.”
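To make measurement concrete, here is a minimal, purely illustrative sketch of what scoring a dataset on simple quality dimensions and linking the result to a cost figure could look like. The records, the two dimensions chosen (completeness and duplication) and the per-record cost are hypothetical assumptions, not a Gartner methodology.

```python
# Illustrative sketch: score customer records on two simple quality
# dimensions (completeness, duplication) and translate the failure
# rate into a rough annual cost estimate. All figures are hypothetical.

def completeness(records, required_fields):
    """Fraction of records with every required field populated."""
    if not records:
        return 1.0
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    return complete / len(records)

def duplicate_rate(records, key_field):
    """Fraction of records whose key value appears more than once."""
    if not records:
        return 0.0
    counts = {}
    for r in records:
        counts[r.get(key_field)] = counts.get(r.get(key_field), 0) + 1
    dupes = sum(1 for r in records if counts[r.get(key_field)] > 1)
    return dupes / len(records)

customers = [
    {"id": 1, "email": "a@example.com", "postcode": "10001"},
    {"id": 2, "email": "b@example.com", "postcode": ""},       # incomplete
    {"id": 3, "email": "a@example.com", "postcode": "60601"},  # duplicate email
]

comp = completeness(customers, ["email", "postcode"])
dup = duplicate_rate(customers, "email")

# Hypothetical link to a business metric: assume each bad record costs
# $25 a year in returned mail, failed campaigns and support time.
bad_fraction = max(1 - comp, dup)
annual_cost = bad_fraction * len(customers) * 25

print(f"completeness={comp:.2f} duplicate_rate={dup:.2f} est_cost=${annual_cost:.2f}")
```

The point of a sketch like this is not the arithmetic but the linkage Selvage describes: each quality dimension feeds a dollar figure tied to a business performance metric, so the cost of inaction becomes visible.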
2. Establish the critical data quality roles
Many organizations lack the most essential data quality roles or don’t know where to place them for optimal impact. This weakens the effectiveness of data quality programs.
To overcome this, help the chief data officer (CDO) meet data quality objectives by establishing essential data quality roles, starting with data stewards, and aligning them with the office of the CDO.
Key roles, such as data stewards or data quality analysts, are shifting from IT to either purely business or an IT-business hybrid combination. This indicates greater information management maturity and increasing recognition that data quality requires cross-organizational collaboration.
3. Optimize cost of data quality tools
Annual spend on on-premises data quality tools remains high, with an average of $208,000 and a median of $150,000, a cost level that discourages wider adoption of these tools.
“There are several techniques that can be used to optimize the cost of data quality tools. One of the most important techniques is adopting a bimodal and adaptive sourcing approach,” Selvage says. “This improves sourcing flexibility and drives high business value, while maintaining fit-for-purpose governance.”
4. Estimate a realistic time frame to deploy data quality tools
Many organizations overestimate the time needed to deploy data quality tools, sometimes by as much as double. Overestimation may seem harmless, but it creates unnecessary barriers to starting a data quality program and generates distrust between the business and IT.
As a result, business teams may opt for tools they perceive as faster to deploy but that are potentially less effective. The gap between estimates and reality can also generate distrust between data quality execution teams and management.