Magic Quadrant for Data Quality Tools
25 June 2010

Ted Friedman, Andreas Bitterer

Gartner RAS Core Research Note G00200603

Demand for data quality tools remains strong, with deployments in support of master data management, modernization and information governance programs. Competition among large vendors grows fiercer as the convergence of data quality tools, data integration tools and MDM solutions progresses.

What You Need to Know

The market for data quality tools continues to experience both substantial growth and volatility. Traditional use cases of business intelligence (BI) and master data management (MDM) remain highly active, while new applications in support of information governance initiatives, and the broadening of deployments across multiple data domains and applications, are rapidly increasing. Large vendors in related markets continue to enter this space through the acquisition of smaller or specialist providers (for example, the recent acquisitions of Silver Creek Systems by Oracle and Netrics by Tibco), and new vendors continue to emerge. The trend of convergence of the data quality tools market with the related markets for data integration tools and MDM products continues, as market demand increasingly moves toward broader data management capabilities spanning all of these disciplines. This is also reflected in the vendor landscape, with a rapidly growing number of providers competing in more than one of these currently discrete markets.

When evaluating offerings in this market, organizations must consider not only the breadth of functional capabilities (for example, data profiling, parsing, standardization, matching, monitoring and enrichment) relative to their requirements, but also the degree to which this functionality can be readily understood, managed and leveraged by non-IT resources, as well as how readily it can be embedded in business process workflows. In keeping with significant trends in data management, business roles such as data stewards will increasingly be responsible for managing the goals, rules, processes and metrics associated with data quality improvement initiatives. Other key considerations include the degree of integration of the range of functional capabilities into a single architecture and product, and the available deployment options (traditional on-premises software deployment, hosted solutions and software as a service [SaaS] or cloud-based). Finally, given the current economic and market conditions, buyers must deeply analyze non-technology characteristics, such as pricing models and total cost footprint, as well as the size, viability and partnerships of the vendors.

Use this Magic Quadrant to understand the data quality tools market and how Gartner rates the leading vendors and their packaged products in that market. Draw on this research to evaluate vendors based on a customized set of objective criteria. Gartner advises organizations against simply selecting vendors in the Leaders quadrant. All selections are buyer-specific, and vendors from the Challengers, Niche Players or Visionaries quadrants could be better matches for your requirements.

Magic Quadrant

Figure 1. Magic Quadrant for Data Quality Tools


Source: Gartner (June 2010)

Market Overview

Organizations of all sizes and in all industries are recognizing the importance of high-quality data and the critical role of data quality in information governance and stewardship, driven by broader enterprise information management initiatives. As a result, their interest in the role of tools and technology for data quality improvement continues to grow. Fueled by a market of purpose-built, packaged tools for addressing various dimensions of the data quality discipline, data quality functionality is readily available from a variety of providers, both large and small. Data quality functionality is also being recognized as a fundamental component of offerings in many related software markets, such as data integration tools, MDM solutions and BI platforms. As a result, an increasing number of partnerships between MDM solution vendors and data quality tools vendors are occurring. Partnerships spanning the data integration tools and data quality tools markets have been commonplace for some time, and several recent acquisition events (Oracle's purchase of Silver Creek Systems and Tibco's purchase of Netrics) highlight this ongoing trend. These developments are further recognition that data quality capabilities are at the core of many different data management-related disciplines.

The vendors in the data quality tools market offer a broad range of functionality, from data quality analysis, profiling and monitoring, to fundamental data cleansing operations such as parsing, standardization and matching, through to data enrichment. Much convergence and integration of technology has occurred, and today vendors offer more functionality within a smaller number of discrete products — most vendors have consolidated the bulk of their core data quality functionality (the fundamental elements of parsing, standardization, matching, and cleansing) into a single data quality platform, with data profiling remaining the only major functional component commonly sold as a separate product. However, specialized add-on capabilities (such as global name and address support, application-specific knowledge bases and dashboards for data quality metrics) are commonplace, and even grow in number, as vendors adapt their packaging and pricing models to suit a wider range of potential buyers.

New market entrants and longtime competitors are delivering technology with a focus on data quality analysis, pervasive deployment of data quality controls, ongoing data quality monitoring and the flexibility to address a range of data subject areas. Market demand has substantially shifted toward an intent for multidomain deployments, as more organizations report that their data quality improvement efforts are no longer focused on a single data subject area. They seek multidomain-capable technology when they are evaluating options in the market.

The market for data quality tools was worth approximately $727 million in software-related revenue as of the end of 2009, and is forecast to experience a compound annual growth rate (CAGR) of 12% during the next five years. This is a result of the significant attention that organizations are focusing on various data-related initiatives such as information governance and IT modernization, as well as the more recent and rapidly growing interest in the topic within industries that are less mature from a data management perspective (such as government and other segments of the public sector). In addition, most organizations have significant investments in "below the radar" data quality activities — both manual and custom-coded — within the context of their data migration, MDM and application integration approaches. These scenarios represent opportunities for modernization with packaged data quality tools. While the past 12 months showed continued strong demand for specialized data quality capabilities, such as address standardization/validation and matching, those providers' broader tool suites and the ability to address quality issues in various data domains saw increasing traction. Many new entrants focus on "domain-agnostic" data quality services (stand-alone or embedded in applications), based on a centrally managed set of business rules. However, with the increasing trend toward embedding data quality capabilities in business applications, data integration tools and other software offerings from larger vendors, these small competitors will face significant challenges as they attempt to survive and grow.

This market comprises a diverse set of vendors approaching the data quality tools opportunity from different directions and backgrounds. Large applications and infrastructure technology providers, such as IBM and SAP BusinessObjects, increasingly focus on data quality capabilities as complementary to various components of their portfolios. While they sell data quality tools in a stand-alone manner (as individual products), these tools are increasingly sold as part of a larger transaction involving related products (such as data integration tools and MDM solutions). While Oracle and Microsoft have both recently begun to address this market via acquisitions, their market presence is currently very limited. Oracle has just begun actively selling the acquired technology as a complementary add-on for its product MDM solution, while Microsoft will be delivering its technology to customers for the first time as part of the next major release of the SQL Server database management system (DBMS). Other large technology and services providers manage data quality-focused divisions, such as SAS Institute (with its DataFlux subsidiary), Pitney Bowes (with its Business Insight division) and Harte-Hanks (with its Trillium Software division). Specialists focused on data management capabilities, such as Informatica (and other data integration tools vendors not directly positioned on the Data Quality Tools Magic Quadrant), have added data quality capabilities to their portfolios, either via acquisitions or organic development. This reflects the increasing overlap between the markets for data integration tools and data quality tools. Finally, a large number of pure-play specialist data quality tools vendors, including Datactics, DataLever, DataMentors, Datanomic, Human Inference, Innovative Systems and Uniserv (and many others not positioned on the Magic Quadrant because they do not meet the inclusion criteria) vie for deals in stand-alone data quality tools. Most of these specialists are small (with annual revenue of less than $100 million), and may be vulnerable to the challenging economic conditions and mounting competitive pressure from the larger vendors.

Another significant development relates to licensing approaches and delivery models for data quality tools. The open-source movement has reached the data quality tools market, with a few small projects rising to the surface. Organizations needing data quality capabilities such as profiling, cleansing, matching or enrichment should not expect deep functionality from these open-source projects, and should stick with commercial offerings for critical production implementations. All open-source data quality projects combined will reach just 3% to 5% market penetration (subscribed customers) by 2012, and it will likely be well beyond 2012 before open-source data quality platforms have broadly caught up with commercial data quality tools vendors in terms of their capabilities. In addition, there is growing interest in off-premises deployment models, such as SaaS and cloud-based models. More vendors in the market have begun to offer aspects of their functionality via these channels, attempting to capture the market demand for rapid, flexible and low-cost deployments of focused data quality functionality.

Market Definition/Description

The data quality tools market comprises vendors that offer stand-alone software products to address the core functional requirements of the data quality discipline:

  • Profiling. The analysis of data to capture statistics (metadata) that provide insight into the quality of the data and help to identify data quality issues.
  • Parsing and standardization. The decomposition of text fields into component parts and the formatting of values into consistent layouts based on industry standards, local standards (for example, postal authority standards for address data), user-defined business rules and knowledge bases of values and patterns.
  • Generalized "cleansing." The modification of data values to meet domain restrictions, integrity constraints or other business rules that define when the quality of data is sufficient for the organization.
  • Matching. Identifying, linking or merging related entries within or across sets of data.
  • Monitoring. Deploying controls to ensure that data continues to conform to business rules that define data quality for the organization.
  • Enrichment. Enhancing the value of internally held data by appending related attributes from external sources (for example, consumer demographic attributes or geographic descriptors).
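To make the first of these functions concrete, the sketch below shows what profiling and parsing/standardization amount to in practice. It is a minimal, hypothetical illustration (the records, field names and formatting rule are invented), not the behavior of any particular vendor's product:

```python
from collections import Counter
import re

# Toy records; the "phone" values are inconsistently formatted on purpose.
records = [
    {"name": "ACME Corp.", "phone": "(555) 123-4567"},
    {"name": "Acme Corporation", "phone": "555.123.4567"},
    {"name": "Widgets Ltd", "phone": None},
]

# Profiling: capture simple statistics (metadata) about a field,
# including the distinct layout patterns present in the data.
phones = [r["phone"] for r in records]
profile = {
    "count": len(phones),
    "nulls": sum(p is None for p in phones),
    "distinct_patterns": Counter(
        re.sub(r"\d", "9", p) for p in phones if p is not None
    ),
}

# Parsing and standardization: decompose the field into its component
# digits, then reformat every value into one consistent layout.
def standardize_phone(raw):
    digits = re.sub(r"\D", "", raw or "")
    if len(digits) == 10:
        return f"{digits[0:3]}-{digits[3:6]}-{digits[6:10]}"
    return None  # unparseable value, flagged for cleansing or review

standardized = [standardize_phone(r["phone"]) for r in records]
```

After this pass, the two differently formatted phone numbers collapse to a single standard layout, and the profile exposes the null and the two distinct input patterns that motivated the standardization rule in the first place.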

In addition, these products provide a range of related functional capabilities that are not unique to this market but which are required to execute many of the data quality core functions, or for specific data quality applications:

  • Connectivity/adapters. The ability to interact with a range of different data structure types.
  • Subject-area-specific support. Standardization capabilities for specific data subject areas.
  • International support. The ability to perform data quality operations on data from multiple countries and in multiple languages.
  • Metadata management. The ability to capture, reconcile and share metadata related to the data quality process.
  • Configuration environment. Capabilities for creating, managing and deploying data quality rules.
  • Operations and administration. Facilities for supporting, managing and controlling data quality processes.
  • Workflow/data quality process support. Processes and user interfaces for various data quality roles, such as data stewards.
  • Service enablement. Service-oriented characteristics and support for service-oriented architecture (SOA) deployments.

The tools provided by vendors in this market are generally consumed by technology users for internal deployment in their IT infrastructure. However, off-premises solutions in the form of hosted data quality offerings and SaaS delivery models are continuing to evolve and grow in popularity.

Inclusion and Exclusion Criteria

For vendors to be included in the Magic Quadrant, they must meet the following criteria:

  • They must offer stand-alone (not only embedded in, or dependent on, other products and services) packaged software tools that are positioned, marketed and sold specifically for general-purpose data quality applications.
  • They must deliver functionality that addresses, at a minimum, profiling, parsing, standardization, cleansing and matching. Vendors that offer narrow functionality (for example, they only address cleansing and validation or only deal with matching) are excluded because they do not provide complete suites of data quality tools. Specifically, vendors must offer all the following:
    • Profiling and visualization — provide packaged functionality for attribute-based analysis (for example, minimum, maximum, frequency distribution) and dependency analysis (cross-table and cross-data set analysis). Profiling results must be exposed in either a tabular or graphical user interface delivered as part of the vendor's offering. Profiling results must be able to be stored and analyzed across time boundaries (trending).
    • Parsing — provide packaged routines for identifying and extracting components of textual strings, such as names, mailing addresses and other contact-related information. Parsing algorithms and rules must be applicable to a wide range of data types and domains, and must be configurable and extensible by the customer.
    • Matching — provide configurable matching rules or algorithms that enable customers to customize their matching scenarios, audit the results and tune the matching scenario over time. The matching functionality must not be limited to specific data types and domains, nor limited to the number of attributes that can be considered in a matching scenario.
    • Standardization and cleansing — provide both packaged and extensible rules for handling syntax (formatting) and semantic (values) transformation of data to ensure conformance with business rules.
  • They must support this functionality for data in more than one language and for more than one country.
  • They must maintain an installed base of at least 75 production, maintenance/subscription-paying customers for the data quality product(s) meeting the above functional criteria.
  • They must support broad-scale deployment via server-based runtime architectures.
  • They must provide at least 10 responsive customer references (during the customer survey executed as part of the Magic Quadrant research process) that demonstrate multidomain and/or multiproject use of the product(s) meeting the above functional criteria.
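A configurable matching rule of the kind the criteria above require — attribute weights and a decision threshold that customers can tune as they audit match results — might be sketched as follows. The weights and threshold here are illustrative assumptions, not any vendor's defaults:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_match(rec1, rec2, weights, threshold):
    """Weighted, attribute-based match score. Both the per-attribute
    weights and the decision threshold are configurable, so the rule
    can be tuned over time as match results are audited."""
    score = sum(
        w * similarity(rec1[field], rec2[field])
        for field, w in weights.items()
    ) / sum(weights.values())
    return score >= threshold, score

a = {"name": "Jon Smith", "city": "Boston"}
b = {"name": "John Smith", "city": "Boston"}
matched, score = is_match(a, b, weights={"name": 0.7, "city": 0.3},
                          threshold=0.85)
```

Because neither the attributes nor their number is fixed, the same rule structure can be pointed at product, asset or any other data domain — the multidomain flexibility the inclusion criteria call for.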

Vendors meeting the above criteria but limited to deployment in a single specific application environment, vertical industry or data domain are excluded from this market. A vendor that does not meet the above criteria may be considered for inclusion if it is a new entrant that is demonstrably different from established vendors, and which represents a future direction for data quality tools.

There are many data quality tools vendors but most do not meet the above criteria and are, therefore, not included in the Magic Quadrant. Many vendors provide products that deal with one very specific data quality problem, such as address cleansing and validation, but which cannot support other types of application, or lack the full breadth of functionality expected of today's data quality solutions. Others provide a range of functionality, but operate only in a single country or support only narrow, departmental implementations. Others may meet all the functional, deployment and geographic requirements but are at a very early stage in their "life span" and, therefore, have few, if any, production customers. The following vendors may be considered by Gartner clients alongside those appearing in the Magic Quadrant when deployment needs are aligned with their specific capabilities; or they are newer entrants beginning to gain visibility in the market but which lack a significant customer base:

  • Acme Data (formerly Stalworth), San Mateo, California, — offers a platform for cleansing and augmenting customer data (companies, contacts, international addresses, phone numbers, geocoding), and matching and merging customer records.
  • ActivePrime, Pasadena, California, — provides on-demand data cleansing and deduplication for CRM applications, such as Siebel or SalesLogix.
  • Acuate, London, U.K., — provides products for the standardization, matching and merging of various data types, as well as data quality professional services.
  • AddressDoctor, Maxdorf, Germany, — specializes in international address standardization and validation, supporting 240 countries and territories.
  • Alteryx, Orange, California, — provides data cleansing in the context of BI applications with a customer and geographic data orientation.
  • AMB, Chicago, Illinois, — provides profiling, standardization and cleansing functionality for deployment in Windows environments.
  • Anchor Software, Plano, Texas, — provides a range of data quality utilities supporting common customer list management operations such as file splitting, deduplication and suppression.
  • Ataccama, Prague, Czech Republic, — the Data Quality Center product provides support for data quality analysis, data cleansing and governance of data quality business rules.
  • Axesor, Madrid, Spain, — provides standardization, deduplication and geocoding for database marketing.
  • BackOffice Associates, South Harwich, Massachusetts, — offers services and technology with a focus on migration and governance of master data within SAP and other packaged applications.
  • BCC Software (a division of Bowe Bell + Howell), Rochester, New York, — provides a range of data quality utilities supporting common customer list management operations, such as address validation, change of address, deduplication and suppression.
  • Business Data Quality, London, U.K., — offers products focused on data profiling (BDQ Analysis) and data quality monitoring (BDQ Monitor).
  • Caatoosee, Leonberg, Germany, — provides data cleansing for SAP applications through its DQaddress product and generic matching and deduplication with DQworkbench.
  • Certica Solutions, Wakefield, Massachusetts, — provides products that focus on validating data against predefined data quality rules.
  • Ciant, Richardson, Texas, — provides parsing, standardization and matching functionality for customer data, in support of sales and marketing analytics.
  • Clavis Technology, Dublin, Ireland, — provides its data quality governance solution (Clavis Data Steward), which supports the deployment of data quality controls for preventing data entry errors, in a SaaS model.
  • Data8, Ellesmere Port, U.K., — provides a free online service for data cleansing, postcode lookup and data validation.
  • DataQualityApps, Untermeitingen, Germany, — provides Windows-based tools for parsing, matching, deduplication and standardization of addresses.
  • Datiris, Lakewood, Colorado, — provides various data profiling techniques for a range of data sources.
  • Datras, Munich, Germany, — focuses on the German-speaking markets, providing profiling, standardization and monitoring capabilities.
  • Deyde, Las Matas, Madrid, Spain, — specializes in name and address database optimization.
  • DQ Global, Fareham, U.K., — provides matching, deduplication and international address standardization and validation functionality.
  • Eprentise, Orlando, Florida, — offers a rule-based data quality engine for standardization, merging and deduplication.
  • FinScore, Renens, Switzerland, — offers technology for measuring data quality and presenting metrics in a dashboard form.
  • Global IDs, New York, New York, — the Data Quality and Verification product suite supports the ability to measure data quality metrics and monitor the quality of data over time.
  • helpIT systems, Leatherhead, U.K., — provides data quality tools oriented toward customer matching, deduplication and suppression operations.
  • Hopewiser, Altrincham, U.K., — provides address cleansing, verification, deduplication and enrichment for mass mailing.
  • Infogix, Naperville, Illinois, — provides controls-based technology for auditing and validating the integrity of data within and across systems.
  • Infoshare, Kingston upon Thames, U.K., — provides data quality solutions for local and central government.
  • Infosolve Technologies, South Brunswick, New Jersey, — provides open-source tools (with required service contract) that support profiling, standardization, matching and deduplication operations.
  • InQuera, Migdal Tefen, Israel, — specializes in technology for standardization, matching and deduplication, with a specific focus on product data.
  • Intelligent Search Technology, White Plains, New York, — develops products for profiling, matching, deduplication and U.S. address correction.
  • iWay Software, New York, New York, — provides data profiling and data cleansing functionality via its iWay Data Profiler and iWay Data Quality Center products.
  • Ixsight, Mumbai, India, — offers services for data quality audits, along with products for standardization and deduplication.
  • Kroll-Software, Altdorf, Switzerland, — provides deduplication software, both as the packaged FuzzyDupes product and as a toolkit for software developers.
  • Mastersoft Research, Sydney, Australia, — offers the Harmony Data Quality product, which uses a reasoning engine approach to matching, linking and other data quality operations.
  • Melissa Data, Rancho Santa Margarita, California, — supports standardization of names, addresses and phone numbers, and validation of addresses and phone numbers (both via on-premises software and hosted Web services).
  • Omikron, Pforzheim, Germany, — provides products for standardization and deduplication of customer name and address data.
  • Oracle, Redwood Shores, California, — via the acquisition of Silver Creek Systems, provides parsing, standardization and matching functionality, with a focus on product data applications.
  • Posidex Technologies, Hyderabad, India, — the PrimeMatch Data Quality System provides data profiling, standardization, identity resolution, monitoring and enrichment.
  • QAS (a subsidiary of Experian), London, U.K., — offers global name and address standardization, validation and matching/deduplication functionality.
  • Runner Technologies, Boca Raton, Florida, — provides a development component for verifying and standardizing addresses for Oracle database applications.
  • Satori Software, Seattle, Washington, — provides name and address data cleansing as part of its Mailroom Toolkit address management tools.
  • Scarus, Ludwigshafen, Germany, — offers the IntelliCleaner suite of products, for parsing, deduplication and standardization functionality, with a focus on name and address data.
  • Sigma Data Services, Alcorcón, Madrid, Spain, — provides data profiling, normalization and deduplication of names, addresses and phone numbers.
  • Spad, Paris, France, — offers a suite of data quality products for data profiling, monitoring and standardization.
  • SQL Power, Toronto, Canada, — provides open-source tools supporting standardization, address validation and deduplication.
  • Talend, Suresnes, France, — provides open-source products for data profiling, cleansing and enrichment.
  • TIQ Solutions, Leipzig, Germany, — provides data profiling and data quality dashboards, with a focus on the banking, insurance and distribution verticals.
  • Utopia, Mundelein, Illinois, — offers services and technology for data quality analysis and data standardization, with a focus on product master data.
  • Veda Advantage, Sydney, Australia, — provides software to cleanse and update customer addresses, add phone numbers, merge databases into a single customer view and append segmentation data.
  • WinPure, Reading, U.K., — offers low-cost data cleansing, matching and data deduplication software on the Windows platform.
  • X88 Software, Reading, U.K., — provides data profiling, cleansing, and standardization along with discovery and analysis tools with its Pandora product.

Gartner will continue to monitor the status of these vendors for possible inclusion in future updates of the Magic Quadrant for Data Quality Tools.


Added

No new vendors have been added to the Magic Quadrant since the previous version.


Dropped

Netrics was removed from the Magic Quadrant because it no longer meets the functional inclusion criteria — specifically, its sole focus on matching and its lack of data profiling and visualization capabilities do not address the full breadth of requirements in this market.

Evaluation Criteria

Ability to Execute

Gartner analysts evaluate technology providers on the quality and efficacy of the processes, systems, methods or procedures that enable IT providers' performance to be competitive, efficient and effective, and to positively affect revenue, retention and reputation. Ultimately, technology providers are judged on their ability to capitalize on their vision, and their success in doing so.

We evaluate vendors' ability to execute in the data quality tools market by using the following criteria:

  • Product/Service. How well the vendor supports the range of data quality functionality required by the market, the manner (architecture) in which this functionality is delivered, and the overall usability of the tools. Product capabilities are critical to the success of data quality tool deployments and, therefore, receive a high weighting.
  • Overall Viability. The magnitude of the vendor's financial resources and the strength of its people and organizational structure. In this iteration of the Magic Quadrant we have retained a high weighting for this criterion to reflect buyers' continuing concerns over the risk associated with vendors as a result of current economic conditions.
  • Sales Execution/Pricing. The effectiveness of the vendor's pricing model and the effectiveness of its direct and indirect sales channels.
  • Market Responsiveness and Track Record. The degree to which the vendor has demonstrated the ability to respond successfully to market demand for data quality capabilities over an extended period.
  • Marketing Execution. The overall effectiveness of the vendor's marketing efforts, and the degree of "mind share," market share and account penetration the vendor has achieved as a result. Given the increasingly competitive nature of this market and the constant entry of new vendors, both large and small, we increased the weighting of this criterion in this iteration of the Magic Quadrant.
  • Customer Experience. The level of satisfaction expressed by customers regarding the vendor's product support, professional services, and overall relationship with the vendor, as well as the customers' perceptions of the value of the vendor's data quality tools relative to costs and expectations. In this iteration of the Magic Quadrant we have again increased the weighting of this criterion to reflect the substantially greater scrutiny that buyers are placing on these considerations as a result of economic conditions and budgetary pressures. Analysis and rating of vendors against this criterion are driven directly by results of the customer survey executed as part of the Magic Quadrant process.

Table 1 gives our weightings for the Ability to Execute evaluation criteria.

Table 1. Ability to Execute Evaluation Criteria

Evaluation Criteria
Product/Service
Overall Viability (Business Unit, Financial, Strategy, Organization)
Sales Execution/Pricing
Market Responsiveness and Track Record
Marketing Execution
Customer Experience

Source: Gartner (June 2010)


Completeness of Vision

Gartner analysts evaluate technology providers on their ability to convincingly articulate logical statements about current and future market direction, innovation, customer needs and competitive forces, as well as how they map to the Gartner position. Ultimately, technology providers are assessed on their understanding of the ways that market forces can be exploited to create opportunities.

We assess vendors' completeness of vision for the data quality tools market by using the following criteria:

  • Market Understanding. The degree to which the vendor leads the market in new directions (technology, product, services or otherwise), and its ability to adapt to significant market changes and disruptions. Given the dynamic nature of this market, this item receives a high weighting.
  • Marketing Strategy. The degree to which the vendor's marketing approach aligns with and/or exploits emerging trends and the overall direction of the market.
  • Sales Strategy. The alignment of the vendor's sales model with the way that customers' preferred buying approaches will evolve over time.
  • Offering (Product) Strategy. The degree to which the vendor's product road map reflects demand trends in the market, fills current gaps or weaknesses, and includes developments that create competitive differentiation and increased value for customers. We also consider the breadth of the vendor's strategy regarding a range of delivery models for products and services, from traditional on-premises deployment to SaaS and cloud-based models.
  • Business Model. The overall approach the vendor takes to execute its strategy for the data quality market. With a reasonably high degree of similarity across the vendors in this market, this item receives a low weighting.
  • Vertical/Industry Strategy. The level of emphasis the vendor places on vertical solutions, and the vendor's depth of vertical expertise. Given the broad cross-industry nature of the data quality discipline, vertical strategies are less critical, so this item receives a low weighting.
  • Innovation. The degree to which the vendor has demonstrated a willingness to make new investments to support its strategy and enhance its product capabilities, the level of investment in R&D directed toward development of the tools, and the extent to which the vendor demonstrates creative energy.
  • Geographic Strategy. The global presence of the vendor and the manner in which it is achieved (for example, direct local presence, resellers and distributors), and the vendor's strategy for expanding its reach into markets beyond its home region/country.

Table 2 gives our weightings for the Completeness of Vision evaluation criteria.

Table 2. Completeness of Vision Evaluation Criteria

Evaluation Criteria
Market Understanding
Marketing Strategy
Sales Strategy
Offering (Product) Strategy
Business Model
Vertical/Industry Strategy
Innovation
Geographic Strategy

Source: Gartner (June 2010)



Leaders

Leaders in the market demonstrate strength across a complete range of data quality functionality, including profiling, parsing, standardization, matching, validation and enrichment. They exhibit a clear understanding and vision of where the market is headed, including recognition of non-customer data quality issues and the delivery of enterprise-level data quality implementations. Leaders have an established market presence, significant size and a multinational presence (directly or as a result of a parent company).


Challengers

Challengers in the market provide strong product capabilities but may not have the same breadth of offering as Leaders. For example, they may lack several of the functional capabilities of a complete data quality solution. Challengers have an established presence, credibility and viability, but may demonstrate strength only in a specific domain (for example, only customer name and address cleansing), and/or may not demonstrate a significant degree of thought leadership and innovation.


Visionaries

Visionaries in the market demonstrate a strong understanding of current and future market trends and directions, such as the importance of ongoing monitoring of data quality, engagement of business subject matter experts and delivery of data quality services. They exhibit capabilities aligned with these trends, but may lack the market presence, brand recognition, customer base and resources of larger vendors.

Niche Players

Niche Players often have limited breadth of functional capabilities and may lack strength in rapidly evolving functional areas such as data profiling and international support. In addition, they may focus solely on a specific market segment (such as midsize businesses), limited geographic areas or a single domain (such as customer data), rather than positioning themselves toward broader use. Niche Players may have good functional breadth but may have an early-stage presence in the market, with a small customer base and limited resources. Niche Players that specialize in a particular geographic area or data domain may have very strong offerings for their chosen focus area and deliver substantial value for their customers in that segment.

Vendor Strengths and Cautions


Datactics

Belfast, U.K.

Products: Datactics v4 (formerly known as DataTrawler), Data Quality Manager

Customer base: 100 (estimated)

Strengths

  • Datactics is a data quality vendor that operates primarily in Europe. It also operates a sales office in the U.S. and continues to maintain a number of value-added resellers (VARs) in the Americas and Asia. Its software is used in a range of subject areas, beyond the typical customer data validation scenarios, and many references appreciate the ease of use of the Datactics platform, particularly for non-technical staff.
  • The company's flagship product, Datactics v4, is fully 64-bit and Unicode-enabled, supports most European languages, runs on many platforms and supplies broad capabilities in profiling, matching/merging, cleansing and monitoring. Data quality scorecards can be constructed to monitor quality-related metrics. Data Quality Manager enables Datactics data quality functions to be exposed as services and orchestrated into workflows driven by business users. Most of Datactics' reference customers are small and midsize businesses, with a focus on the manufacturing sector, as well as government agencies. Reference customers use the Datactics product mostly for MDM, system migration and embedding in business applications.
  • Datactics has partnerships with consultancies and system integrators (SIs) outside its U.K. base that have used the Datactics v4 product in some strategic data quality programs. The vendor's own professional services and support function has received accolades from the surveyed reference customers.

Cautions

  • Under the new Datactics management, and after a successful investment round, a new sales and marketing strategy is unfolding, but the additional funds of approximately £1.8 million will not be enough to allow it to leapfrog more financially stable competitors. With only six sales employees, limited marketing budgets and relatively low-profile partnerships, Datactics is "flying under the radar" for most organizations looking for a provider of data quality tools.
  • Datactics continues to invest most of its funds into technology enhancements, such as new algorithms, or a new concept named "predictive data quality" (patent pending). However, Datactics is largely bypassing the overarching industry trends, such as data integration/data quality convergence and SaaS or cloud-based deployment and delivery models, while many of its competitors are actively expanding in these areas.
  • Although Datactics has signed up VARs in markets such as Brazil, Hong Kong and Turkey, there is no traction to report in those regions, and all major sales or partnering opportunities remain mostly in English-speaking countries. Datactics must build a stronger independent software vendor (ISV) partner network to establish itself in new markets and attract new customers.


DataFlux

Cary, North Carolina, U.S.

Products: dfPower Studio, DataFlux Integration Server, DataFlux Accelerators

Customer base: 2,300 (estimated)

Strengths

  • DataFlux continues to drive broad data quality initiatives, from BI and data warehousing to MDM and migration, and is showing impressive growth. DataFlux has added a significant number of net new customers, and its growth also benefits from Project Unity, through which the vendor inherits data integration customers from SAS, the parent company. DataFlux has one of the highest ratios of reinvesting revenue in R&D and enjoys a maintenance renewal rate of over 95%. Project Unity is consistent with the ongoing market trend of the convergence of data quality, data integration and master data management.
  • DataFlux's capabilities include profiling, matching, cleansing, monitoring and metadata management in a single platform. The ability to run SAS code within a DataFlux data flow is a unique strength of the DataFlux platform. New platform developments with version 2.1 of the DataFlux Data Management Platform include the integration of the DataFlux Federation server, a new collaboration environment for data stewards, and new data quality assessment and visualization capabilities. The vendor is continuously pushing the boundaries of its data quality platform and the market in general, leading to its favorable position again this year.
  • The vendor continues to build out "accelerators," such as Customer Data Analysis or Materials Data Classification, and is praised by its customer references for the usability of its tools (particularly for non-technical staff), their easy installation and the integration of the toolset. Technical support and professional services continue to be ranked among the highest in the customer survey.

Cautions

  • While Project Unity is slowly gaining recognition in the market, DataFlux still lacks the brand recognition of a data management infrastructure provider beyond data quality. DataFlux will need to expand its marketing scope to gain that recognition and to compete with much larger infrastructure competitors, such as IBM, SAP or Informatica.
  • While DataFlux tools are reported to be easy to use, due to the breadth of functionality, some customers report that the learning curve is fairly steep and skilled personnel are difficult to find. Customers would, in particular, like to see more usage examples and "how to" documentation with tips and tricks. DataFlux customers continue to struggle with the overall high price of the software.
  • The majority (88%) of recent customer references work in customer data (for example, name and address cleansing and matching), while other domains represent a substantial minority. However, examples of multidomain use are common across the DataFlux customer base.


DataLever

Boulder, Colorado, U.S.

Products: DataLever

Customer base: 150 (estimated)

Strengths

  • DataLever provides support for the core requirements of data quality, providing integrated data-profiling and data-cleansing functionality in a single product. All operations can be readily deployed in both batch and real-time modes. The vendor has focused on delivering the fundamental capabilities required in virtually all data quality projects (such as parsing, standardization and cleansing, and more recently Unicode support), rather than attempting to expand the scope of the data quality discipline or innovate in new functional areas.
  • DataLever takes a domain-agnostic view of data quality issues, enabling its technology to be applied in various data domains, including customer and product. While most of its installed base applies DataLever's technology to customer data quality issues, customer references reflect a healthy percentage of implementations in other areas. In addition, the customer base is active in applying the technology to a range of use-cases, from BI and data warehousing to data migrations, MDM and others.
  • Customers cite overall ease of use, relatively short implementation times and the lower cost compared with alternative offerings as the main selling points of DataLever's products. The attractive cost footprint is well suited to the current economic and market conditions. Strong performance in scenarios with large data volumes, as demonstrated by customer references, is helping DataLever to succeed in competitive situations. In addition, the relatively low complexity of the product means that it can be used by business subject matter experts, as well as IT personnel. As a result of these characteristics, the vendor's mind share in the market is slowly increasing, but only in North America.

Cautions

  • As one of the smaller and privately held providers in the market, DataLever supports a small customer base, with very limited presence outside North America as compared to its competitors. DataLever's technology has traditionally been adopted mostly by midsize businesses, and the most recent set of customer references was heavily oriented (over 65%) toward organizations with 1,000 or fewer employees. However, the vendor claims to be increasingly attracting large enterprises and executing much larger deals ($500,000 and more).
  • To date, DataLever has focused solely on the on-premises deployment of its software, although the vendor states it is developing SaaS capabilities and determining the pricing model around them. The vendor does not articulate a clear product road map with solid milestones, detracting from its ability to present a strong vision and understanding of the market.
  • The vendor's lack of significant partnerships with SIs and complementary software vendors will limit its competitive strength — this represents a substantial challenge in the current market conditions, where buyers perceive greater risk in smaller vendors. This is reflected in prospect feedback, in which concerns about market presence, lack of skills availability, and limited recognition and credibility were seen as the most common reasons for disqualifying DataLever from consideration.


DataMentors

Wesley Chapel, Florida, U.S.

Products: DataFuse, ValiData, NetEffect

Customer base: 100 (estimated)

Strengths

  • DataMentors specializes in customer data quality applications, providing matching, linking, standardization and cleansing operations via its DataFuse product (and the real-time version called NetEffect), and data profiling capabilities via ValiData. Its partnership with smartFocus enables the vendor to offer campaign management, analytics and mapping capabilities (branded as DataMentors' PinPoint). The vendor's roots are in database marketing, with the management team having been involved in large-scale applications of this type for more than 20 years.
  • Customer references are predominantly in the financial services vertical, although the vendor is increasing its focus on the healthcare, hospitality and publishing industries. Customers cite accuracy of matching, ease of use and attractive pricing relative to some of the more prominent vendors in the market as key functional strengths. Coupled with a perception of the vendor's deep industry experience, these are main reasons for their selection of DataMentors' technology. The most significant milestone in the product road map is the delivery of DataFuse v6.0, planned for 1Q11. This version will focus largely on enterprise-scale capabilities such as multiserver deployments, enhanced security models and cloud-based delivery.
  • The vendor's customer base reflects a higher percentage of hosted (SaaS) implementations than is seen for any other vendor in this market. DataMentors estimates that more than half its customers are using its technology in a hosted manner and that nearly all new customers are deploying the technology in a SaaS model. This is reflected in the vendor's customer references, with over 60% reporting off-premises deployments.

Cautions

  • DataMentors' 2009 revenue and customer base growth were below market averages. With a small installed base and limited resources for marketing, the vendor will be challenged to gain mind share in a market increasingly populated by much larger providers. This was apparent in a cross-industry sample of over 250 data quality tools buyers, in which only 2% considered DataMentors in competitive evaluations. In addition, while the vendor's attractive cost model and ease of use are well suited to market demand, as one of the smallest competitors in this market it will face challenges as the current economic conditions increase buyers' desire for large providers with extensive financial resources.
  • DataMentors has chosen to focus primarily on customer data quality issues and satisfying business user demand for rapid deployment of solutions. While these areas represent current opportunity, the narrower focus relative to larger competitors could place DataMentors at a competitive disadvantage. The vendor's customer references reflect an increasing number of examples where the technology is used in product data quality and financial data quality applications, and the vendor acknowledges growing demand in these areas. DataMentors' product road map largely consists of technical enhancements, lacking vision for developing areas of the data quality discipline such as data quality visualization, data stewardship and data quality policy management.
  • From a product functionality perspective, DataMentors has weaknesses in runtime platform support (Windows is the only deployment option, although DataFuse can interact with applications and data sources on other platforms), and the vendor plans to address the current lack of Unicode support in mid-2010 with the release of DataFuse v5.2. Customer references reflect very limited usage in real-time scenarios and few examples of multiproject or enterprisewide deployment.


Datanomic

Cambridge, U.K.

Products: dn:Director

Customer base: 160 (estimated)

Strengths

  • Datanomic continues to establish itself in the European data quality tools market and is demonstrating impressive growth while being profitable. The vendor now has more than 150 customers, most of which are in the U.K., including some major accounts that use Datanomic as their enterprise solution. Besides the European market traction, Datanomic is expanding into North America, where the vendor has opened its first office in New York, and in Asia, where Datanomic has signed a distributor agreement.
  • The Web services capability enables dn:Director users to rapidly deploy data quality components, such as matching or cleansing, into SOA environments. Datanomic has also released new extension packs, for customer data and sanctions and politically exposed persons (PEPs), enabling customers to speed up the time to production. Datanomic has also released a new processor to check external business rules, a new case management application for matching with associated workflows, and added integration with Lightweight Directory Access Protocol (LDAP) and Active Directory.
  • About three-quarters of Datanomic's customers come from the financial and telecommunications industries and the public sector, and the vendor has a strong focus on those areas. Datanomic's products are domain-agnostic and not specifically targeted at customer data; although customer data remains the dominant focus, a growing share of surveyed customer references reported a focus on non-customer data. At the same time, customer references indicate very high satisfaction with the professional services and support from Datanomic, and cite ease of use as one of dn:Director's particular strengths.

Cautions

  • Although dn:Director is built on an SOA, customer references describe the product as hard to integrate into other environments. Hardly any references report using the product outside customer/party data domains and address cleansing.
  • Although dn:Director is built in a services fashion, Datanomic has not visibly started to offer its data quality solution in a SaaS model. Almost all customer references indicate that they installed Datanomic's products on-premises. Also, the built-in data quality dashboard does not seem to live up to customer expectations, as references describe it as "insufficiently developed" and of "limited functionality."
  • Datanomic has been unable to capitalize on the international reach of its SI partners, some of which are very large, leaving it with virtually no visibility outside its home market in the U.K. In addition, Datanomic's relatively small size and market presence remain significant challenges in the face of economic conditions in its home market and increasing competitive pressure from much larger application and infrastructure providers.

Human Inference

Arnhem, The Netherlands

Products: HIquality Suite, HIquality Name Worldwide, HIquality Identify, HIquality Data Improver

Customer base: 275 (estimated)

Strengths

  • Human Inference provides data quality solutions to customers primarily in the European financial services industry, but also has clients in the public sector and telecommunications. Human Inference is a well-known brand for data quality software in Europe, particularly in its core markets in Benelux and Germany, and the vendor has started to expand further into the Nordic and Iberian regions with new offices in Stockholm and Madrid.
  • The components of the HIquality product set include technology for inspection and profiling, name and address cleansing, matching, merging and enrichment. One of Human Inference's key differentiators, described as a major strength by reference customers, is that it maintains reference datasets, which are available for select countries and which serve as knowledge bases for names, addresses, cultures and other specific meanings from a variety of contexts. Human Inference has started to focus on provisioning data quality through SaaS, which makes HIquality more attractive as an embedded component in business processes.
  • A large portion of Human Inference's customer base has gone beyond batch processing; now, real-time matching, address validation and cleansing are the top use-cases reported by the reference customers. Human Inference's reference customers show a healthy diversity of data quality use-cases, from operational applications and information governance to MDM, BI and data migration.

Cautions

  • Although Human Inference provided "update packs" for users of versions 4 and 5 of the software, a high proportion of reference customers had not upgraded to the latest available release of HIquality. Some customers reported a reluctance to migrate to the latest version of the product because of complexity in their current versions of the platform and the vendor's attempt to sell full bundles at a higher cost. Many reference customers continue to struggle to find skilled service personnel, particularly to help with the reportedly difficult product configuration. However, customers that have moved to the latest version report improvements in ease of use. The other main criticism is the overall software pricing, which is perceived as very high.
  • Human Inference's partner channel strategy does not yet show a lot of traction, though the vendor has formed a dedicated channel sales team. The number of OEM and reseller partnerships with SIs and ISVs continues to be very small, as the vendor relies heavily on its own direct sales channel. While Human Inference still has a stronghold in its core geography, particularly the Benelux countries, it will experience greater competitive pressure from the large infrastructure vendors.
  • While the market leaders continue to expand their focus of data quality use-cases beyond the customer domain, Human Inference deployments seem to remain in this area, according to reference customers.


IBM

Armonk, New York, U.S.

Products: Information Analyzer, QualityStage

Customer base: 2,000 (estimated)

Strengths

  • IBM has successfully embedded its data quality products portfolio into its broader Information On Demand message. By promoting IBM's platform vision, ubiquitous data quality functionality becomes a key component of the information management portfolio. Backed by one of the world's best-known brands and strong sales, consulting, service and support functions, IBM approaches the data quality space in a very holistic manner.
  • Information Analyzer (discovery, profiling and analysis) and QualityStage (parsing, standardization and sophisticated matching) continue to be positioned as enterprisewide data quality standards, and are being used in multiple projects in customer organizations. IBM's customers have started to use its data quality products in multiple data domains, beyond customer data, and the company shows some of the greatest diversity of data domains in this roundup. It has started to leverage the adjacent product sets (for example, from Cognos and Exeros) for its InfoSphere portfolio, in particular in the data governance area.
  • Reference customers report high satisfaction with the scalability and performance of the solution. Also, customers praised the intuitive user interface of the data quality products and the integrated nature of the solution across the various modules, including profiling, matching, cleansing and metadata management.

Cautions

  • IBM's Information On Demand message, and the newer "information agenda" and "smarter planet" themes, have not demonstrated direct relevance to data quality and have the potential to dilute the focus on it. While data quality is part of the overall message and IBM initiated a data quality community with its Data Governance Council, mind share in the market is growing relatively slowly. In particular, for organizations that want to focus on data quality as a separate initiative to solve a specific problem, the grand Information On Demand theme is likely to be seen as overkill. Still, in large enterprise deals, particularly those led by the IBM consulting and services organization, IBM's data quality products are always a contender.
  • The high price points of IBM's products relative to its competition represent a challenge for IBM. Customer references reported fairly low satisfaction with the pricing model and relative value of the products. Some customers expressed dissatisfaction with the pricing of individual modules, and with the lack of availability of qualified resources for implementation and support.
  • Although reference customers like the integrated nature of IBM's data quality product set, they often struggle to integrate IBM's products with third-party components. The complexity of the overall system is mentioned as a difficulty, also indicated by the fact that just under half the reference customers are running the latest version of IBM's data quality products.


Informatica

Redwood City, California, U.S.

Products: Data Explorer, Data Quality

Customer base: 1,000 (estimated)

Strengths

  • Informatica has established itself as a strong provider of data quality solutions in the market with strong growth figures, particularly in Europe, the Middle East and Africa (EMEA) and Asia/Pacific. The vendor added a significant number of large data quality deals to its installed base, many of which are net new customers. In addition, cross-selling of data quality tools to the existing PowerCenter installed base works well for Informatica. The installed base of its core data quality products (Informatica Data Quality [IDQ] and Informatica Data Explorer [IDE]) has broken through the 1,000-customer barrier, and a large proportion of customers consider Informatica's tools their data quality standard.
  • Informatica's data quality tools portfolio includes strong data profiling functionality (Data Explorer) and domain-agnostic parsing, standardization and matching capabilities (Data Quality), and with Informatica Data Quality 9, the vendor delivered a single unified profiling and cleansing platform. In 2009, Informatica upped the ante for its competition by acquiring longtime partner AddressDoctor and, more recently, it entered the MDM market through the acquisition of Siperian, supporting the market push of convergence among data quality and MDM. Following a significant reorganization, data quality is now treated as a "first-class citizen" within Informatica.
  • Customer references reported high satisfaction with the integrated nature of Informatica's data quality products with the vendor's flagship data integration solution, PowerCenter. Customers have also expanded the range of data quality domains in which they are using the tools, beyond customer data and into, for example, product data, financial data and healthcare data. While still new, Informatica Cloud 9 is the vendor's innovative approach to data quality (and data integration) in the cloud, providing solutions for back office and master data synchronization, migration and replication.

Cautions

  • Early version 9.0 reference customers are struggling with some of the complexities of the product; in particular, workflow integration, security handling and rule management. The lack of robust data quality reporting is mentioned regularly as a weakness, and some customers are struggling with the efforts associated with the upgrade to version 9.0. Informatica's 9.01 release (June 2010) is intended to address those issues by delivering an automated upgrade utility for IDE and IDQ, as well as additional reporting features.
  • Informatica continues to be challenged in its indirect sales channel for data quality products, because longtime infrastructure and applications partners have either acquired data quality technology themselves or are looking for other vendors for complementary data quality technology, as they now compete against Informatica in the data integration tools market. However, the AddressDoctor acquisition represents a new and significant indirect channel for Informatica within the CRM and data quality tools markets.
  • Informatica is increasingly competing against much larger infrastructure vendors with broader product sets for comprehensive data management technologies, including DBMS, BI and other capabilities. These vendors represent a significant competitive threat, since they are incumbents for many of the customers and prospects Informatica is targeting with its data quality tools message. Still, most customer references use Informatica data quality tools in a BI, migration or information governance context, and a growing number of customers also reported usage in combination with an MDM initiative.

Innovative Systems

Pittsburgh, Pennsylvania, U.S.

Products: i/Lytics Enterprise Data Quality, i/Lytics Data Profiler, FinScan

Customer base: 600 (estimated)

Strengths

  • Innovative Systems has competed in this market longer than most other vendors, with a history spanning nearly 35 years. Innovative's i/Lytics platform provides proven capabilities based on its deep experience in customer data matching and cleansing applications. i/Lytics provides strong support for both mainframe and distributed platforms, and enables data quality functionality to be exposed via service interfaces. Customer references indicate strong usage of the technology in large-scale batch deployments, along with a mix of other use cases: BI and data warehousing, data migrations, and MDM initiatives.
  • Complementing its financial services experience, Innovative continues to focus on its FinScan compliance watchlist-screening offerings, an area that is showing continued strong demand. Innovative is placing more emphasis on delivering i/Lytics functionality in a SaaS model, in line with a growing trend toward hosted and hybrid (a combination of on-premises and hosted) deployments in this market. Innovative's customer references include examples of both delivery models, although the vast majority (more than 75%) represent traditional on-premises software deployments.
  • Innovative's customer base reflects the vendor's strong experience in the banking and insurance industries — the financial services verticals comprise nearly 90% of the vendor's customers — although the rest of the customer base does include organizations in a variety of other industries such as retail, government and manufacturing. Customers routinely cite the vendor's deep experience in data quality, ability to deal with complexity, industry-specific experience (generally in financial services), and ability to understand business needs (not just technical requirements) as the main reasons for selecting Innovative in competitive evaluations. While slightly more than half of its revenue is derived from North America, Innovative also supports customers in Europe and has a more significant presence in Latin America than most of its competition. Customer references report a very positive service and support experience, and success with multiproject deployments.

Cautions

  • With a strong emphasis on customer data quality issues, Innovative will be challenged to win new business or expand its presence in existing accounts when multidomain data quality capabilities are required. Customer references reflect very limited use of the technology in other data domains (less than 20% of the customer reference set was applying the technology to products/materials, financials, or other domains). Since market demand for multidomain support is already significant and growing, Innovative will need to rapidly address this weakness to improve its market presence beyond Niche Players status. In addition, without its own technology or deep partnerships in the closely related market of data integration tools, Innovative is at a disadvantage relative to its major competitors as buyers increasingly seek broader data management capabilities.
  • Innovative's product road map includes mostly technical enhancements to existing functionality, although the vendor states it is planning delivery of technology focused on call center applications. The vendor's data profiling and quality visualization capabilities continue to see limited market adoption, with a small fraction of customer references having adopted this functionality. Some of those customers that are using the profiling functionality cite this as an area of weakness. In addition, while Innovative's technology can support multilingual data, the lack of full Unicode capabilities becomes an increasing challenge as demand expands globally.
  • Innovative continues to struggle with gaining direct sales traction, rarely appearing in the competitive evaluations executed by prospective buyers. Lack of marketing visibility and brand recognition will be significant challenges for Innovative to address. More recently, the vendor has placed an increased emphasis on indirect sales, adding OEM and reseller partnerships both in North America and in other regions, as a way to address this issue.

Pitney Bowes Business Insight

Stamford, Connecticut, U.S.

Products: Spectrum Technology Platform

Customer base: 2,400 (estimated)

  • Pitney Bowes Business Insight (PBBI) has expanded beyond its traditional experience base of "customer data quality," by evolving its technology toward multiple data domains including location and more recently product data. The renaming of the product set to Spectrum Technology Platform is a significant step by the vendor to shed the perception of a "customer data only" capability, and recent customer interactions reflect growth in deployments involving non-customer/party data domains. Regardless, PBBI remains recognized for its strength in global name and address standardization and validation, matching-related capabilities (including linking and deduplication) and geocoding. This functionality is supported on a range of platforms, including the mainframe. In addition, PBBI is increasing its focus on off-premises (SaaS and cloud-based) deployment models, with a small but growing segment of the customer base leveraging these capabilities.
  • PBBI's rich capabilities for handling "location" data represent a significant differentiator and contributed to growth during 2009. Specifically, support for complex location-related data types, functionality for quality-assuring them, and support for geographic analytical operations represent areas of growing demand. For those customers moving to the newer versions of the technology on distributed platforms, the modular approach to the architecture and licensing of the Spectrum platform is seen as a positive.
  • PBBI retains a large installed base, making it one of the market-share leaders for data quality tools. The vendor's large scale and global footprint give it greater stability in comparison with many competitors of much smaller stature. Its revenue reflects an installed base that is very North-American-centric, with large enterprises making up most of its customers. Implementations reflect a range of use-cases with a bias toward data quality controls in operational applications and BI/data warehousing scenarios.

  • While PBBI is attempting to adapt its product capabilities and messaging to address multidomain data quality requirements, the overwhelming majority of customers limit their deployments to customer/party data. Recent customer references reflected a small minority (approximately 12%) of projects involving product/materials data, which is an increase from prior such analyses. PBBI established a partnership with Silver Creek Systems to more strongly support non-party data domains. While that vendor's acquisition by Oracle calls into question the longer-term viability of the relationship, to date no changes to the partnership have occurred and none are planned. PBBI had also established partnerships with numerous MDM solution vendors and is pursuing others. Recent acquisitions of some of the existing MDM partners (such as Initiate and Siperian) and entry of others into the data quality tools market (such as Oracle) raise "coopetition" challenges that PBBI must navigate and manage over time.
  • The vendor continues to see extremely limited adoption and use of its profiling, visualization and monitoring functionality, and customer references cite this as an area of weakness. Lack of proof points in this regard represents a substantial weakness for PBBI, since these are among the most rapidly growing areas of demand in the market. In addition, customers assessed the ease of installation and use of the technology as requiring improvement, citing a need for technically deep skills and frequent assistance from the vendor's product support services.
  • While PBBI offers a range of pricing models and options, mainframe-based customers (which represent the core of its customer base) continue to report challenges in negotiating the cost of upgrades and ongoing support/maintenance, as well as working through renegotiations of enterprise licenses, specifically for PBBI's older mainframe products.

SAP BusinessObjects

Walldorf, Germany

Products: Data Quality Management, Data Insight, Data Services

Customer base: 5,000 (estimated)

  • SAP BusinessObjects provides good breadth of functional data quality capabilities, including data profiling and common data cleansing operations, which can be applied in diverse environments. The core data quality functionality in Data Quality Management enables the delivery of data quality services in an SOA context, and is used in the Data Services product (which combines data integration and the Data Quality Management functionality). Consistent with growing market demand for tightly integrated data integration and data quality functionality, Data Services is seeing increased adoption by SAP BusinessObjects customers. A majority of recent customer references reported they are using the data quality functionality via an implementation of Data Services. The vendor's product road map reflects a vision for expanding its data quality capabilities toward a broader data governance positioning, enabling data stewards and other non-IT roles to participate in the development of policies and rules.
  • SAP BusinessObjects has a substantial BI platform market presence and a large base of data quality tools customers. This creates significant cross-sell opportunities for the vendor to increase its data quality tools business. As a part of SAP, the vendor's growth prospects are further expanded via access to the global SAP applications customer base, where data quality challenges are prevalent. In particular, SAP BusinessObjects' data quality tools complement SAP's MDM offerings, which have been lacking rich data quality functionality. Customers perceive tight integration with SAP business applications and performance of the tools as strengths.
  • SAP BusinessObjects' strength in this market remains in applications of customer/party data quality, specifically in matching/linking, deduplication and name and address standardization and validation. The technology is proven for applications of this type and such implementations represent the vast majority (over 90% based on surveys of reference customers) of the installed base. In recent interactions with SAP BusinessObjects customers, more examples of use in non-party domains (for example, products and materials) are present, though these uses still represent a small minority of the overall customer activity.

  • Customer deployments continue to reflect relatively few cases where the technology is being applied in data domains beyond customer data (and similar "party"-oriented subject areas such as suppliers or employees), and customers rate the support for other domains as weaker than that provided by many other competitors. This is in contrast to adoption trends across the market as a whole, and something that SAP BusinessObjects must continue to improve in order to support the needs of SAP applications and MDM customers. To address these challenges, the vendor states that it will deliver "next generation" non-party functionality during the first half of 2011.
  • Data profiling remains an area of weakness for SAP BusinessObjects. The Data Insight product continues to show slow market adoption and customer references report limited use and significantly lower levels of satisfaction with the functionality, compared with the profiling offerings of most competitive vendors. SAP BusinessObjects' product road map calls for delivery of "next-generation" capabilities by YE10, continuing for the near term to leave a substantial gap in an area that is seen as a high priority by most buyers.
  • Customer references continue to raise concerns regarding the quality and timeliness of product support, the pricing approach, the price-value ratio of SAP BusinessObjects' data quality offerings, and their overall experience in the relationship with the vendor. Many customers report challenges with product support, which often degrades perceptions of the tools and other aspects of the vendor relationship. Although these perceptions are not unique to SAP in the data quality tools market, the vendor is attempting to address these issues by allocating dedicated resources to focus initially on the overall customer experience, starting by correcting product support challenges.

Trillium Software

Billerica, Massachusetts, U.S.

Products: Trillium Software System, TS Discovery, TS Insight

Customer base: 900 (estimated)

  • Trillium Software, a division of marketing services provider Harte-Hanks, provides a broad suite of data quality tools, including data profiling (TS Discovery), core data quality components (Trillium Software System) and a data quality dashboard offering (TS Insight). Its data enrichment capabilities are focused on customer data (addresses, geocoding and watchlist compliance). Trillium is attempting to expand its positioning and capabilities beyond core data quality functions toward a broader data governance vision, offering a combination of technology and professional services aimed at data governance initiatives in the financial services industry.
  • Trillium continues to enjoy strong brand recognition and customer retention, appearing in competitive evaluations by new buyers with the greatest frequency among all vendors in the market. The customer base reflects a diversity of use cases, with growing adoption in MDM initiatives and in support of data governance programs. The vast majority of customer references (over 90%) are using the technology in the customer/party data domain, with a small but growing set of implementations focused on other domains such as products/materials.
  • Customer references cite functional capabilities (specifically, profiling, base data manipulation operations such as parsing and standardization, matching, and support for real-time deployments), performance, and the vendor's experience in the data quality discipline as their main reasons for choosing Trillium. In addition, customers generally report a positive service and support experience, and a positive overall perception about their relationship with the vendor. Specific initiatives reflected in the vendor's product road map for 2010 include improved functionality for reuse of data quality rules and workflows, and an increased ability to expose rules in an SOA context. In order to expand its multidomain and vertical industry experience, during 2010 Trillium will be introducing a marketplace (called TrilliumApps) where Trillium customers can find and share data quality rules and workflows for various types of data and industry-specific processes.

  • As Trillium targets a non-IT audience and business roles with its vision for data governance, it will need to continue to improve the usability of the technology. Customer references rate Trillium's ease of deployment and use as lacking relative to competitive offerings, with numerous customers citing technical complexity and a steep learning curve. This represents an important area of improvement for Trillium, as the ownership and maintenance of data quality rules will increasingly be a component of business-user roles rather than IT roles.
  • While Trillium continues to leverage and add partnerships (such as Oracle and Syncsort), it is the only Leader in this market that focuses solely on data quality and lacks complementary offerings in overlapping markets such as data integration tools and MDM solutions. Although demand for stand-alone data quality tools remains significant, the attention of buyers continues to shift toward unified data integration, MDM and data quality capabilities. Trillium's data integration tools partners will help it to address this trend, but those same partners will increasingly enter the data quality tools space (for example, Oracle's acquisition of Silver Creek Systems), putting the relationships at risk.
  • Trillium's functionality, marketing and product road map have, in the past, been largely geared toward data quality issues in customer/party data, and many customers and prospects still perceive the vendor in that light. Customer reference interactions show more examples in the product/materials and other domains than in 2009, although the numbers in relative terms remain small and this represents a weakness that Trillium must address. With market demand having become multidomain in nature, various competitors are able to offer more (and more mature) customer examples that support this trend.


Uniserv

Pforzheim, Germany

Products: Data Quality Explorer, Data Quality Batch Suite, Data Quality Real-Time Services, Data Quality Monitor

Customer base: 900 (estimated)

  • Uniserv is the largest pure-play provider of data quality solutions in Europe, with over 40 years of history, longer than any other vendor in this Magic Quadrant. The vendor focuses almost exclusively on customer data, name and address verification and geocoding. About 80% of Uniserv's revenue and customers are in Germany and France, but the vendor has also sold in other European countries, and is showing increased traction in the U.S. The newly formed business unit focusing on SAP integration is demonstrating impressive growth.
  • Uniserv has found solid traction in on-premises data quality solutions, and a number of customer references also report that they are using the vendor's SaaS delivery model. Almost all references report using the vendor's product equally in both batch and real-time processing environments. Uniserv has expanded its product portfolio and through a reseller agreement is now also providing comprehensive data quality monitoring and data profiling with its Data Quality Explorer product.
  • The Uniserv product integrates with all relevant CRM systems, including Microsoft Dynamics, SAP, Siebel, PeopleSoft, and Update. Uniserv is fully Unicode-enabled and is one of very few vendors that supports such a wide variety of system platforms, from all major Windows and Unix/Linux versions to IBM mainframes under z/OS and Virtual Storage Extended (z/VSE), as well as IBM System i and Siemens BS2000.

  • As many organizations start to view data quality as a domain-agnostic issue, Uniserv's strong focus on address standardization and validation will put it at a competitive disadvantage compared with other providers that have a reputation for addressing a broader range of data quality issues; for example, product data or financial data. While Uniserv covers address validation for almost 200 countries, only a few references have reported using Uniserv's product in other data domains. Still, the first customer references (one example being a large German retailer) report using Uniserv to improve the consistency of product data.
  • Uniserv is an established brand for matching, merging, cleansing, and address and bank data verification technologies, but it does not serve increasingly popular areas such as data quality dashboards. In addition, the newly available profiling product shows no traction at this point. Reference customer feedback on Uniserv's technical support, professional services, ease of implementation and pricing is about average, with the occasional praise and complaint.
  • While Uniserv acknowledges the ongoing trend of convergence among the data quality, data integration and master data management markets, the vendor has yet to demonstrate a response. Uniserv's strong concentration on its direct sales force, and its lack of large international alliances with SIs and ISVs that embed Uniserv technology on an OEM basis, put the vendor under increasing pressure from the larger infrastructure providers.

© 2010 Gartner, Inc. and/or its affiliates. All rights reserved. Gartner is a registered trademark of Gartner, Inc. or its affiliates. Reproduction and distribution of this publication in any form without prior written permission is forbidden. The information contained herein has been obtained from sources believed to be reliable. Gartner disclaims all warranties as to the accuracy, completeness or adequacy of such information. Although Gartner's research may discuss legal issues related to the information technology business, Gartner does not provide legal advice or services and its research should not be construed or used as such. Gartner shall have no liability for errors, omissions or inadequacies in the information contained herein or for interpretations thereof. The opinions expressed herein are subject to change without notice.

Acronym Key and Glossary Terms

BI: business intelligence

CRM: customer relationship management

CDQ: customer data quality

ETL: extraction, transformation and loading

ISV: independent software vendor

MDM: master data management

SaaS: software as a service

SI: system integrator

SOA: service-oriented architecture

VAR: value-added reseller

Vendors Added or Dropped

We review and adjust our inclusion criteria for Magic Quadrants and MarketScopes as markets change. As a result of these adjustments, the mix of vendors in any Magic Quadrant or MarketScope may change over time. A vendor appearing in a Magic Quadrant or MarketScope one year and not the next does not necessarily indicate that we have changed our opinion of that vendor. This may be a reflection of a change in the market and, therefore, changed evaluation criteria, or a change of focus by a vendor.

Evaluation Criteria Definitions

Ability to Execute

Product/Service: Core goods and services offered by the vendor that compete in/serve the defined market. This includes current product/service capabilities, quality, feature sets, skills, etc., whether offered natively or through OEM agreements/partnerships as defined in the market definition and detailed in the subcriteria.

Overall Viability (Business Unit, Financial, Strategy, Organization): Viability includes an assessment of the overall organization's financial health, the financial and practical success of the business unit, and the likelihood of the individual business unit to continue investing in the product, to continue offering the product and to advance the state of the art within the organization's portfolio of products.

Sales Execution/Pricing: The vendor's capabilities in all pre-sales activities and the structure that supports them. This includes deal management, pricing and negotiation, pre-sales support and the overall effectiveness of the sales channel.

Market Responsiveness and Track Record: Ability to respond, change direction, be flexible and achieve competitive success as opportunities develop, competitors act, customer needs evolve and market dynamics change. This criterion also considers the vendor's history of responsiveness.

Marketing Execution: The clarity, quality, creativity and efficacy of programs designed to deliver the organization's message in order to influence the market, promote the brand and business, increase awareness of the products, and establish a positive identification with the product/brand and organization in the minds of buyers. This "mind share" can be driven by a combination of publicity, promotional, thought leadership, word-of-mouth and sales activities.

Customer Experience: Relationships, products and services/programs that enable clients to be successful with the products evaluated. Specifically, this includes the ways customers receive technical support or account support. This can also include ancillary tools, customer support programs (and the quality thereof), availability of user groups, service-level agreements, etc.

Operations: The ability of the organization to meet its goals and commitments. Factors include the quality of the organizational structure including skills, experiences, programs, systems and other vehicles that enable the organization to operate effectively and efficiently on an ongoing basis.

Completeness of Vision

Market Understanding: Ability of the vendor to understand buyers' wants and needs and to translate those into products and services. Vendors that show the highest degree of vision listen and understand buyers' wants and needs, and can shape or enhance those with their added vision.

Marketing Strategy: A clear, differentiated set of messages consistently communicated throughout the organization and externalized through the website, advertising, customer programs and positioning statements.

Sales Strategy: The strategy for selling product that uses the appropriate network of direct and indirect sales, marketing, service and communication affiliates that extend the scope and depth of market reach, skills, expertise, technologies, services and the customer base.

Offering (Product) Strategy: The vendor's approach to product development and delivery that emphasizes differentiation, functionality, methodology and feature set as they map to current and future requirements.

Business Model: The soundness and logic of the vendor's underlying business proposition.

Vertical/Industry Strategy: The vendor's strategy to direct resources, skills and offerings to meet the specific needs of individual market segments, including verticals.

Innovation: Direct, related, complementary and synergistic layouts of resources, expertise or capital for investment, consolidation, defensive or pre-emptive purposes.

Geographic Strategy: The vendor's strategy to direct resources, skills and offerings to meet the specific needs of geographies outside the "home" or native geography, either directly or through partners, channels and subsidiaries as appropriate for that geography and market.