Magic Quadrant for Data Integration Tools
25 November 2009

Ted Friedman, Mark A. Beyer, Eric Thoo

Gartner RAS Core Research Note G00171986

Despite challenging economic conditions in 2009, activity in the data integration tools market remained strong. Demand for lower-cost tools and faster time to value, new market entrants, and ongoing consolidation via acquisitions of specialists put pressure on both niche and established vendors.

What You Need to Know

During 2009, vendors in the data integration tools market experienced increased scrutiny from buyers with regard to pricing, cost models, time to implementation, and the quality of service and support. Customer organizations reacted to imperatives to reduce costs and to derive a new interpretation of value from their investments: as demand grows for transparency into how information is used in the current economy, value increasingly means the ability to audit and expose the lineage of data integration efforts. All of these trends created an opportunity for both established vendors and new market entrants with alternative pricing models and/or "good enough" functionality to gain market mind share and traction.

At the same time, larger vendors and those with broader functional capabilities furthered the trend of market consolidation (by acquiring small specialist providers) and support for multiple styles of data delivery in a single toolset. Most of this consolidation has resulted in suite-style delivery, with acquiring vendors focusing first on interoperability via metadata sharing and service calls, and deferring deeper integration until the diverse acquisitions are rationalized for future delivery.

In addition to functional capabilities that map to the specific data integration requirements of contemporary business initiatives (such as master data management [MDM], business intelligence and performance management, and data and system consolidation and modernization), IT leaders focusing on data integration initiatives must also weigh non-product issues: vendor viability given economic conditions, the depth of partnerships, the availability of skills, and the degree of satisfaction with service and support as perceived by the existing customer base.

Magic Quadrant

Figure 1. Magic Quadrant for Data Integration Tools


Source: Gartner (November 2009)

Market Overview

The discipline of data integration comprises the practices, architectural techniques and tools for achieving consistent access to, and delivery of, data across the spectrum of data subject areas and data structure types in the enterprise, to meet the data consumption requirements of all applications and business processes. As such, data integration capabilities are at the heart of the information-centric infrastructure and will power the frictionless sharing of data across all organizational and system boundaries. Contemporary pressures are leading to an increased investment in data integration in all industries and geographic regions. Business drivers, such as the imperative for speed to market and the agility to change business processes and models, are forcing organizations to manage their data assets differently. Simplification of processes and the IT infrastructure is necessary to achieve transparency, and transparency requires a consistent and complete view of the data, which represents the performance and operation of the business. Data integration is a critical component of an overall enterprise information management (EIM) strategy and information infrastructure that can address these data-oriented issues.

With the ongoing evolution of the data integration tools market, separate and distinct submarkets continue to converge, both at the vendor level and the technology level. This is being driven by buyers' demands. Specifically, organizations increasingly acknowledge a diversity of data integration problem types that are supported by equally diverse architectural styles and patterns for data delivery. It is also being driven by vendors' actions — specifically, vendors in individual data integration submarkets organically expanding their capabilities into neighboring areas, and acquisition activity bringing vendors from multiple submarkets together. The result is a progressively maturing market for complete data integration tools that address a range of different data integration styles based on common design tooling, metadata and runtime architecture. This market has supplanted the former data integration tools submarkets, such as extraction, transformation and loading (ETL), and represents the competitive landscape in which Gartner evaluates vendors for placement within this Magic Quadrant. While vendor vision and product road maps increasingly exhibit these characteristics, vendors that supply multiple types of integration techniques generally exhibit an overwhelming strength in one — even those vendors that acquired market leaders in a second submarket.

Gartner estimates the size of the market for data integration tools at approximately $1.34 billion as of the end of 2008, and forecasts growth at a five-year compound annual rate of approximately 9.4%. While the forecast growth has been substantially curtailed due to current economic conditions, this growth rate is very healthy compared with most other software segments. Services revenue from implementations of data integration tools is also growing, with the time and effort required to implement the tools varying widely depending on the scope and complexity of the deployment.

Substantial changes in the positioning of vendors in this iteration of the Magic Quadrant are driven not only by vendors' activities in delivering new product capabilities and their degree of success in targeting contemporary demands, but also by the ongoing evolution of key evaluation criteria. In particular, an increased emphasis on overall viability, customer service and support, pricing approaches, and the customer base's perception of value relative to cost model caused some vendors to show an increased ability to execute, while others experienced a dramatically reduced ability to execute. The resulting vendor positioning reflects a clear stratification of the market into several clusters.

  • Vendors in the strong leadership cluster, such as Informatica and IBM, exhibit a significant emphasis on their roots in bulk/batch-oriented data integration patterns (a core requirement in this market) while providing high-quality service and support to their customers and being perceived as delivering high value. However, the challenges they present to customers regarding costs (high price points and sophisticated products) perpetuate the substantial gap in their Ability to Execute relative to "perfect."
  • A cluster of vendors represented by the arc spanning the Challengers quadrant, the newer and weaker Leaders, and the stronger Visionaries includes many providers that are still building their brand awareness in this market (such as Pervasive Software, iWay Software and SAS/DataFlux), and/or that are at an early stage in evolving their technology toward a comprehensive, well-integrated data integration toolset spanning a range of data delivery styles (Oracle and SAP BusinessObjects exhibit this characteristic). This group of vendors is well positioned to capture substantial market demand as its product capabilities and marketing approaches mature, and because it can leverage brand recognition in wider data management areas.
  • The remaining two clusters of vendors, positioned predominantly in the Niche Players quadrant, reflect one or more of the following characteristics: narrow functionality (for example, a substantial bias toward a single style of data delivery, typically ETL); nascent or declining brand recognition in the market; or an inability to keep pace with evolving buyer demands for other functionality, such as richer metadata management and synergy with data quality capabilities. Syncsort, Pitney Bowes Business Insight, ETI and Open Text exhibit various aspects of these characteristics.

For the first time, a vendor offering open-source versions of its data integration tools is present in the Magic Quadrant. In prior analyses, none of the open-source offerings met the functional inclusion criteria, nor did providers offering solutions based on open-source models gain adoption in the market substantial enough to satisfy the quantitative inclusion criteria (revenue and/or number of maintenance-paying customers). During 2009 this situation changed, with Talend delivering the necessary product capabilities and also reaching a significant state of adoption with its subscription offering. While open-source solutions are visionary in some respects regarding the data integration tools market (offering alternative pricing and licensing models and a community-based approach to product development), they generally exhibit functional gaps and limited mind share relative to the market leaders. The result is a visionary delivery channel that offers a different method of customer interaction in terms of product design. But it remains to be seen whether this model proves stronger than traditional methods such as commercially licensed software and customer forums driving tightly vendor-controlled product development.

Software as a service (SaaS) and cloud-based models for data integration are becoming broader subjects of discussion among vendors in the market for data integration tools. In addition, many organizations that do not have the budget, skills or resources to deploy sophisticated data integration tools in-house are beginning to explore the availability of these capabilities in a SaaS model, hosted by an external provider. Few examples of general data integration as a service are available, but data integration as a service solutions that solve very specific data synchronization problems continue to emerge. As the adoption of SaaS and cloud-based models continues to rise, vendors will increase the percentage of their functional capabilities that is available to customers in an "as a service" fashion.

While not on the Magic Quadrant, certain specialty tools that focus on metadata-driven semantics are beginning to gain customers, although all of these vendors have relatively small customer bases. Customer organizations and competing vendors alike are advised to review the functional roles of these tools with respect to their use of metadata, semantic/logical modeling and services capability. These tools support model-to-model interpretation of repository-stored data, or focus on role and context management by creating workflows that expose the creation of business and transformation rules to data stewards and other governance roles. Examples of these tools and vendors are BIReady, expressor software, Kalido and WhereScape. Another class of vendors making progress in the market specializes in service-oriented architecture (SOA)-style data services delivery or strong architectural approaches; examples include Ab Initio, Metatomix, SOALogix and SOA Software.

Market Definition/Description

The data integration tools market comprises vendors that offer software products to enable the construction and implementation of data access and delivery infrastructure for a variety of data integration scenarios, including:

  • Data acquisition for business intelligence (BI) and data warehousing. Extracting data from operational systems, transforming and merging that data, and delivering it to integrated data structures for analytic purposes. BI and data warehousing remain mainstays of the demand for data integration tools.
  • Creation of integrated master data stores. Enabling the consolidation and rationalization of the data, representing critical business entities such as customers, products and employees. MDM may or may not be subject-based, and data integration tools can be used to build the data consolidation and synchronization processes that are key to success.
  • Data migrations/conversions. Traditionally addressed most often via the custom coding of conversion programs, data integration tools are increasingly addressing the data movement and transformation challenges inherent in the replacement of legacy applications and consolidation efforts during mergers and acquisitions.
  • Synchronization of data between operational applications. Similar in concept to each of the previous scenarios, data integration tools provide the capability to ensure database-level consistency across applications, both on an internal and interenterprise basis, and in a bidirectional or unidirectional manner.
  • Interenterprise data sharing. Organizations are increasingly required to provide data to and receive data from external trading partners (customers, suppliers and others). Data integration tools may be relevant for certain types of these requirements, which often consist of the same types of data access, transformation and movement components found in other common use cases.
  • Delivery of data services in an SOA context. Data services are an architectural technique rather than a data integration usage scenario in their own right: they are the emerging model for the role and implementation of data integration capabilities within SOAs. Data integration tools will increasingly enable the delivery of many types of data services.

Gartner has defined several classes of functional capabilities that vendors of data integration tools must possess to deliver optimal value to organizations in support of a full range of data integration scenarios:

  • Connectivity/adapter capabilities (data source and target support).
  • Data delivery capabilities.
  • Data transformation capabilities.
  • Metadata and data modeling capabilities.
  • Design and development environment capabilities.
  • Data governance capabilities (data quality, profiling and mining).
  • Deployment options and runtime platform capabilities.
  • Operations and administration capabilities.
  • Architecture and integration.
  • Service-enablement capabilities.

Connectivity/Adapter Capabilities (Data Source and Target Support)

The ability to interact with a range of different data structure types, including:

  • Relational databases.
  • Legacy and nonrelational databases.
  • Various file formats.
  • XML.
  • Packaged applications such as CRM and supply chain management.
  • Industry-standard message formats such as electronic data interchange (EDI), SWIFT and Health Level Seven (HL7).
  • Message queues, including those provided by application integration middleware products and standards-based products (such as Java Message Service [JMS]).
  • Emergent data types, such as e-mail, websites, office productivity tools and content repositories.

In addition, data integration tools must support different modes of interaction with this range of data structure types, including:

  • Bulk acquisition and delivery.
  • Granular trickle-feed acquisition and delivery.
  • Changed-data capture (the ability to identify and extract modified data).
  • Event-based acquisition (time-based or data-value-based).
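Changed-data capture, in its simplest form, relies on a last-modified watermark rather than transaction-log mining. The following Python sketch illustrates the pattern using an in-memory SQLite table as a stand-in source system; the table, columns and sample data are invented for the example:

```python
import sqlite3

# Stand-in source system: an in-memory table with a last-modified column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, updated_at INTEGER)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, "Acme", 100), (2, "Globex", 150), (3, "Initech", 200)])

def capture_changes(conn, watermark):
    """Extract only rows modified since the last run; return them and the new watermark."""
    rows = conn.execute(
        "SELECT id, name, updated_at FROM customers WHERE updated_at > ?",
        (watermark,)).fetchall()
    new_watermark = max((r[2] for r in rows), default=watermark)
    return rows, new_watermark

rows, wm = capture_changes(conn, 0)    # initial bulk pass: all three rows
conn.execute("UPDATE customers SET name = 'Acme Corp', updated_at = 250 WHERE id = 1")
delta, wm = capture_changes(conn, wm)  # incremental pass: only the changed row
```

The same watermark logic supports both bulk acquisition (start from zero) and trickle-feed acquisition (run frequently against the advancing watermark); production tools typically add log-based capture to avoid polling the source.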

Data Delivery Capabilities

The ability to provide data to consuming applications, processes and databases in a variety of modes, including:

  • Physical bulk data movement between data repositories.
  • Federated views formulated in memory.
  • Message-oriented movement via encapsulation.
  • Replication of data between homogeneous or heterogeneous database management systems (DBMSs) and schemas.

In addition, support for the delivery of data across the range of latency requirements is important:

  • Scheduled batch delivery.
  • Streaming/real-time delivery.
  • Event-driven delivery.
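The distinction between physical movement and federated views is worth making concrete. In a federated view, heterogeneous sources are joined in memory at query time and nothing is persisted; the Python sketch below uses two hypothetical sources (sample order records and a customer reference list) to show the idea:

```python
# Two hypothetical sources: order records (e.g., from an application API)
# and a customer reference list (e.g., loaded from a flat file).
orders = [{"cust_id": 1, "amount": 250.0},
          {"cust_id": 2, "amount": 99.5},
          {"cust_id": 1, "amount": 40.0}]
customers = {1: "Acme", 2: "Globex"}

def federated_view():
    """Join both sources at query time; no integrated copy of the data is materialized."""
    for order in orders:
        yield {"customer": customers[order["cust_id"]], "amount": order["amount"]}

view = list(federated_view())
```

Bulk movement would instead write the joined result into a target repository; the trade-off is data freshness versus query-time load on the sources.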

Data Transformation Capabilities

Built-in capabilities for achieving data transformation operations of varying complexity, including:

  • Basic transformations, such as data type conversions, string manipulations and simple calculations.
  • Intermediate-complexity transformations, such as lookup and replace operations, aggregations, summarizations, deterministic matching and the management of slowly changing dimensions.
  • Complex transformations, such as sophisticated parsing operations on free-form text and rich media.

In addition, the tools must provide facilities for developing custom transformations and extending packaged transformations.
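As a simplified illustration of the first two complexity tiers, the Python sketch below combines a basic transformation (type conversion) with two intermediate ones (lookup-and-replace and aggregation); the record layout and lookup table are invented for the example:

```python
from collections import defaultdict

# Raw extracted records: everything arrives as strings, with coded region values.
raw = [{"region": "01", "revenue": "1200.50"},
       {"region": "02", "revenue": "300.00"},
       {"region": "01", "revenue": "99.50"}]

REGION_LOOKUP = {"01": "North America", "02": "Europe"}  # lookup-and-replace table

def transform(records):
    """Convert types, decode region codes, and aggregate revenue per region."""
    totals = defaultdict(float)
    for rec in records:
        region = REGION_LOOKUP.get(rec["region"], "Unknown")  # intermediate: lookup/replace
        totals[region] += float(rec["revenue"])               # basic: type conversion, then aggregation
    return dict(totals)

result = transform(raw)
```

Commercial toolsets package such operations as configurable components; the "custom transformation" facilities mentioned above exist precisely for logic that falls outside the packaged library.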

Metadata and Data Modeling Capabilities

Metadata management and data modeling are the increasingly important heart of data integration capabilities. Requirements include:

  • Automated discovery and acquisition of metadata from data sources, applications and other tools.
  • Data model creation and maintenance.
  • Physical to logical model mapping and rationalization.
  • Defining model-to-model relationships via graphical attribute-level mapping.
  • Lineage and impact analysis reporting, via graphical and tabular format.
  • An open metadata repository, with the ability to share metadata bidirectionally with other tools.
  • Automated synchronization of metadata across multiple instances of the tools.
  • Ability to extend the metadata repository with customer-defined metadata attributes and relationships.
  • Documentation of project/program delivery definitions and design principles in support of requirements definition activities.
  • Business analyst/end-user interface to view and work with metadata.
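Lineage and impact analysis reduce, at their core, to traversing the attribute-level mappings recorded at design time. A minimal Python sketch of downstream impact analysis (the mapping names are hypothetical) makes this concrete:

```python
# Attribute-level mappings recorded at design time: (source, target) pairs.
mappings = [("crm.customer.name", "dw.dim_customer.full_name"),
            ("crm.customer.id", "dw.dim_customer.customer_key"),
            ("dw.dim_customer.full_name", "report.top_customers.customer")]

def impact(attribute):
    """Return every attribute fed, directly or indirectly, by `attribute`."""
    affected = set()
    frontier = [attribute]
    while frontier:
        node = frontier.pop()
        for src, tgt in mappings:
            if src == node and tgt not in affected:
                affected.add(tgt)
                frontier.append(tgt)
    return affected
```

Lineage reporting is the same traversal run in the opposite direction (target back to sources); the value of an open repository is that mappings harvested from other tools can feed the same analysis.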

Design and Development Environment Capabilities

Facilities for enabling the specification and construction of data integration processes, including:

  • Graphical representation of repository objects, data models and data flows.
  • Workflow management for the development process, addressing requirements such as approvals and promotions.
  • Granular role-based and developer-based security.
  • Team-based development capabilities, such as version control and collaboration.
  • Functionality to support reuse across developers and projects, and to facilitate the identification of redundancies.
  • Support for testing and debugging.

Data Governance Capabilities (Data Quality, Profiling and Mining)

Mechanisms to help the understanding and assurance of data quality over time, including interoperability with:

  • Data profiling tools.
  • Data mining tools.
  • Data quality tools.

Deployment Options and Runtime Platform Capabilities

Breadth of support for hardware and operating systems on which data integration processes may be deployed, and the choices of delivery model; specifically:

  • Mainframe environments, such as IBM z/OS and z/Linux.
  • Midrange environments, such as IBM System i (formerly AS/400) or HP Tandem.
  • Unix-based environments.
  • Wintel environments.
  • Linux environments.
  • Traditional on-premises (at the customer site) installation and deployment of software.
  • Hosted off-premises software deployment (SaaS model).
  • Server virtualization (support for shared, virtualized implementations).

Operations and Administration Capabilities

Facilities for enabling adequate ongoing support, management, monitoring and control of data integration processes implemented via the tools, such as:

  • Error-handling functionality, both predefined and customizable.
  • The monitoring and control of runtime processes.
  • The collection of runtime statistics to determine use and efficiency, as well as an application-style interface for visualization and evaluation.
  • Security controls, for both data "in flight" and administrator processes.
  • A runtime architecture that ensures performance and scalability.

Architecture and Integration

The degree of commonality, consistency and interoperability between the various components of the data integration toolset, including:

  • A minimal number of products (ideally one) supporting all data delivery modes.
  • Common metadata (single repository) and/or the ability to share metadata across all components and data delivery modes.
  • A common design environment to support all data delivery modes.
  • The ability to switch seamlessly and transparently between delivery modes with minimal rework.
  • Interoperability with other integration tools and applications, via certified interfaces and robust application programming interfaces (APIs).
  • Efficient support for all data delivery modes regardless of runtime architecture type (centralized server engine versus distributed runtime).

Service-Enablement Capabilities

As acceptance of data services concepts continues to grow, data integration tools must exhibit service-oriented characteristics and provide support for SOA deployments, such as:

  • The ability to deploy all aspects of runtime functionality as data services.
  • Management of publication and testing of data services.
  • Interaction with service repositories and registries.
  • Service enablement of the development and administration environments, such that external tools and applications can dynamically modify and control the runtime behavior of the tools.
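In its simplest form, deploying runtime functionality as a data service means putting a request/response interface in front of a data access operation. The following minimal WSGI sketch in Python illustrates the shape of such a service; the data store and query parameter are invented for illustration:

```python
import json

def customer_data_service(environ, start_response):
    """A toy data service: takes an id in the query string, returns JSON."""
    customers = {"1": {"name": "Acme"}, "2": {"name": "Globex"}}  # stand-in data store
    qs = environ.get("QUERY_STRING", "")
    cust_id = qs[3:] if qs.startswith("id=") else ""
    start_response("200 OK", [("Content-Type", "application/json")])
    return [json.dumps(customers.get(cust_id, {})).encode()]

# Exercise the application directly, as any WSGI-compliant server would.
captured = {}
def start_response(status, headers):
    captured["status"] = status
body = b"".join(customer_data_service({"QUERY_STRING": "id=2"}, start_response))
```

A production data service would add the registry interaction, contract metadata and security controls described above; the point here is only that the same extraction and transformation logic becomes callable by any SOA consumer.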

Inclusion and Exclusion Criteria

For vendors to be included in this Magic Quadrant, they must meet the following functional requirements.

  • They must possess within their technology portfolio the subset of capabilities identified by Gartner as most critical within the overall range of capabilities expected in data integration tools. Specifically, vendors must deliver the following functional requirements:
    • Range of connectivity/adapter support (sources and targets): native access to relational DBMS products plus access to nonrelational legacy data structures, flat files, XML and message queues.
    • Mode of connectivity/adapter support (against a range of sources and targets): bulk/batch and change data capture.
    • Data delivery modes support: bulk/batch (ETL-style) delivery, plus at least one additional mode (federated views, message-oriented delivery or data replication).
    • Data transformation support: at a minimum, packaged capabilities for basic transformations (such as data type conversions, string manipulations and calculations).
    • Metadata and data modeling support: automated metadata discovery, lineage and impact analysis reporting, and an open metadata repository including mechanisms for bidirectional sharing of metadata with other tools.
    • Design and development support: graphical design/development environment and team development capabilities (such as version control and collaboration).
    • Data governance support: ability to interoperate at a metadata level with data profiling and/or data quality tools.
    • Runtime platform support: Windows, Unix or Linux operating systems.
    • Service enablement (ability to deploy functionality as services conforming to SOA principles).
  • In addition, vendors must satisfy the following quantitative requirements regarding their market penetration and customer base:
    • They must generate at least $20 million of annual software license revenue from data integration tools or maintain at least 300 maintenance-paying customers for their data integration tools.
    • They must support data integration tools customers in at least two of the major geographic regions (North America, Latin America, Europe and Asia/Pacific).
    • They must have customer implementations that reflect the use of the tools at an enterprise (cross-departmental and multiproject) level.

We excluded vendors that focus on only one specific data subject area (for example, the integration of customer data only), a single industry, or only their own data models and architectures.

Many other vendors of data integration tools exist beyond those included in this Magic Quadrant. However, most do not meet the above criteria and, therefore, we have not included them in this analysis. Market trends in the past three years indicate that organizations want to use data integration tools that provide flexible data access, delivery and operational management capabilities within a single vendor solution. Excluded vendors frequently provide products that address one very specific style of data delivery (for example, only data federation) but cannot support other styles. Others provide a range of functionality, but operate only in a specific technical environment. Still others operate only in a single region or support only narrow, departmental implementations. Some vendors meet all the functional, deployment and geographic requirements but are very new to the data integration tools market, and have limited revenue and few production customers. The following vendors are sometimes considered by Gartner clients alongside those appearing in the Magic Quadrant, either when deployment needs align with their specific capabilities or because they are newer market entrants with relevant capabilities:

Ab Initio, Lexington, Massachusetts, U.S., — Application development toolbox (Co>Operating System) and component library for metadata management and data integration.

Alebra Technologies, New Brighton, Minnesota, U.S., — Parallel Data Mover for cross-platform file and database copying and sharing.

Apatar, Chicopee, Massachusetts, U.S., — Open-source data integration tools focused on ETL and data federation scenarios.

Attunity, Burlington, Massachusetts, U.S., — A range of data-integration-oriented products, including adapters (Attunity Connect), change data capture (Attunity Stream) and data federation (Attunity Federate) for various platforms and database/file types.

BackOffice Associates, South Harwich, Massachusetts, U.S., — Offers services and technology, including data integration capabilities, for data migrations, with a focus on SAP and other ERP environments.

BIReady, New York, New York, U.S. and Langbroek, The Netherlands, — Dynamic model resolution tool for rationalizing, deploying and populating analytic models coupled with a data integration engine for transformations between the models.

CA, Islandia, New York, U.S., — Advantage Data Transformer provides ETL-oriented data integration. InfoRefiner provides replication and propagation capabilities for mainframe data repositories.

CDB Software, Houston, Texas, U.S., — CDB/Delta provides change data capture and replication capabilities for IBM DB2 on the z/OS platform.

Columba Global Systems, Dublin, Ireland, — Positioned as "data fusion platform" technology, the Exprimer solution supports federated approaches to data integration.

Composite Software, San Mateo, California, U.S., — Composite Information Server provides data federation capabilities and supports the delivery of data access services.

Datawatch, Chelmsford, Massachusetts, U.S., — The Monarch Data Pump product provides ETL functionality with a bias toward extracting data from report text, PDF files, spreadsheets and other less structured data sources.

Denodo Technologies, Palo Alto, California, U.S. and Madrid, Spain, — The Denodo Platform provides data federation and mashup-enablement capabilities for joining structured data sources with data from websites, documents and other less structured repositories.

Embarcadero Technologies, San Francisco, California, U.S., — The DT/Studio ETL tool provides support for a range of relational and other data sources, and integrates with the vendor's data modeling and database design tools.

ETL Solutions, Bangor, U.K., — Transformation Manager provides a metadata-driven toolset for the authoring, testing, debugging and deployment of various data integration requirements.

expressor software, Burlington, Massachusetts, U.S. — The expressor product is based on a semantic approach to designing and managing data integration processes.

GoldenGate Software, San Francisco, California, U.S., — Real-time, heterogeneous data replication capabilities provided by the Transactional Data Management (TDM) software platform.

GSS Group, Markham, Ontario, Canada, — VIGILANCE Xpress is a Web-based solution for SQL Server data marts supporting Microsoft's .NET Framework, SQL Server and SQL Server Reporting Services.

HiT Software, San Jose, California, U.S., — Offers database replication (DBMoto), database-to-XML transformation and mapping (Allora) and DB2 connectivity products.

Ikan Group, Mechelen, Belgium, — Java-based ETL technology named ETL4ALL, supporting transformation servers on Windows, Linux, Unix and IBM iSeries.

Innovative Routines International (CoSort), Melbourne, Florida, U.S., — The Fast Extract and SortCL tools provide for rapid unloading and transformation of data in Oracle databases in support of ETL processes.

Jitterbit, Oakland, California, U.S., — Freely downloadable software with a focus on both application integration (event- and message-based) and data integration.

Kalido, Burlington, Massachusetts, U.S. and London, U.K., — Kalido's Active Information Management software enables dynamic data modeling and change management for data warehouses and master data environments.

Metatomix, Dedham, Massachusetts, U.S., — Follows a semantics-based approach to the creation of data services and federated views of data across multiple data sources.

Pentaho, Orlando, Florida, U.S., — A provider of open-source BI solutions, Pentaho has added data integration tools to its portfolio by leveraging the Kettle open-source project and providing services and support.

Progress Software, Bedford, Massachusetts, U.S., — The DataXtend and DataDirect product lines provide tools for data access, replication and synchronization.

Quest Software, Aliso Viejo, California, U.S., — SharePlex provides real-time replication support for Oracle DBMS environments and is targeted primarily at high-availability applications.

Red Hat/MetaMatrix, Raleigh, North Carolina, U.S., — The MetaMatrix Server, Enterprise and Query products support the creation of data models and model-driven federated views of data.

Relational Solutions, Westlake, Ohio, U.S., — The BlueSky Integration Studio provides ETL capabilities in a simplified, low-cost toolset that runs in the Windows environment.

Safe Software, Surrey, British Columbia, Canada, — The FME platform delivers ETL capabilities for spatially oriented data sources commonly used in geographic information system applications.

SchemaLogic, Kirkland, Washington, U.S., — Enables the creation and maintenance of data models (Workshop) and business models (SchemaServer), and the ability to propagate models and data across applications (Integration Service).

Scribe Software, Bedford, New Hampshire, U.S., — Provides data migration and integration solutions supporting deployments of business applications, with a specific focus on Microsoft Dynamics.

Seagull Software, Atlanta, Georgia, U.S., — Offers SmartDB for data migrations to the Oracle E-Business Suite.

SnapLogic, San Mateo, California, U.S., — Dataflow supports real-time and federated integration of data with a focus on diverse data sources, including SaaS- and cloud-based sources, and via Web-oriented architectural approaches.

SOALogix, Reston, Virginia, U.S., — The Confero SOA product offers a platform for the creation and delivery of data services for SOA.

Software AG, Darmstadt, Germany, — The CentraSite product provides data and metadata federation capabilities and is geared toward SOA deployments. The vendor's webMethods product line provides process-oriented integration capabilities.

Software Labs, Roseville, California, U.S., — The xFusion Studio product provides ETL functionality positioned toward a range of use cases including BI and migrations.

Sypherlink, Dublin, Ohio, U.S., — Metadata discovery and mapping via Harvester, and access to data sources for the creation of integrated views via Exploratory Warehouse.

TigerLogic (formerly Raining Data), Irvine, California, U.S., — TigerLogic XDMS provides XML-based data federation and persistence, as well as the delivery of data services.

Vamosa, Glasgow, U.K., and Boston, Massachusetts, U.S., — Provides content integration and migration, aimed at the synchronization and consolidation of document repositories, via its Content X-Change and Content Migrator products.

Vision Solutions, Irvine, California, U.S., — Real-time database replication functionality is provided by the Vision Replicate1 product.

WhereScape, Portland, Oregon, U.S., — WhereScape RED enables the rapid creation and maintenance of data warehouses, including ETL functionality.

XAware, Colorado Springs, Colorado, U.S., — Provides support for the access, integration and service enablement of data sources via its XA-Suite product.

Vendors Added


Vendors Dropped

Sun Microsystems — removed from the Magic Quadrant due to a shift in positioning away from data integration, and a lack of demonstrated market adoption of bulk/batch-oriented data delivery functionality.

Tibco — removed from the Magic Quadrant because it has ceased to offer data integration technology packaged specifically for bulk/batch data delivery (such as ETL).

Evaluation Criteria

Ability to Execute

To emphasize the need to address a range of data delivery styles, metadata management strength and other technical requirements of enterprisewide data integration activities, the Ability to Execute criteria in the data integration tools market include a strong emphasis on product capabilities. The Product/Service criterion includes all the major categories of functionality described in the "Market Definition/Description" section. In this iteration of the Magic Quadrant, in addition to the importance of the functional points that are part of the market inclusion criteria, we continued to increase the emphasis on metadata management capabilities and support for data governance via data quality capabilities. Given the challenging budget constraints under which most technology user organizations and providers are operating, the Overall Viability criterion, while still at a "standard" weighting level, receives increased focus and priority in this iteration of the Magic Quadrant. Most significantly, the weighting of Customer Experience (which includes service and support, customer perceptions of pricing models and price points relative to value, general customer satisfaction, and the quality of customer references) was increased for this iteration. This reflects a substantial increase in buyers' expectations, driven partly by economic pressures, of a more rapid return on their investment in the tools, reduced ongoing total cost of ownership (TCO), and a smooth, high-quality support and service experience from technology providers. These changes in the criteria weightings have resulted in numerous shifts in vendor positioning relative to prior iterations of the Magic Quadrant.

Table 1. Ability to Execute Evaluation Criteria

Evaluation Criteria
Product/Service
Overall Viability (Business Unit, Financial, Strategy, Organization)
Sales Execution/Pricing
Market Responsiveness and Track Record
Marketing Execution
Customer Experience
Operations (no rating)

Source: Gartner (November 2009)


Completeness of Vision

The Completeness of Vision criteria most strongly emphasize an overall market understanding. These criteria include an assessment of the degree to which the vendor establishes market trends and direction, as well as the vendor's ability to capitalize on market trends and survive disruptions. Both of these characteristics are crucial in the data integration tools market because of the volatility introduced by merger and acquisition activity, as well as the increasing impact on the market of the world's largest software vendors. In addition, we place a high weighting on Innovation to stress the importance of vendors developing new approaches to data integration that change the economics, scale and impact of this technology. Specific to Offering (Product) Strategy, another highly weighted criterion, a key consideration is the degree of openness of the vendors' offerings. For success in this market, vendors must deliver independence from their own data models and architectures, and be capable of easily interoperating with the architectures and technologies of other vendors. The remaining criteria receive moderate weightings with a slight emphasis on Sales Strategy and Geographic Strategy, given the rapidly expanding size of the market.

Table 2. Completeness of Vision Evaluation Criteria

Evaluation Criteria
Market Understanding
Marketing Strategy
Sales Strategy
Offering (Product) Strategy
Business Model
Vertical/Industry Strategy (no rating)
Innovation
Geographic Strategy

Source: Gartner (November 2009)


Leaders
Leaders in the data integration tools market are front runners in the convergence of single-purpose tools into an offering that supports a range of data delivery styles. These vendors are strong in the more traditional data integration patterns. They also support newer patterns and provide capabilities that enable data services in the context of SOA. Leaders have significant mind share in the market, and resources that are skilled in their tools are readily available. These vendors establish market trends, to a large degree, by providing new functional capabilities in their products, and by identifying new types of business problems where data integration tools can bring significant value. Examples of deployments that span multiple projects and types of use case are commonplace in their customer base.

Challengers
Challengers in the data integration tools market are well positioned in light of the key trends in the market, such as the need to support multiple styles of data delivery. However, they may not provide comprehensive breadth of functionality, or they may be limited to specific technical environments or application domains. In addition, their vision may be hampered by the lack of a coordinated strategy across the various products in their data integration tools portfolio. Challengers can vary significantly with regard to their financial strength and global presence. They are often large players in related markets that have only recently placed an emphasis on data integration tools. Challengers' customer base is generally substantial in size, though implementations are often of a single project nature, or reflect multiple projects of a single type (for example, all ETL-oriented use cases).

Visionaries
Visionaries in the data integration tools market have a solid understanding of the emerging technology and business trends, or a position that is well aligned with current demand, but lack market awareness or credibility beyond their customer base or outside a single application domain. Visionaries may also fail to provide a comprehensive set of product capabilities. They may be new market entrants lacking the installed base and global presence of larger vendors, though they could also be well-established, large players in related markets that have only recently placed an emphasis on data integration tools.

Niche Players

Niche Players in the data integration tools market have gaps in both vision and ability to execute, often lacking key aspects of product functionality and/or exhibiting a narrow focus within their own architecture and installed base. These vendors have little mind share in the market and are not recognized as proven providers of data integration tools for enterprise-class deployments. Many Niche Players have very strong offerings for a specific range of data integration problems (for example, a particular set of technical environments or application domains) and deliver substantial value for their customers in that segment. Niche players are notable in that they meet all the inclusion criteria for the Magic Quadrant, thereby differentiating them from excluded vendors.

Vendor Strengths and Cautions

ETI
Austin, Texas, U.S.

Products: ETI Solution, ETI Built-To-Order Integration, ETI Change Data Capture

Customer base: 300

Strengths

  • ETI has a long association with the discipline of data integration, having been one of the earliest vendors offering technology in the original ETL tools market. With its mature and proven code-generating architecture that supports a range of platforms, ETI has historically been focused on physical, bulk data movement and delivery. The recent addition of change data capture (CDC) capabilities enables ETI to expand its adoption in lower-latency, granular data delivery patterns.
  • ETI continues to leverage its strength in supporting multiple hardware and operating system environments, including the mainframe and legacy data sources. It is often seen in sectors such as federal government and financial services, where these characteristics, along with large data volumes and a high degree of complexity, are common. This reflects the vendor's roots in the market and is characteristic of most of its customer base, and these capabilities serve as a differentiator from much of the competition. As such, ETI's strategy and vision are focused largely on data integration scenarios involving the mainframe, as well as cultivating partnerships with independent software vendors (ISVs) and service providers to extend the range of platforms and data source types they can reach, and the manner in which they build data integration processes.
  • Customer references indicate ETI's data transformation functionality, physical bulk data movement capabilities, and range of connectivity as its greatest strengths, providing extremely strong feedback on these points relative to other providers in the market. Product support and general satisfaction with the vendor relationship are also noted by customers to be positive.

Cautions

  • Despite being owned by a larger entity with greater financial resources (the vendor was acquired by Versata in 2008), ETI continues to decline in market presence and brand recognition, as measured by its frequency of appearance in Gartner client inquiries and primary research studies of the competitive landscape. With Versata's strong focus on annuity revenue, it appears that the parent company is more likely to use ETI's technology to support other growing lines of business. As a result, ETI's market share and the existing skill base among customers and system integrators appear to be shrinking relative to the market leaders.
  • ETI appears to recognize critical areas of weakness, such as metadata management (metadata discovery, modeling and analysis/reporting) and ease of implementation and use (data flow design and visualization), and is beginning to take steps to address these points in its product road map. In particular, ETI received extremely low ratings from customer references regarding its metadata management functionality. Releases targeted for 2010 and beyond consist mostly of improvements to the graphical design environment. While the vendor acknowledges cloud computing trends in its terminology, current product plans include off-premises design and monitoring capabilities, but no leverage of cloud-based models for runtime execution.
  • While partnerships with ISVs in related markets, such as data quality tools, help to broaden ETI's functional reach, the vendor continues to lack proven strength in fundamental data integration capabilities that represent current and emerging market demand. For example, real-time and granular data flow based on CDC capabilities remains an area of limited use. In addition, ETI's sole focus is on physical data movement and integration, with no support for the composition of virtualized and federated views of data as an alternative data delivery style.

IBM
Armonk, New York, U.S.

Products: InfoSphere DataStage, InfoSphere Change Data Capture, InfoSphere Data Event Publisher, InfoSphere Federation Server, InfoSphere Replication Server, InfoSphere Foundation Tools

Customer base: estimated 8,000+

Strengths

  • IBM continues to demonstrate strong vision in the market for extensive data integration capabilities, while also executing well in increasing the adoption of the various components within existing IBM customers and beyond. Organizations adopting InfoSphere tend to consider it the enterprise standard for data integration tooling, so IBM's customer references reflect more multi-project use and larger average numbers of data integration developers per customer than most of its competitors. The customer base also reflects a range of use cases, with an emphasis on data warehouse environments, but a growing base of customers is applying the tools against MDM, migration/conversion and operational data interface problems.
  • IBM continues to increase the level of integration between, and consistency across, the range of InfoSphere Information Server components. Significant deliverables in late 2008 and during 2009 included the integration of the CDC technology (obtained in the DataMirror acquisition of 2007) with DataStage, the release of various Foundation Tools (for metadata management of various types, data profiling and industry models), and the integration of DataStage with the InfoSphere MDM Server offering. Looking ahead, the product road map includes additional synergies with other IBM technologies, such as Optim for enabling data privacy rules in data integration processes.
  • During 2009, IBM finally saw significant adoption of the 8.x version of Information Server — among IBM reference customers and Gartner client interactions, approximately 65% of customers have migrated to the latest major release. However, 50% of the reference customers that have migrated claimed that the upgrade process was complex and costly. Reference customers report strong satisfaction with bulk/batch data delivery capabilities, data replication capabilities, the range of connectivity options, and metadata management functionality. Performance and scalability are also perceived as strong.

Cautions

  • The IBM tools have a long-standing reputation for complexity. This manifests in an average time to implementation for IBM customers that is longer than the market average, and longer than that for other market leaders. Customer references often report a greater effort than expected for initial installation, and "bugginess" and stability issues were experienced by a significant number of sites. Despite these challenges, a majority of customers indicate that they plan to procure additional products, or licenses for deployed products, from the InfoSphere portfolio in the next twelve months. Given the breadth of the vendor's functional capabilities, IBM's customers often address data integration problems of substantial scale and complexity, which require sophisticated capabilities.
  • Most customer references use the toolset for bulk/batch data delivery, granular CDC and propagation, and data replication. IBM states that the adoption of Federation Server is substantial (approximately 500 customers), although federation deployments appear infrequently in Gartner client interactions and within IBM-provided customer references. While IBM provides various integration points between the InfoSphere technologies and the WebSphere portfolio of process- and application-integration capabilities, customer references reflect limited use of these two brands in a synergistic manner. This represents a gap in execution relative to IBM's vision for information infrastructure, and an opportunity for the vendor to further enhance its value proposition and gain traction with existing customers as well as prospects.
  • IBM faces several non-technology challenges in this market. Many customers cite concerns regarding both the pricing method and the price point of IBM's tools in this market. While consistent with the pricing of other IBM product lines, the use of CPU speed as the main pricing parameter (which adds complexity for customers in auditing and modifying their implementation) and the relatively high cost of a typical implementation (compared with many of IBM's competitors) cause some prospects to consider alternative providers or limit their investment to a small number of components (DataStage only, for example). However, for midsize enterprises and those seeking a focused subset of the portfolio, the pricing approach may offer some advantages. In addition, customers and prospects often report challenges in locating resources in their local geographies that are highly skilled across the breadth of the company's product capabilities.

Informatica
Redwood City, California, U.S.

Products: PowerCenter, PowerExchange

Customer base: 3,700+

Strengths

  • Informatica continues to retain its position as the most recognized data integration tools brand, appearing in enterprise-scale competitive evaluations more frequently than all other providers in the market. The vendor continues to deliver high-quality product releases, well aligned with current market demand and evolving trends. Despite the challenging market conditions due to economic pressures and stronger competition from much larger infrastructure and application providers, Informatica has continued to grow, both organically and via acquisition.
  • The vast majority of customer references have established Informatica as their enterprise standard for data integration tools, with many applying the tools to large numbers of projects involving greater numbers of developers than the average. While nearly all of these customers apply the technology in the BI and data warehousing domain, a large percentage have additional use cases. Specifically, data migration/conversion initiatives and the delivery of interfaces between operational applications were seen in most customer references, reflecting more diversity of use cases within individual enterprises than for most of Informatica's competitors. Informatica's customer base continues to express a high degree of satisfaction with time to value, performance, product support, availability of skills, and overall experience in the relationship with the vendor.
  • Informatica continues to provide thought leadership in the market by exploring alternative delivery models for data integration capabilities. While still a minor component of its revenue relative to traditional on-premises deployments, the vendor's "on demand" offerings have seen increasing adoption during 2009. The recent announcement of a beta offering of PowerCenter running on the Amazon Elastic Compute Cloud (EC2) infrastructure and priced at an hourly rate, although yet to see significant interest from buyers, represents an early example of how cloud computing might change the economics of data integration solutions. The planned delivery of Informatica 9 in 4Q09 — which, in addition to expanded capabilities for delivering data services, brings together data integration and data quality capabilities in a single runtime architecture — is aligned with the fast-moving trend of consolidation of these two classes of technology.

Cautions

  • While Informatica's customer base reflects a diverse mix of use cases, deployment architectures remain heavily centered on bulk, batch-oriented data delivery, with an increasing prevalence of granular and low-latency patterns. Although virtualized and federated approaches to data integration are exhibiting a somewhat shallow growth curve, Informatica's historical lack of focus in this area puts it at a disadvantage relative to some of its competitors as demand increases. Informatica intends to address this gap by introducing enhanced data services capabilities in Informatica 9.
  • As consolidation occurs across various markets related to data management, Informatica's position as an independent provider of data integration tools creates opportunities, but it also brings challenges. The vendor faces even stronger competition from large application and infrastructure providers (IBM, Microsoft, Oracle and SAP), which are increasingly bundling data integration tools with their other offerings at limited additional cost to the customer. Informatica's moves to expand its functional reach via acquisitions and differentiate via new delivery models (such as on-demand and cloud-based) will help it to remain competitive, although it is increasingly facing pressure from these larger incumbents.
  • Numerous interactions with Informatica's prospects and customers highlight a sense of concern over the vendor's price points relative to the current state of IT budgets and software pricing trends. Despite having broader functionality (and therefore broader applicability), as one of the higher-priced solutions in this market Informatica continues to face increasing pressure from competitors with a smaller cost footprint or alternative licensing models. By offering SaaS- and cloud-based delivery models with subscription-based pricing, the vendor hopes to address this challenge. However, customer references rated their satisfaction with Informatica's pricing method and price points at below market average, and far lower than their satisfaction with most other aspects of their relationship with the vendor.

iWay Software

New York, New York, U.S.

Products: DataMigrator, Data Hub, Service Manager, Universal Adapter Framework, Information Management Suite

Customer base: 375+

Strengths

  • A division of Information Builders, iWay Software creates and sells Information Builders' integration technologies, with the goal of building an integration software business that is independent of the BI capabilities for which Information Builders is well known. iWay offers capabilities for physical data movement and delivery (via its DataMigrator ETL tool), data federation (via the iWay Data Hub product) and real-time message-oriented integration (supported by the Service Manager product). Support for a broad range of data delivery styles, and the ability for customers to use the various products in an integrated, synergistic fashion, position iWay well in terms of both current and future market demand.
  • iWay's customers cite its significant ability to connect to a wide variety of sources on any platform, its complete support for standards, and the ease of exposing processing as callable services, and they offer high praise for the development environment. Customers also report good debugging and easy troubleshooting.
  • Information Builders' size and global presence afford iWay a strong foundation from which to execute its growth strategy. Customer references cite the broad connectivity capabilities, the reasonable ease of implementation, and the integration with Information Builders' BI products (specifically WebFOCUS) as main drivers for their choice of iWay data integration tools. The recent delivery of data quality and MDM offerings provides additional synergies across the vendor's portfolio.

Cautions

  • One reason for iWay's struggle to emerge as a leader may be that its customers generally feel that new releases create upgrade issues. Some customers report that iWay ships good software that nonetheless contains bugs — while iWay reacts quickly, it seems that more quality assurance (QA) of releases is warranted.
  • Customers' experiences with iWay's professional services are inconsistent. This is now a pattern across multiple years of the Magic Quadrant, and may be another factor limiting iWay's current level of market acceptance. Customers often report that the tool itself is powerful in its functionality, but that the services organization does not measure up to the product. Customers who exhaust their own resourcefulness in using the debugging tools find it difficult to get their issues resolved by services staff.
  • iWay's product capabilities are well aligned with the evolving needs of the data integration tools market, but one of the vendor's biggest challenges is gaining recognition outside the Information Builders customer base. iWay continues to suffer from a low profile, rarely appearing in competitive situations against the market leaders. Improving its services organization and the QA of its releases would boost customer satisfaction, but market awareness is a separate issue, and iWay must address both if it is to convert its substantial functionality into a stronger market presence.

Microsoft
Redmond, Washington, U.S.

Products: SQL Server Integration Services, BizTalk Server

Customer base: estimated 10,000+

Strengths

  • Microsoft's main focus in the data integration tools market is on bulk/batch-oriented data delivery provided by SQL Server Integration Services (SSIS). SSIS is in broad deployment within the SQL Server customer base. Customer references cite low cost, ease of implementation (shorter than the market average time to deployment), capable ETL functionality, and tight integration with other SQL Server functionality as the main value points of SSIS. While they are not reflected in current customer implementations, Microsoft is beginning to leverage the text analytics capabilities developed by its Research group to help SSIS users leverage less-structured data sources in the context of data integration processes.
  • A recent restructuring of engineering and marketing functions inside Microsoft has resulted in a single division with responsibility for SQL Server, BizTalk and the recently acquired Stratature MDM technology. This new structure gives Microsoft the opportunity to deliver a clearer and more comprehensive vision for data integration (and broader data management) to the market.
  • Microsoft's size and global presence provide a huge customer base for best practices, a prevalent skill base, and a distribution model that supports both direct and channel partner sales. In addition, customer references generally report a very positive support and service experience, including product documentation and online support mechanisms.

Cautions

  • While SSIS can be integrated with BizTalk, and Microsoft can also address replication-style data delivery via SQL Server functionality, customer references reflect substantial use of SSIS and almost no use of other data integration-related products or components. Few customers using both SSIS and BizTalk integrate the two products or use them together in a synergistic manner.
  • Customer references continue to cite metadata management capabilities (metadata discovery, modeling, model-to-model mapping, and impact analysis and lineage reporting) as a substantial weakness. Other functional gaps or weaknesses cited by customers include data federation capabilities and data quality/governance. While the vendor has improved support for connectivity to several non-Microsoft data source types via freely downloadable adapters for SAP, Oracle and Teradata, as well as broader connectivity via partners, support for diverse environments and non-bulk/batch interaction with data sources remains a relative weakness. Microsoft seeks to strengthen its capabilities for granular and low-latency CDC via the StreamInsight functionality of SQL Server 2008 R2 in 2010.
  • While the forward-looking product road map for Microsoft's data integration capabilities promises to fill significant gaps relative to the market leaders (such as stronger metadata management, data quality functionality and enterprise-class administration capabilities), these advances will not be available to customers until the next major release of SQL Server, expected in 2011. Furthermore, while the concept of multiple data delivery styles via a single product/platform is in Microsoft's long-term vision, the actual delivery of such capabilities is even further in the future.

Open Text

Waterloo, Ontario, Canada

Products: Open Text Integration Center

Customer base: estimated 500

Strengths

  • Primarily focused on providing content management solutions, Open Text has renamed the former Genio product (which was developed by Hummingbird, a vendor that Open Text acquired in 2006), as Open Text Integration Center (OTIC). OTIC is positioned as a component of the Open Text ECM Suite, and is suited for the integration of both structured data and less-structured content. The ability to interact with document and content-oriented data sources and targets represents a significant differentiator for Open Text in this market, and with the vendor well positioned in the ECM market, uptake of OTIC will likely increase.
  • Customer references cite as functional strengths the depth of integration with Open Text's ECM products (specifically Livelink), strong transformation functionality, broad applicability, and ease of use. In addition, OTIC was viewed by some customer references as more attractive than the market leaders from a cost perspective.
  • The latest version of OTIC, 7.1, released in June 2009, closes a few of the functional gaps relative to the market leaders. Most importantly, in this release the vendor has added support for exposing the results of a data integration process as a Web service, so that OTIC can begin to be leveraged in SOA deployments. Additional enhancements include operational functionality such as 64-bit support for Windows, high availability and expanded parallel processing support.

Cautions

  • While the repositioning of OTIC within the Open Text ECM Suite adds clarity and focus, it is likely to limit Open Text's ability to capture the attention of typical buyers in the data integration tools market, for whom ECM and content integration are generally not the main drivers. Open Text's presence in this market, as measured by Gartner's client inquiries and primary research studies of the competitive landscape, remains limited. Open Text will be challenged to increase its share of mainstream use cases for data integration tools buyers, such as data warehousing, large-scale structured data migrations/conversions, and interfaces between operational applications.
  • With no federation, replication or direct data quality support, OTIC is much narrower in functionality than the offerings of the market leaders. The vendor has chosen to focus on bulk/batch-oriented data delivery, which is most commonly needed in ECM-related activities. However, this will make it difficult for Open Text to capitalize on growing market demand for broader data integration capabilities.
  • Customer references indicate recent experiences of bugginess and instability with the latest major release, requiring more substantial troubleshooting effort and more frequent interaction with Open Text's technical support. Customer references also had concerns over the limited availability of OTIC skills in the marketplace, both in terms of system integrators and people available to hire for their organizations. Open Text is attempting to address this challenge via training programs delivered in the context of user and partner events. With the shift in strategic focus toward support of ECM, Open Text will have to pay extra attention to improving strained relationships with the traditional customers of its data integration tools.

Oracle
Redwood Shores, California, U.S.

Products: Data Integrator, Data Service Integrator, Warehouse Builder, GoldenGate

Customer base: estimated 3,000+

Strengths

  • Oracle's data integration capabilities are largely centered on two specific tools — Oracle Data Integrator (ODI) and Oracle Warehouse Builder (OWB), the base functionality of which is packaged with Oracle DBMS licenses. Oracle Data Service Integrator (ODSI), based on the former BEA AquaLogic Data Services Platform product, adds federation capability to Oracle's offering. In addition, Oracle's acquisition of GoldenGate Software, completed in October 2009, has the potential to add enterprise-class replication/synchronization capability to the suite. Through the combination of these various products, Oracle is increasingly well positioned to address customers' needs across the full range of their data integration requirements.
  • Adoption of both ODI and OWB continues to grow within the Oracle DBMS and applications customer base, most commonly in traditional ETL-style implementations in support of BI and data warehousing. Customer references cite complete functionality for ETL, tight integration with the Oracle DBMS (in the case of OWB), integration with other Oracle Fusion Middleware components and applications (in the case of ODI), and Oracle's overall market presence and viability as the main reasons for selecting these tools. References report that Oracle's data integration expertise is becoming more widely available in the marketplace.
  • Oracle continues to leverage ODI and OWB within its various offerings, providing embedded ETL functionality with its applications, MDM and BI offerings. Additionally, 2009 saw the start of a data integration product road map that will unify ODI and OWB into a single data integration product.

  • Oracle's customer references report specific difficulties with each of the tools that sometimes challenge their use as enterprise tools. For OWB, customers cite as issues a slow interface, weak change/migration control and some difficulty in accessing heterogeneous sources. Regarding prebuilt transformations in ODI, customers report that the Knowledge Modules must be adapted by developers, which creates upgrade issues for these modules. To be fair, ODI is intended to be a more versatile tool, and this developer flexibility is by design.
  • Customers report that it is necessary to acquire multiple products to achieve all the desired functionality, which complicates pricing and drives up costs. However, customers can get pricing and licensing relief with the Oracle Data Integrator Enterprise Edition (ODIEE), which includes both OWB and ODI. Additional costs are associated with ODSI, the Trillium data quality OEM offering and other components.
  • As in 2008, Oracle needs to reduce the number of metadata repositories that underlie its various data management offerings. This will be critical to improve the level of integration across the various products, reduce architectural complexity, and increase adoption of the full product set.

Pervasive Software

Austin, Texas, U.S.

Products: Data Integrator, Metadata Manager, Integration Hub, DataCloud, DataRush

Customer base: 4,000+

  • Pervasive offers solid and attractively priced data integration tools that support bulk/batch-oriented data delivery, but which also provide capabilities for real-time messaging-style solutions and SOA. The broad range of data source and target type support provided with the core products — including packaged applications, popular SaaS application APIs, industry-standard message formats (such as EDI documents, X12, the Health Insurance Portability and Accountability Act [HIPAA] and Health Level Seven [HL7]) and semi-structured content repositories — represents substantial value for customers. By expanding its reach to address the diverse technology landscape common in large enterprises, and continuing to do so with an attractive cost model, Pervasive demonstrates good vision for this market.
  • Customer references reflect a very good balance of usage across the full range of common data integration use cases, with a particular emphasis on supporting interfaces between operational applications. Customers cite ease of implementation and ongoing use, scalability/performance, mapping/transformation capabilities, and the broad range of data source and target support via packaged connectors as the most significant functional strengths of Pervasive's offering. In addition, customers give attractive pricing and a positive overall experience with the vendor as key reasons for their selection and ongoing use of Pervasive.
  • Pervasive has developed a SaaS offering for its data integration capabilities, and a more recent cloud-based delivery model (Pervasive DataCloud). While ahead of mainstream market demand for such approaches to delivering data integration capabilities, Pervasive's initiatives in these areas further support its positioning of cost-effective solutions while demonstrating an expanded vision for the data integration tools market.

  • Pervasive's low pricing creates a phenomenon in which many customers license and use the tools in a variety of tactical situations, rather than making them an enterprisewide standard. Many customer references report numerous, disconnected implementations. While tactical implementations and a desire for quick results are a common theme in the current market, Pervasive needs to provide more capabilities and guidance to customers to achieve greater manageability in multi-project deployments.
  • Pervasive does not provide support for data federation, and real-time capabilities remain a relative weakness in comparison with some of its competitors. Other weaknesses, as cited by customer references, include metadata and modeling functionality, administrative capabilities (such as monitoring, tuning and error handling), and product documentation. Pervasive's product road map includes enhanced real-time support, metadata management and administration/monitoring enhancements in the forthcoming version 10 release. In addition, the bugginess of new releases was cited as a challenge by a number of customer references.
  • Pervasive's approach to the market using indirect channels (SaaS/cloud vendors, OEM relationships and resellers) understates its overall market presence. As a result, it struggles to establish itself as a corporate standard in large enterprises. While many buyers of data integration tools follow a consistency and standards approach, Pervasive offers a "choice" delivery model — it can be used as a standard or deployed tactically, implemented stand-alone or embedded in other solutions, and procured via resellers or directly. Organizations comfortable with this model will find Pervasive a good fit.

Pitney Bowes Business Insight

Troy, New York, U.S.

Products: Data Flow

Customer base: 2,000+

  • Pitney Bowes Business Insight, a division of mailstream hardware and services vendor Pitney Bowes, competes in the data integration tools market via its Data Flow offering. In 2008, reorganizations provided the opportunity to drive the strategy and development of Data Flow (along with that of the former Group 1 data quality products) from the Pitney Bowes Business Insight unit, creating a consistent focus on data integration and data quality. In addition, the vendor's Spectrum Technology Platform seeks to align and coordinate all its key technologies, including Data Flow, into a harmonized solution for data management.
  • Data Flow primarily supports ETL implementation patterns, although limited data federation scenarios can also be achieved. Implementations generally reflect traditional ETL use cases (bulk/batch-oriented data movement) in the BI domain.
  • References report that the product is an easy-to-use, low-cost solution with good prebuilt functionality and special features to automate data processing and view the results in-stream. Customer references continue to cite strong ease of use, fast implementation times and solid ETL functionality as the reasons for their selection, or continued use, of Data Flow.

  • Most customer references continue to reflect departmental implementations. Customer references report that this is "not an enterprise tool." Reasons given include: lack of multi-developer support, poor version support, reduced quality of software code, poor or missing administrative APIs for the data flow service and inconsistent performance on different operating systems.
  • Customers indicate that support and product quality have degraded in the past two years and that the product itself has seen little advancement in terms of features and functionality.
  • Pitney Bowes is rarely seen actively competing against the market leaders for new data integration tools opportunities at the enterprise level. Additionally, there is little market expertise available when it comes to delivering with this tool.
  • Data Flow is mostly limited to bulk/batch-oriented data delivery in support of ETL patterns. The current market demand is for multiple delivery styles, beyond bulk/batch, via a similar set of tools in a suite, if not by a unified application. While this represents a challenge for Pitney Bowes, ETL capabilities remain a mainstay component of demand, and the vendor has successfully leveraged Data Flow to support other offerings where these capabilities are critical (such as its customer communications management solutions).

SAP BusinessObjects

Palo Alto, California, U.S.

Products: Data Integrator, Data Federator, Data Services, NetWeaver PI

Customer base: estimated 4,000+

  • SAP BusinessObjects continues to hone its broad vision for how the various data integration tools in the former Business Objects portfolio will evolve over time and complement capabilities in the SAP NetWeaver platform to form a comprehensive information management infrastructure. With ownership of the product management and marketing for these tools remaining with the BusinessObjects unit of SAP, the vendor aims to address both the non-SAP applications market and the SAP applications customer base. The breadth of functionality available across the portfolio (a wide range of data delivery styles, plus options to integrate with data quality capabilities and SAP's MDM offering) continues to be attractive to SAP's customers and prospects.
  • Feedback from customer references indicates that the most significant strengths of SAP BusinessObjects' offering are: the ease of implementation and ongoing ease of use of Data Integrator for ETL architectures; the richness of built-in transformation functionality; and the range of available adapters.
  • SAP BusinessObjects' Data Services, an offering that combines the ETL capabilities of Data Integrator with Data Quality in a single runtime platform, is beginning to gain traction in the market. Approximately 25% of the customer references recently surveyed had migrated to Data Services. The combination of data integration and data quality capabilities is consistent with demand trends and the ongoing convergence of what are currently two discrete but increasingly overlapping markets.

  • Despite a generally positive perception of the capabilities and value of the technology, many SAP BusinessObjects customers report a decline in quality of service and support, as well as frustrations regarding pricing and licensing issues. Many customer references rated their overall experience in the relationship with SAP BusinessObjects as below their expectations, specifically emphasizing support and pricing concerns. The vendor has taken specific actions, including the expansion of support forums and incident management workflow, to help address support-related issues.
  • SAP BusinessObjects has established a vision for how the former Business Objects tools (Data Integrator, Data Federator and Metadata Management) will be integrated and rationalized with SAP's own data integration capabilities (the extractors for SAP Business Warehouse and NetWeaver PI). However, the vendor needs to communicate this vision to its customers and prospects more clearly and consistently, taking care that its messaging matches buyers' needs. Concerns regarding the degree to which Data Integrator, Data Federator and Data Services will remain "environment-agnostic" (providing equal support for both SAP and non-SAP data structures and applications) have caused some customers and prospects to limit their investments.
  • Implementations of SAP BusinessObjects' data integration tools reflect a significant bias toward bulk/batch-oriented data delivery (such as ETL architectures). The customer base shows relatively limited adoption of Data Federator, and extremely limited use of real-time and granular data delivery capabilities. The vendor must continue to develop competency and proof points across the full range of data delivery styles (for example, by bringing the Data Federator capabilities into the Data Services offering) to better align with evolving market demand.


SAS

Cary, North Carolina, U.S.

Products: Enterprise Data Integration Server, Data Integration Server, SAS for Data Migration, SAS/ACCESS, DataFlux Integration Server

Customer base: estimated 12,000

  • From an execution perspective, SAS's size, global presence and long experience supporting data integration activities give it a solid position. Its product delivery model draws positive reviews of product support and service experiences worldwide. SAS's primary product in this market is the SAS Enterprise Data Integration Server, which provides bulk/batch-oriented data delivery and data federation, including packaged transformations, metadata management, parallel processing, load balancing and, through the vendor's DataFlux subsidiary, rich data quality functionality (profiling, cleansing, matching and enrichment). The vendor has established a strategic vision and direction of harmonizing the SAS and DataFlux technologies. Through an initiative referred to as "Project Unity," the vendor aims to deliver a unified data management platform that combines data integration, data quality and MDM capabilities.
  • The Enterprise Data Integration Server has connectivity to virtually every data source, including packaged applications, data warehouse appliances, SaaS data and many data sources on the mainframe. SAS can run on every major operating system, including various Unix and Linux flavors and z/OS.
  • Customer references report that the tool is highly capable for data integration work, using terms such as "leading edge" to describe the functionality. This indicates that SAS has a thorough understanding of its customer base and matches the product's capabilities to most of its customers' demands. Customers report good metadata management as well as metadata-driven rules/transformation engines (supporting impact analysis and change management), integration with other SAS products and rapid implementations. Users also report strong visual workflow management tools and high-quality SAS professional services.

  • SAS's customer references reflect a wide diversity of use cases, industries and implementation styles, but also varied experiences. Customer references provide inconsistent feedback on the ease of deployment/implementation. Some customers report that the price is high and that obtaining value for the price paid is difficult, while other customers indicate good value. This indicates that customer experiences are highly individualized, so success depends greatly on the customer's skills and the vendor's support capability.
  • SAS's user experiences are beginning to exhibit market adoption characteristics similar to those of other widely adopted vendors. Reports of complexity that creates a steep learning curve, difficulty in identifying qualified implementation resources, and uneven experiences of support are symptoms of a product offering whose complexity varies with the problem that customers are trying to solve. This is common among advanced tools in the market, but organizations should be prepared to develop their own expertise and be aware of the associated learning curve.


Sybase

Dublin, California, U.S.

Products: Replication Server, Sybase ETL

Customer base: 2,600+

  • Sybase is currently focusing on the very targeted use of two dimensions of data integration — ETL and replication. Most widely used is the Sybase Replication Server product, which supports heterogeneous database replication involving a variety of non-Sybase DBMS types, in addition to providing strong support for Sybase Adaptive Server Enterprise (ASE). The vendor's size, global presence and large installed base in key sectors, such as financial services, government and telecommunications, create opportunities for it to execute in this market. Customer references indicate a positive overall experience with the vendor, and cite ease of implementation and product support as strengths.
  • Sybase's ability to target specific use cases represents a good vision for near-term buying trends in the market. Firstly, Sybase is focusing on providing improved capabilities for customers to deliver data to Sybase IQ — the ETL functionality currently supports this, and the vendor plans to enable Replication Server to support real-time propagation of data to Sybase IQ in the future. Given Sybase's limited mind share in the data integration tools market, this remains a pragmatic near-term strategy that is aligned with current market demand. Secondly, Sybase has begun to link its data integration capabilities to its mobility strategy, considering mobile devices to be an important target for data delivery. Given its strength in mobile solutions, this represents a unique extension of market vision and a competitive differentiator.
  • Sybase uses PowerDesigner, its flagship modeling and design product, as the common design interface and focus of metadata across the various components of its data integration and broader data management portfolio. The vendor's product road map for Replication Server includes expanding heterogeneous support, with the next release (v15.5) providing current-version support for SQL Server, DB2, Oracle and Sybase DBMSs. In addition, the recently released current version of Sybase ETL (v4.9) offers richer parallel/grid capabilities and tighter integration with Sybase IQ.

  • In terms of being recognized as a competitor in this space, Sybase continues to lag behind the market leaders and other large vendors. This is partly due to its more technically oriented marketing and sales execution, which does not always resonate with buyers that view data integration capabilities as a strategic foundation for their information management work. The lack of focus on service-oriented deployment, data federation and general-purpose ETL (Sybase ETL only supports Sybase IQ databases as a target) will continue to detract from the vendor's ability to compete well against the market leaders.
  • Customer references indicate gaps in various aspects of Sybase's metadata management and modeling, as well as in supporting complex implementations. However, nearly all of these customers were using Replication Server only (which has limited metadata management capabilities), without the benefit of PowerDesigner. Sybase continues to show substantial weakness in providing proof points for its bulk/batch data delivery capability, which is the most highly demanded style of data delivery in this market. This is demonstrated by customer references and Gartner client inquiries, which reflect extremely limited adoption of Sybase ETL. To address this issue, Replication Server v15.2 added specific techniques for bulk/batch data delivery, and Sybase intends to enhance this functionality in the upcoming v15.5 release. Because Sybase's data integration installed base is skewed so heavily toward Replication Server, customers rarely acknowledge data quality/governance capabilities (which Sybase provides via partnerships with DataFlux and IBM) as important in their implementations.
  • Sybase's decision to focus on the very targeted use of only two data delivery styles is somewhat at odds with longer-term trends in the data integration tools market. Organizations are increasingly seeking comprehensive suites of data delivery styles deployed in an enterprisewide fashion, and structured in the form of data services. Organizations that seek specific point solutions or more tactical and project-specific roles for data replication and ETL can benefit by following the Sybase strategy, but they should realize that the vendor may be slow to adapt to wider needs in the future.


Syncsort

Woodcliff Lake, New Jersey, U.S.

Products: DMExpress

Customer base: 700+

  • Syncsort continues to gain traction in the data integration tools market, with the customer base for DMExpress showing rapid growth during 2009. Contemporary demand for tools with a lower-cost footprint, short time to implementation and targeted functionality with ETL capabilities at the core is helping to fuel this growth. It is exactly these characteristics (lower price compared to the market leaders, ease of use, strong performance for ETL workloads) that Syncsort customer references cite as their main reasons for selecting DMExpress.
  • With 40 years of experience in high-performance data processing, sustained profitability and a large and loyal customer base, Syncsort has a solid base on which to grow its market presence. The recent and ongoing changes to Syncsort's management team (including a new CEO and sales leader) are increasing the operational and sales effectiveness of the organization. The vendor has a global presence, with approximately 40% of the installed base of DMExpress outside North America.
  • DMExpress users often have investments in tools from the market leaders or other competitive vendors, and they use Syncsort's technology to fine-tune the performance of end-to-end processes supported by such vendors. Most Syncsort customer references indicate that another provider's technology is their standard for data integration tools, and that DMExpress plays a role in augmenting or extending their primary investment. Recent partnerships with vendors offering extended functionality (for example, Attunity for CDC and legacy data source connectivity, and Trillium Software for data quality) should help Syncsort expand beyond its "faster ETL" niche. Overall experience with the vendor and product support are cited by customer references as additional strengths.

  • Syncsort has two major requirements it must address to improve its execution in the data integration tools market. Firstly, it must further expand its support for delivery styles beyond bulk/batch ETL. It does not directly support additional styles via its own technology; the Attunity partnership (for add-on real-time CDC functionality) is at a very early stage; and a relationship with Composite Software for data federation remains unproven in actual implementations. Secondly, while Syncsort will deliver expanded metadata management server capabilities in a forthcoming release, this functionality will provide the basics. Customer demand for richer modeling, metadata discovery and the dynamic leverage of metadata for optimized data flow continues to grow, creating an ongoing challenge for the vendor to keep pace. Customer references rate metadata functionality, real-time data delivery and service enablement below the market average across the competitive landscape.
  • In addition to delivering functional improvements, to compete well in the longer term Syncsort must expand its vision of data integration beyond just the physical bulk movement of data. The vendor must also address a broader set of customer demand trends beyond high performance (for example, the ability to deal with greater levels of complexity in business rules for data transformation, and the ability to understand and leverage less-structured data sources).
  • While Syncsort's customer references are very satisfied with the price points of the software, they express dissatisfaction with the complexity of the pricing model and the governance/auditing of licenses (license keys being linked to specific physical hardware assets, and the limitations/costs associated with changes in the hardware environment). The vendor plans to deliver capabilities for self-serve license keys to address some of these challenges. Customers also consistently indicate that the product documentation needs improvement.


Talend

Los Altos, California, U.S. and Suresnes, France

Products: Talend Open Studio, Talend Integration Suite, Talend Integration Suite MPx, Talend Integration Suite RTx

Customer base: 800+

  • Talend positions itself well, with both a subscription-based data integration product (Talend Integration Suite) and a freely downloadable open-source offering (Talend Open Studio), to appeal to different segments of the data integration tools market. Gartner has seen increased interest in both of Talend's offerings, and the vendor is gaining mind share in the market.
  • Almost unanimously, reference organizations report positive results with Talend's technology. This is a qualified response, in that price is always weighed against the depth of features/functions available, and organizations acknowledge that with Talend they tolerate less functionality than they would otherwise acquire in competitive tools. While Talend's marketing focuses on the open-source market in general, references rarely indicate that the open-source offering and its related cost model are the deciding factors for choosing the tool, although price is very much a driver of initial interest. Once that interest is established, Gartner believes the features and functionality of Talend's tools stand on their own merits — for either the open-source or the subscription version.
  • Good connectivity is reported, as well as significant compatibility in Java and open-source environments in general. Additional functionality is achieved using Talend's recently released profiling and data quality capabilities. Additionally, while version migration is reported as an issue in some cases, organizations say that moving from an open-source development environment to the licensed version requires no additional learning curve, and that product support is consistent during implementation.

  • Gartner frequently reports that brand recognition and viability (in the form of not only financial assets but also organizational strengths, customer base and the availability of skills) are an issue for new entrants to any market, and some customers tend to ignore the impacts. The consultant community is not yet well-versed in Talend's tools, making it difficult to find expertise, although Talend is increasing the number of its system integrator relationships. Limited brand recognition also limits the accumulated user base experience; as a result, users report many minor bugs that become bothersome in aggregate, even though individually they are not major issues. Also, relative to the user base and market acceptance, Talend has established a presence in Europe, the Middle East and Africa (EMEA) and North America, but has little presence so far in other global regions.
  • Talend is deployed with a strong bias toward bulk/batch-oriented data delivery as a data integration strategy, and more market experience is needed with other methods of data integration. In addition, end users report issues with the central repository, ranging from "slowness" when using it as a coordination point for development (Talend's latest version includes capabilities specifically aimed at addressing this issue) to erratic behavior in the persistence of code objects. To temper this, some organizations report good centralized management in production — including the deployment and management of remote services.
  • Clients report weak documentation and immature metadata capabilities, and note that they are expected to have Perl or Java experience/expertise. End users indicate a reliance on other open-source components (for example, Tomcat and MySQL), resulting in some version-migration issues. Some organizations report that the move from the open-source tool to the licensed tool creates pricing complexity — most likely due to the mix of open-source and licensed tools.

The Magic Quadrant is copyrighted 25 November 2009 by Gartner, Inc. and is reused with permission. The Magic Quadrant is a graphical representation of a marketplace at and for a specific time period. It depicts Gartner's analysis of how certain vendors measure against criteria for that marketplace, as defined by Gartner. Gartner does not endorse any vendor, product or service depicted in the Magic Quadrant, and does not advise technology users to select only those vendors placed in the "Leaders" quadrant. The Magic Quadrant is intended solely as a research tool, and is not meant to be a specific guide to action. Gartner disclaims all warranties, express or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

© 2009 Gartner, Inc. and/or its Affiliates. All Rights Reserved. Reproduction and distribution of this publication in any form without prior written permission is forbidden. The information contained herein has been obtained from sources believed to be reliable. Gartner disclaims all warranties as to the accuracy, completeness or adequacy of such information. Although Gartner's research may discuss legal issues related to the information technology business, Gartner does not provide legal advice or services and its research should not be construed or used as such. Gartner shall have no liability for errors, omissions or inadequacies in the information contained herein or for interpretations thereof. The opinions expressed herein are subject to change without notice.

Vendors Added or Dropped

We review and adjust our inclusion criteria for Magic Quadrants and MarketScopes as markets change. As a result of these adjustments, the mix of vendors in any Magic Quadrant or MarketScope may change over time. A vendor appearing in a Magic Quadrant or MarketScope one year and not the next does not necessarily indicate that we have changed our opinion of that vendor. This may be a reflection of a change in the market and, therefore, changed evaluation criteria, or a change of focus by a vendor.

Evaluation Criteria Definitions

Ability to Execute

Product/Service: Core goods and services offered by the vendor that compete in/serve the defined market. This includes current product/service capabilities, quality, feature sets, skills, etc., whether offered natively or through OEM agreements/partnerships as defined in the market definition and detailed in the subcriteria.

Overall Viability (Business Unit, Financial, Strategy, Organization): Viability includes an assessment of the overall organization's financial health, the financial and practical success of the business unit, and the likelihood of the individual business unit to continue investing in the product, to continue offering the product and to advance the state of the art within the organization's portfolio of products.

Sales Execution/Pricing: The vendor's capabilities in all pre-sales activities and the structure that supports them. This includes deal management, pricing and negotiation, pre-sales support and the overall effectiveness of the sales channel.

Market Responsiveness and Track Record: Ability to respond, change direction, be flexible and achieve competitive success as opportunities develop, competitors act, customer needs evolve and market dynamics change. This criterion also considers the vendor's history of responsiveness.

Marketing Execution: The clarity, quality, creativity and efficacy of programs designed to deliver the organization's message in order to influence the market, promote the brand and business, increase awareness of the products, and establish a positive identification with the product/brand and organization in the minds of buyers. This "mind share" can be driven by a combination of publicity, promotional, thought leadership, word-of-mouth and sales activities.

Customer Experience: Relationships, products and services/programs that enable clients to be successful with the products evaluated. Specifically, this includes the ways customers receive technical support or account support. This can also include ancillary tools, customer support programs (and the quality thereof), availability of user groups, service-level agreements, etc.

Operations: The ability of the organization to meet its goals and commitments. Factors include the quality of the organizational structure including skills, experiences, programs, systems and other vehicles that enable the organization to operate effectively and efficiently on an ongoing basis.

Completeness of Vision

Market Understanding: Ability of the vendor to understand buyers' wants and needs and to translate those into products and services. Vendors that show the highest degree of vision listen and understand buyers' wants and needs, and can shape or enhance those with their added vision.

Marketing Strategy: A clear, differentiated set of messages consistently communicated throughout the organization and externalized through the website, advertising, customer programs and positioning statements.

Sales Strategy: The strategy for selling product that uses the appropriate network of direct and indirect sales, marketing, service and communication affiliates that extend the scope and depth of market reach, skills, expertise, technologies, services and the customer base.

Offering (Product) Strategy: The vendor's approach to product development and delivery that emphasizes differentiation, functionality, methodology and feature set as they map to current and future requirements.

Business Model: The soundness and logic of the vendor's underlying business proposition.

Vertical/Industry Strategy: The vendor's strategy to direct resources, skills and offerings to meet the specific needs of individual market segments, including verticals.

Innovation: Direct, related, complementary and synergistic layouts of resources, expertise or capital for investment, consolidation, defensive or pre-emptive purposes.

Geographic Strategy: The vendor's strategy to direct resources, skills and offerings to meet the specific needs of geographies outside the "home" or native geography, either directly or through partners, channels and subsidiaries as appropriate for that geography and market.