Magic Quadrant for Data Warehouse Database Management Systems

7 March 2014 ID:G00255860
Analyst(s): Mark A. Beyer, Roxane Edjlali

VIEW SUMMARY

Entering 2014, the hype around replacing the data warehouse gives way to the more sensible strategy of augmenting it. New competitors have arisen, leveraging big data and cloud, while traditional vendors have invested — which will force improved execution from new technology companies.

Market Definition/Description

For the purposes of this analysis, we refer primarily to vendors/suppliers rather than product names, because that is how users most frequently refer to them. If a vendor markets more than one DBMS product that customers use as a data warehouse DBMS, we note this in the section specific to that vendor. This is especially important because the influence of the logical data warehouse (LDW, see Note 1 for a definition) has created a situation in which multiple repository strategies are now expected, even from a single vendor. Strengths and cautions relating to a specific offering or offerings, when noted by customers, are also made clear in the individual vendor sections.

For this Magic Quadrant, we define a DBMS as a complete software system that supports and manages a database or databases in some form of storage medium (which can include hard-disk drives, flash memory, solid-state drives or even RAM). Data warehouse DBMSs are systems that can perform relational data processing and can be extended to support new structures and data types, such as XML, text, documents, and access to externally managed file systems. They must support data availability to independent front-end application software, include mechanisms to isolate workload requirements (see Note 2), and control various parameters of end-user access within managed instances of the data.

A data warehouse is a solution architecture that may consist of many different technologies in combination (see Note 3). At the core, however, any vendor offering or combination of offerings must exhibit the capability of providing access to the files or tables under management by open access tools. A data warehouse is simply a warehouse of data, not a specific class or type of technology.

In 2014, this Magic Quadrant introduces non-relational data management systems for the first time. No specific rating advantage is given regarding the type of data store used (for example, DBMS, Hadoop Distributed File System [HDFS]; relational, key-value, document; row, column and so on). All vendors are expected to provide multiple solutions (although one approach is adequate for inclusion), each demonstrating maturity and customer adoption. Also, cloud solutions (such as platform as a service) are considered viable alternatives to on-premises warehouses.

A data warehouse DBMS is now expected to coordinate data virtualization strategies, and distributed file and/or processing approaches, to address changes in data management and access requirements.

There are many different delivery models, such as stand-alone DBMS software, certified configurations, cloud (public and private) offerings and data warehouse appliances (see Note 4). These are also evaluated together in the analysis of each vendor.

To be included in this document's analysis, any acquired DBMS product must have been part of the vendor's product set for the majority of the calendar year in question — in this case, 2013. If a product has been acquired from another vendor, customers consider the acquisition to be a separate product for at least six months; therefore, for an acquisition to be considered part of the acquiring vendor, it must have occurred before 30 June 2013. If a vendor or product was acquired after midyear, it is represented by a separate dot until publication of the following year's Magic Quadrant.

Magic Quadrant

Figure 1. Magic Quadrant for Data Warehouse Database Management Systems

Source: Gartner (March 2014)

Vendor Strengths and Cautions

1010data

1010data (www.1010data.com) was established 14 years ago as a managed service data warehouse provider with an integrated DBMS and business intelligence (BI) solution primarily for the financial sector, but also for the retail/consumer packaged goods, telecom, government and healthcare sectors. Over 500 customers use 1010data's database or solution — an increase of 125 customers from 2012.

Strengths
  • 1010data has a product and methodology for software and platform as a service. 1010data's clients can choose to "pool" their data for market analysis — a well-executed vision of cloud before cloud existed.
  • 1010data has a consistent management team and strong verticals, adding the retail and gaming verticals in 2013 to its existing financial services and banking.
  • Reference customers report positively on this company's speed, performance and scalability. 1010data has developed its own approach to very large data analysis, creating some advantages for its targeted customers.
Cautions
  • 1010data's customer base continues to be U.S.-centric, with a facility in the U.K. The new verticals have helped, but the company remains small — only a regional competitor with a highly focused market understanding.
  • Customers report that the user interface requires expert-level skills; 1010data says it will release new visualization tools in 1Q14. Customers also report an uneven quality of support, which often does not meet the desired response times.
  • Customers have concerns with missing functionality regarding complex database operations and integration with other data sources and systems.

Actian

Actian (www.actian.com) offers symmetric multiprocessing and massively parallel processing (MPP) products: the commercially licensed Vectorwise for analytic data warehouses (June 2012); and the general-purpose, open-source Ingres DBMS (May 2012). Actian acquired ParAccel and Pervasive Software in April 2013. Actian claims more than 150 customers for data warehousing; Ingres DBMS is more than 30 years old with approximately 10,000 customers.

Strengths
  • Actian made two key acquisitions (ParAccel and Pervasive Software) during 2013. The Pervasive acquisition holds the promise of high-speed operational analytics, and we believe both acquisitions will help customers address an embedded operational analytics market.
  • Actian has reference architectures with hardware vendors (such as Dell, Cisco and NetApp) to deliver infrastructure specifications for do-it-yourself platforms. Its DBMS products also run on different types of commodity hardware and a dedicated SKU from Cisco.
  • Actian's MPP offering is priced per node, but Actian also offers what it calls "right to deploy" licensing, whereby customers can deploy an unlimited number of nodes for a specific use case and a specified period of time.
  • Actian is achieving incremental revenue as some customers using Redshift migrate their cloud-deployed data warehouses to on-premises deployments.
Cautions
  • Some references report difficulty when migrating from Redshift to an on-premises deployment, particularly in setting up the warehouse for production.
  • Customers report a mix of issues for Vectorwise and ParAccel — manual sorting/indexes, dropped support for previously enabled functions (T-SQL), and commit cost. Issues with support for user-defined functions and weak management tools are also reported.
  • Customers report an inordinate number of software bugs (causing improper execution of functionality), which, together with missing functionality, limits their customer satisfaction.

Amazon Web Services

Amazon Web Services (AWS) (aws.amazon.com) offers Amazon Redshift, a data warehouse service in the cloud, AWS Data Pipeline (designed for orchestration with existing AWS data services) and Elastic MapReduce. AWS's Redshift data warehouse has more than 2,000 customers.

Strengths
  • Amazon has the highest customer satisfaction and experience rating in this year's survey.
  • Customers report its strengths as being fast to deploy, low cost and fully elastic. Customer expectations for time to deliver are becoming more demanding and AWS is positioned to take advantage of this change (one AWS customer complained that 40 minutes was too long for deployment).
  • Deployment rates are at more than 300 new analytics databases monthly. In addition to permanent warehouses (one reference customer reports a warehouse of more than 100 terabytes), many of the deployments are strategically small and intentionally short-lived — something that is specific to cloud demand.
Cautions
  • Redshift is a basic system. Customers report that modeling for multitenant scenarios needs better tooling. A lack of more advanced functionality is also reported (for example, stored procedures and integrity constraints).
  • Combining on-premises data with cloud-managed data is awkward. Customers report that they use Internet upload, but shift to AWS Import/Export when Internet upload is too slow (a minimal sketch of the typical load path follows this list).
  • AWS's view of data warehousing is very traditional and, as such, will begin to compete with traditional vendors more directly. It is not possible to estimate long-term customer loyalty, given that availability started in November 2012 with a formal launch in February 2013.
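
As a rough illustration of the load path described in the caution above (staging an on-premises extract to Amazon S3 and issuing a COPY into Redshift), the following sketch uses Python with the psycopg2 driver. The cluster endpoint, table, bucket and credentials are illustrative placeholders, not drawn from any reference deployment.

    # Minimal sketch: load an on-premises extract, staged in S3, into Amazon Redshift.
    # The endpoint, table, bucket and credentials are illustrative placeholders.
    import psycopg2  # Redshift exposes a PostgreSQL-compatible endpoint (default port 5439)

    conn = psycopg2.connect(
        host="examplecluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439,
        dbname="analytics",
        user="admin",
        password="********",
    )
    conn.autocommit = True

    copy_sql = """
        COPY sales_staging
        FROM 's3://example-bucket/extracts/sales_2013.csv.gz'
        CREDENTIALS 'aws_access_key_id=<key>;aws_secret_access_key=<secret>'
        CSV
        GZIP;
    """

    with conn.cursor() as cur:
        cur.execute(copy_sql)  # bulk load from S3 into the staging table
        cur.execute("SELECT COUNT(*) FROM sales_staging;")
        print("rows loaded:", cur.fetchone()[0])

    conn.close()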

Cloudera

Cloudera (www.cloudera.com) provides a data storage and processing platform based upon an Apache Hadoop open-source software framework, as well as proprietary system and data management tools for design, deployment, operation and production management. Cloudera offers retail licensing and various annual subscriptions; it currently has just over 1,000 customers.

Strengths
  • Cloudera can process unstructured data (that is, unfamiliar schemas), structured data (that is, familiar and well-understood schemas) and data types that are in between (such as XML) with a variety of solutions, such as batch processing (MapReduce), interactive SQL (Impala) and text search (Cloudera Search); a minimal Impala query sketch follows this list.
  • Examples of Cloudera's capabilities include the training and scoring of predictive models via push-down to SAS and R (programming languages), and a wide array of prepackaged machine-learning libraries.
  • Reference customers report high confidence in Cloudera's personnel, their specific skills in deploying Hadoop distributions, and in the Cloudera-developed intellectual property (IP) — Impala, for example.
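
As a rough illustration of the interactive SQL path mentioned in the strengths above, the following sketch queries an Impala daemon from Python using the open-source impyla client. The host, port and table are assumptions, and a secured cluster would also require authentication settings.

    # Minimal sketch: interactive SQL over data in HDFS via Impala, using the impyla client.
    # The host, port and table name are illustrative; secured clusters also need auth settings.
    from impala.dbapi import connect

    conn = connect(host="impala-daemon.example.com", port=21050)  # HiveServer2-compatible port
    cur = conn.cursor()

    # Aggregate web log events stored in HDFS and exposed as an Impala/Hive table.
    cur.execute("""
        SELECT status_code, COUNT(*) AS hits
        FROM web_logs
        GROUP BY status_code
        ORDER BY hits DESC
    """)

    for status_code, hits in cur.fetchall():
        print(status_code, hits)

    cur.close()
    conn.close()
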
Cautions
  • Cloudera is vying for a spot from which to lead market execution in the era of big data. However, the megavendors have developed competing capabilities, have greater R&D funds or will acquire the technology.
  • Cloudera entered this market with a professional services delivery model in 2008, and shifted to an IP model in 2011. At the same time, it is moving forward with its "data hub" alternative to traditional data warehousing (a form of LDW). This diverse model will challenge Cloudera's delivery bandwidth.
  • Customer references report some issues with security (for example, Sentry). Missing functionality is also reported (such as missing secondary indexes) and skills are difficult to locate in the market (for example, system administration and qualified engineers).

Exasol

Exasol (www.exasol.com) is a small DBMS vendor based in Nuremberg, Germany, and has been in business since 2000. Its first in-memory column-store DBMS, EXASolution, became available in 2004. EXASolution is still used primarily as a data mart for analytic applications, but also occasionally in support of LDWs. Exasol reports 45 customers.

Strengths
  • Customers continue to praise Exasol for its performance without requiring any tuning, which leads to a lower cost of ownership.
  • Exasol is in the early stages of supporting LDW-style deployments, offering connectors for data virtualization to various data sources including HDFS and Java Database Connectivity (JDBC), and support of user-defined functions in Lua, Python and R. It delivered new in-database data mining capabilities, expressed in SQL, in 2013.
  • Exasol has entered new geographies such as Israel, Brazil and Poland through partnerships and joint ventures, which has helped in growing its installed base from 38 to 45 customers.
Cautions
  • Exasol will face greater competitive pressure as all the major vendors now also offer in-memory DBMS (IMDBMS) capabilities. Moreover, Exasol is squarely positioned on analytical scenarios, whereas other IMDBMSs are attempting to bridge across operational DBMS use cases.
  • Exasol reference customers continue to report missing functionality for administration and monitoring, as well as challenges for managing version upgrades.
  • Exasol is beginning to address a previously conservative go-to-market strategy. It is present in 11 countries and its growth remains slow outside of Germany despite its positive product track record.

HP

HP's (www.hp.com) portfolio is anchored by Vertica, a column-store analytic DBMS it acquired in 2011. Vertica is delivered as software for standard platforms (excluding Windows), and as a Community Edition (free for up to 1TB of data and three nodes). Also available are the HP AppSystems for Vertica, cloud versions (via HP Cloud Services and Amazon Machine Image offerings), and Data Warehouse On Demand — a hosted, managed service. HP Factory Express (in predefined certified configurations) is available for Vertica. HP reports more than 2,500 Vertica customers across all channels.

Strengths
  • Reference customers specifically mention their satisfaction with price/performance and overall value. They also report deployments ranging up to the petabyte scale — further indication of performance.
  • HP has gained on the leading vendors in execution during a year when those vendors were pulling away from almost everyone else. Customer count is up 11% as of November 2013.
  • HP's HAVEn, which combines security, Hadoop and unstructured data via Autonomy, is an innovative platform vision for analytics that HP plans to make available via its development partners.
Cautions
  • According to our inquiries, Vertica continues to have only limited visibility in competitive situations with Gartner end users — despite its good product offering.
  • HP's customers indicate that the delivery model, which claims tight coordination of professional services support and the community, is fragmented and not easily leveraged.
  • HP Vertica's reference customers say it is lacking in capabilities for database management and administration, but customers are uncertain whether they should expect such administrative tools when using a column database.

IBM

IBM (www.ibm.com) offers stand-alone DBMS solutions, as well as data warehouse appliances and a z/OS solution. Its various appliances include the IBM zEnterprise Analytics System, PureData System for Analytics, the IBM DB2 Analytics Accelerator (IDAA), IBM Smart Analytics System and others. IBM offers data warehouse managed services and professional services.

Strengths
  • IBM has delivered products that support the LDW and is a leader in execution. A rearchitected PureData and the new BLU Acceleration (an IMDBMS capability) were released in 2013. IBM reports increased market share in the two most recently completed quarters.
  • IBM offers all five form factors for data warehouses: software only, managed services, appliances, cloud and reference architectures. IBM partnerships and channels are highly prolific in terms of ability to provide local support. IBM utilizes direct staff, distributors and partners.
  • References focus on PureData's features, specifically mentioning ease of implementation, and are confident in the future of the platform. The analytics accelerator (IDAA) for System z is also praised.
Cautions
  • Gartner inquiry data does not show any increase in IBM's competitive presence. At the same time, IBM reports new customer wins for data warehouse products. This makes IBM's current ability to grow outside its already very large customer base unclear.
  • IBM has, for the past four years, guided the market into an LDW vision and offered an approach to the architecture. However, during the past eighteen months other vendors have been offering their own solutions, which are being received well by implementers in the field.
  • IBM's product marketing makes it difficult to determine which solution is fit for which purpose. Examples include the renaming of Netezza to PureData System for Analytics (customers still call it Netezza) and a lack of clarity between BLU and PureData System for Analytics.

InfiniDB (formerly Calpont)

Calpont, a private corporation based in Frisco, Texas, U.S., renamed itself InfiniDB (www.infinidb.co) on 10 February 2014. The company launched its InfiniDB offering, an analytic columnar DBMS platform with parallel query performance, in February 2010. It currently has fewer than 50 named customers. This is a software solution — InfiniDB does not offer appliances, but it does work with partners offering prebuilt solutions.

Strengths
  • InfiniDB is an open-source solution offered with a community edition based on MySQL. Customers state that it is easy to adopt InfiniDB and, later, the enterprise edition to obtain more complete administration and management capabilities.
  • Customers report high levels of satisfaction, specifically citing that deploying analytics on MySQL datasets is significantly improved.
  • As of version 4.0, InfiniDB offers direct access into HDFS — using this as the file system for its database. InfiniDB positions itself as a direct competitor to Cloudera's Impala or Pivotal's Hawq.
Cautions
  • InfiniDB's overall viability is questionable. With a small customer count and a significant interest in R&D, it is best classified as a startup — except that it is almost 20 years old.
  • So far, InfiniDB references indicate that most deployments remain small — suggesting they are mainly used as data marts loaded in batch mode. The enterprise edition price is high, according to its customers.
  • Customers cite query and load/update issues. A query cache (similar to what MySQL has) is lacking. Bulk loading is the recommended process, and customers specifically advise avoiding any form of inserts and updates. A lack of modeling tools and administrative capability is also cited.

Infobright

Infobright (www.infobright.com) is a global company with a steadily increasing customer base for its column-vectored, highly compressed DBMS. With open-source (Infobright Community Edition [ICE]) and commercial (Infobright Enterprise Edition [IEE]) versions, the company also announced a new Infopliance database appliance in September 2012. Infobright reports 450 customers, a 33% increase over its 2012 numbers.

Strengths
  • Straightforward pricing, ease of implementation and availability as an appliance are all considered strong points by its customers.
  • Customers report that they specifically recommend using the column capability to complement open-source, row-based databases for analytics. The resulting speed and performance are advantageous.
  • A successful niche focus: as in previous years, Infobright continues to focus on machine-generated data in specific industries such as telecom. It has seen the size of initial deployments grow from 10TB two years ago to 20TB today.
Cautions
  • More vendors are including a Hadoop distribution and using in-memory databases that address machine data well, which is Infobright's best market.
  • Most organizations report that only a small number of users access the Infobright warehouse in a given month, and that those users perform infrequent reporting and analysis.
  • Customers report that Infobright is hard to scale and that ad hoc data management (that is, inserts, updates and deletes) is challenging. The learning curve to optimize the system is steep, and some of the functionality to support optimization is lacking (for example, "query: explain" is absent).

Kognitio

Kognitio (www.kognitio.com) started out offering Whitecross in 1992 and a managed service in 1993. It now has customers using the Kognitio Analytical Platform either as an appliance, a data warehouse DBMS engine, data warehousing as a managed service (hosted on hardware located at Kognitio's sites or those of its partners), or as a data warehouse platform as a service using AWS. Kognitio has fewer than 50 customers.

Strengths
  • Kognitio's new customers chose its cloud deployment 64% of the time. In our big data adoption survey, cloud computing was identified as the top technology for getting value from big data, so Kognitio's decision to pursue this market is a positive change in its go-to-market strategy.
  • Kognitio offers analytics on top of Hadoop, as well as analytical "sandboxes" supporting data science labs. Its overall approach for these sandboxes is supported by Hadoop integration and by in-database integration of R and Python commands. This broad mix of solutions increases its overall vision.
  • Reference customers continue to praise Kognitio for its performance. Kognitio is an in-memory database that has been in the market for many years and can provide an alternative to solutions from megavendors.
Cautions
  • Kognitio has too many options for its small customer base. Big data is driving some analytics to the cloud, but traditional warehouse customers are not there yet.
  • Kognitio is losing some of its differentiation. It was a pioneer of in-memory databases and one of the first vendors to offer data warehouse "as a service." It is facing greater competition from new entrants and entrenched large vendors.
  • Despite having a good product and innovative pricing strategies, Kognitio's customer base is small and regional, and its customers indicate that they are concerned about its overall viability (due to its size).

MarkLogic

MarkLogic (www.marklogic.com) was founded in 2001 and offers a NoSQL database that utilizes XML storage and provides a strong metadata-driven semantic access management layer. It currently has more than 240 customers.

Strengths
  • The database is built to scale on commodity hardware and provides full-text, faceted search and atomicity, consistency, isolation and durability (ACID)-compliant transactional updates. Web services and SQL access are supported (a minimal REST access sketch follows this list). Indexes of stored data are maintained simultaneously with the data. MarkLogic can read or ingest HDFS data and, once the data is ingested, applies the same indexing structure and emulates MapReduce-style optimized processing.
  • MarkLogic has a broad customer base, including Global 2000 companies as well as smaller organizations.
  • Customers report MarkLogic's strengths as scalability and the capability to use tightly structured schemas, or to treat assets as schemaless and access them through semantic capability.
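
As a rough illustration of the web services access noted in the strengths above, the following sketch inserts an XML document and runs a full-text search through MarkLogic's REST API using Python's requests library. The host, port, credentials and document URI are assumptions; MarkLogic typically protects a REST application server with digest authentication.

    # Minimal sketch: document ingest and full-text search via MarkLogic's REST API.
    # The host, port, credentials and URIs are illustrative placeholders.
    import requests
    from requests.auth import HTTPDigestAuth

    BASE = "http://marklogic.example.com:8000"      # assumed REST application server
    AUTH = HTTPDigestAuth("rest-user", "********")

    # Insert (or overwrite) an XML document at a chosen URI.
    doc = "<order><id>1001</id><status>shipped</status></order>"
    resp = requests.put(
        BASE + "/v1/documents",
        params={"uri": "/orders/1001.xml"},
        data=doc,
        headers={"Content-Type": "application/xml"},
        auth=AUTH,
    )
    resp.raise_for_status()

    # Full-text search across ingested documents, returned as JSON.
    resp = requests.get(
        BASE + "/v1/search",
        params={"q": "shipped", "format": "json"},
        auth=AUTH,
    )
    resp.raise_for_status()
    print(resp.json()["total"], "matching documents")
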
Cautions
  • A low customer count after 13 years is not always negative; it can be argued that MarkLogic was initially ahead of the demand. Heterogeneous access is a latent demand and semantic access is one solution; however, it is unclear that the current demand for semantic access will translate into growth.
  • Customers report that the learning curve is steep and the market lacks available skills. MarkLogic customers also report low scores for overall customer experience, citing version upgrade issues (for example, no rollback, or bugs).

Microsoft

Microsoft (www.microsoft.com) markets SQL Server 2012 (Service Pack 1 has been available since November 2012), a reference architecture and the Parallel Data Warehouse appliance. Microsoft does not report customer or license counts. Gartner estimates that Microsoft's relational DBMS revenue grew 13.6% during 2013 — faster than the overall market.

Strengths
  • Microsoft offers appliances, reference architectures covering a variety of hardware, prebuilt offerings configured to customer selections and delivered ready to run, software licensing, and managed-service data warehouses.
  • Customers report a low count of software issues, above-average customer experience and obvious interoperability with Excel (and Office). They also like the easy-to-understand licensing and pricing — adding to execution.
  • Customers are predominantly on the current release, and almost 60% of customers report it is their data warehouse standard. Microsoft has taken steps in pursuing the LDW with HDInsight (HDP for Windows), PolyBase and Microsoft Cloud (Windows Azure Infrastructure Services can be used to deploy a data warehouse); a minimal PolyBase-style query sketch follows this list.
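
As a rough illustration of the PolyBase direction noted in the strengths above, the following sketch joins an assumed PolyBase external table (backed by data in HDFS) to an ordinary relational table from Python via pyodbc. The server, credentials, and table and column names are assumptions, and the external table itself would first be defined with PolyBase's external data source and external table DDL.

    # Minimal sketch: join relational data to an assumed PolyBase external table over HDFS.
    # The server, database, credentials and table/column names are illustrative placeholders;
    # the external table (web_clicks_hdfs) is presumed to have been created via PolyBase DDL.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={SQL Server Native Client 11.0};"
        "SERVER=pdw-appliance.example.com;"
        "DATABASE=edw;"
        "UID=analyst;PWD=********"
    )

    query = """
        SELECT c.region, COUNT(*) AS clicks
        FROM web_clicks_hdfs AS w   -- external table backed by HDFS
        JOIN dim_customer AS c      -- ordinary relational dimension table
          ON w.customer_id = c.customer_id
        GROUP BY c.region
        ORDER BY clicks DESC;
    """

    cursor = conn.cursor()
    for region, clicks in cursor.execute(query):
        print(region, clicks)

    conn.close()
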
Cautions
  • Microsoft is catching up with the other leaders, but a fast-follower response to market demand still drives the Microsoft road map. However, Microsoft has demonstrated its willingness to be aggressive in certain areas (such as unstructured data via SharePoint search and Azure).
  • Organizations report large volumes of data but, in general, Microsoft data warehouses have a small number of users — better examples of scaling warehouses are needed. Customers want easier access to usable metadata for heterogeneous environments.
  • Reference customers still report a significant cost advantage, but inquiries indicate that even small price increases do matter and Microsoft needs to maintain its price differentiation from other vendors.

Oracle

Oracle (www.oracle.com) offers a range of products. Customers can choose to build a custom warehouse, deploy a certified configuration of Oracle products on Oracle-recommended hardware, or buy an appliance. Oracle has over 390,000 DBMS customers worldwide. We estimate that about 4,000 Exadata appliances have been sold.

Strengths
  • Oracle continues to be the relational DBMS market share leader (Gartner estimates it held over 42% of the market in 2013) and demonstrates good execution in the data warehouse market. Customers mention Oracle's strong overall viability.
  • Oracle has clarified its LDW message and the role of its Big Data Appliance, and improvements were deployed for administrative support and instance consolidation (for example, pluggable databases). Its LDW adoption rate is equal to the market average.
  • Oracle is consistently shortlisted in Gartner data warehouse competitive inquiries. In 2013, however, we saw an increase in the number of customers introducing competitors during scaling challenges (as the warehouse grows), but most decided to stay with Oracle.
Cautions
  • Oracle has announced in-memory columnar capability (at Oracle Open World in 2013), but this has yet to be delivered.
  • A third of the reference customers surveyed indicated pricing and licensing as being an issue. Oracle scored low in the survey on perceived value for cost.
  • An overall low rating for customer experience does not seem to affect the customers' intention to buy more from this vendor (two-thirds of survey respondents indicated current plans to buy more from Oracle).

Pivotal (Greenplum)

Pivotal (www.gopivotal.com) became an independent entity on 1 April 2013. The new organization carries assets of EMC (that is, Pivotal Labs and Greenplum, including Pivotal HD) and VMware (vFabric, Cloud Foundry, GemStone, GemFire, SQLFire and Cetas). Additionally, EMC, VMware and GE are investors. Greenplum DB and Pivotal HD are addressed in this analysis.

Strengths
  • Pivotal has a strong vision to integrate its products to form the Pivotal Data Platform — supporting operational use cases combined with analytics (with HDFS) for common persistence.
  • Pivotal has addressed the requirements of big data with specific investments in data virtualization across its data fabric, dedicated metadata management across its stack, and investment in high-end analytics such as MADlib or natural-language processing. Its funding model supports its ambitious R&D plans.
  • Customers report overwhelmingly that speed is the No. 1 benefit from Pivotal and that they utilize it extensively for complex analysis when combining diverse and large datasets across the range of information types.
Cautions
  • Pivotal's objectives are ambitious and may be ahead of the overall market demand. It may also be ahead of its existing customer base. Its strong vision may be ahead of revenue opportunities.
  • There is confusion about the overall positioning of Pivotal in the enterprise data warehouse market. Pivotal's strong vision for future trends causes a perception issue in the traditional data warehouse space.
  • In the reference survey, clients brought up challenges in ease of deployment and administration, which were occasionally compounded by customer support issues that hampered their execution.

SAP

SAP (www.sap.com) offers both SAP Sybase IQ and SAP Hana. SAP Sybase IQ was the first column-store DBMS. It is available as a stand-alone DBMS, a data warehouse appliance and on an OEM basis via system integrators. SAP Sybase IQ has more than 2,000 customers worldwide. SAP Hana became generally available in June 2011, and we estimate it has more than 2,000 customers.

Strengths
  • SAP offers a true IMDBMS solution that addresses traditional issues such as managing dual inputs, updates and deletions in separate row, column or hybrid systems. There is significant opportunity to reduce complexity, because the IMDBMS reduces the need to duplicate data between transactional and analytics systems.
  • SAP showed 38% growth in the DBMS market during 2013, primarily from Hana. Gartner inquiries and the survey on big data adoption show that customers consider Hana to be a viable part of their big data solution.
  • SAP Hana and Sybase IQ combined are the foundation for LDW implementations — also leveraging the complete suite of products, such as data services or replication server. SAP Hana has grown in maturity, with stronger high-availability and disaster recovery capabilities.
Cautions
  • Sybase IQ's growth appears to be shrinking. While Sybase IQ and Hana in combination can support the LDW, more traditional data warehouse customers are becoming concerned about the continued improvement of Sybase IQ — despite SAP's assurances regarding the importance of its role.
  • The SAP vision for data warehousing and analytics is to focus on an IMDBMS, with combined transactional and analytics data management in a single platform. SAP is fully committed to its vision; however, its customers are struggling to grasp this new context for a data warehouse. Many SAP customers will continue to also deploy SAP's competitors, so SAP must answer with customer education and continued positive experiences to increase customer confidence in its vision.
  • Client inquiries regarding SAP's data warehouse pricing options are increasing. SAP's no-discount strategy for SAP Hana is continuing to surprise customers used to high discount rates. As a result, SAP has developed ROI calculators for customers — to demonstrate the value of Hana. Success in the SAP Hana market demonstrates that clients are responding to these initiatives — overcoming pricing challenges and buying the technology.

Teradata

Teradata (www.teradata.com) has more than 30 years of history in the data warehouse market. It offers a combination of tuned hardware and analytics-specific database software, which includes the Teradata database (on various appliance form factors) and the Aster Database (as a DBMS, an appliance or via the cloud). It offers traditional and LDW solutions and reports over 1,200 customers, almost exclusively for data warehouses.

Strengths
  • Teradata continues to demonstrate its consistent ability to deliver on market trends, reliably meeting customer demand; for example, the Teradata Intelligent Memory option (Teradata's IMDBMS) was delivered in 2013.
  • The customer base continues to value Teradata and invest in its technology. The company sells in the traditional data warehouse market while also innovating in the broader market.
  • Teradata continues to further support the LDW with Unified Data Architecture, AsterData and Hadoop (all also offered in appliances), and continues to invest in multistructured formats such as JavaScript Object Notation (JSON) or XML.
Cautions
  • Teradata continues to be the "point of reference" vendor in data warehousing, but faces consistent market pressure from existing as well as new entrants.
  • Teradata will be pressured by large competitors that are starting to allow transactional and analytical processing on the same instance of data.
  • Reference clients cited the visibility of costs as a concern, affecting justification during purchasing approval processes. In 2013, Teradata stabilized overall customer experience concerns that had emerged during 2012.

Vendors Added and Dropped

We review and adjust our inclusion criteria for Magic Quadrants and MarketScopes as markets change. As a result of these adjustments, the mix of vendors in any Magic Quadrant or MarketScope may change over time. A vendor's appearance in a Magic Quadrant or MarketScope one year and not the next does not necessarily indicate that we have changed our opinion of that vendor. It may be a reflection of a change in the market and, therefore, changed evaluation criteria, or of a change of focus by that vendor.

Added

  • Amazon Web Services (specifically, Redshift and Elastic MapReduce)
  • Cloudera
  • MarkLogic
  • Pivotal (Greenplum)

Dropped

  • EMC — Has created the Pivotal entity and Pivotal (Greenplum) now appears on the Magic Quadrant.
  • ParAccel — This vendor was acquired by Actian in 2013.

Other Vendors to Consider

Gartner's Magic Quadrant process includes research on a wider range of vendors than appears in the published document. In addition to the vendors featured in this Magic Quadrant, Gartner clients sometimes consider the following vendors when their specific capabilities match the deployment needs (this list also includes recent market entrants with relevant capabilities). These vendors were not included in the Magic Quadrant because they failed to meet one or more of the inclusion criteria. Unless otherwise noted, the information provided on these vendors derives from responses to Gartner's initial request for information for this document or from reference survey respondents.

This list is not intended to be comprehensive:

  • 10Gen (MongoDB). MongoDB is an open-source NoSQL "document" database, using memory-mapped files to enhance read performance, which operates on commodity-class hardware (a minimal document access sketch follows this list). AWS, IBM and Rackspace offer preconfigured systems. Data integration tools such as Informatica, Pentaho and Talend offer native integration support. MongoLab and MongoHQ (cloud-hosted database as a service) provide professional services to support implementations for subscribers and include various features and functions as a value-add. Customer deployments include archiving at Craigslist and content management at MTV. MongoDB engineers host office hours in New York, San Francisco, Palo Alto and Atlanta, in the U.S.; London, in the U.K.; and Dublin, in Ireland.
  • BMMsoft. The BMMsoft EDMT solution includes the SAP Sybase IQ database as its data management base system and, because BMMsoft is an OEM supplier, it is not considered a stand-alone DBMS. Nevertheless, prospective clients should note that the product has unstructured data, content, email analytics and other capabilities in addition to the DBMS. BMMsoft is based in San Francisco, California, U.S.
  • Hitachi. The Hitachi Advanced Data Binder Platform (HADB) was first available in June 2012, and is an analytical appliance based on Hitachi storage and servers. It is mainly focused on the Japanese market with four production customers and more than 10 customers conducting proofs of concept — all in Asia/Pacific — as of November 2013. As a hardware and software provider, Hitachi offers HADB as a software solution, certified configuration or appliance. Initial customer reference calls report high performance, ease of implementation and high suitability for purpose. Hitachi participates in TPC-H benchmarks, with results in the 100TB category. (Note: Gartner does not report on or endorse TPC results, but does utilize them as an additional input to our research.)
  • Hortonworks. Located in Palo Alto, California, U.S., and founded in 2011, Hortonworks markets the Hortonworks Data Platform (HDP), derived entirely from the open-source Apache Hadoop stack. The company has taken a leading role in the development of Apache Hadoop, including facilitating improvements to fundamental query and processing components. To this end, Hortonworks has leveraged YARN (the cluster resource manager for Hadoop 2.0) to enable Hadoop to support data processing applications beyond batch-only operations. Hortonworks is participating in the "Stinger" initiative to advance Apache Hive for interactive query capabilities.
  • MapR Technologies. Located in San Jose, California, U.S., and founded in 2009, MapR Technologies offers a Hadoop distribution with storage optimizations, high-availability improvements, and administrative and management tools, and uses Network File System (NFS) instead of HDFS. The company's recent M7 release eliminates region servers (replacing them with automated region splits) and adds disaster recovery capabilities (data assurance and process recovery). MapR also markets a robust version of Apache HBase, the table-style NoSQL database built on several components from the Apache Hadoop stack. The company has a rich partner ecosystem and provides its products through multiple cloud infrastructure firms as well as on-premises. It offers training and education services.
  • Objectivity. This vendor, which has a lineage dating back to 1989 when it first began to offer an object-oriented database, now offers InfiniteGraph and Objectivity/DB (version 10.2.1). Objectivity reports a global sales presence with thousands of direct licensed customers as well as thousands of embedded licenses worldwide. Customers give it a mixed review: they praise capabilities such as multiple versions of objects and high-level SQL capabilities, but also indicate that some generic SQL functionality that should be present is lacking, resulting in a demand for specialized, product-specific skills. In 2014, Objectivity is focused on releasing additional product versions to address customer-requested features, functionality and performance. Objectivity is based in Sunnyvale, California, U.S.
  • ParStream. ParStream is a columnar, in-memory database offering a high-performance compression index on an MPP architecture. Version 3.0, released in 2014, provided broad support for SQL joins and highly distributed query processing (see "Cool Vendors in In-Memory Computing, 2013"). ParStream has raised $15.6 million in funding, and a recently established partnership with QlikView provides visual data analysis for large datasets in combination with time series data. However, ParStream did not have enough customers that were prepared to share information with Gartner about their usage of ParStream for this research. The company is based in Cologne, Germany, and Cupertino, California, U.S.
  • RainStor. RainStor 5.5 was released in June 2013; the product's first generally available release was in June 2008. The product can be deployed on-premises or in the cloud, and most of its more than 100 customers report use cases as a near-line, fully integratable data archive. It is integrated with the Teradata data warehouse and capable of moving data bidirectionally between the two environments. However, RainStor also provides full DBMS capability in a highly compressed file format that can hold multiple data types. Customer solutions in production include analytical and compliance archives running non-Hadoop environments and Hadoop distributions from Cloudera, Hortonworks, IBM and MapR; certified configurations via Dell, EMC and other hardware platforms; and it is also part of an offering for data retention and analytics from HP. RainStor's technology appears viable to support the LDW as a primary component. It is based in San Francisco, California, U.S.
  • XtremeData. This privately owned company targets organizations that need a massively scalable DBMS solution for mixed read and write workloads in the cloud — both public and private. Its dbX database is a multithreaded solution that scales to use all available processor capacity efficiently. Organizations benefit from low entry costs, fast time to market, elastic scalability, and a pay-for-use billing model. XtremeData is available on AWS and other clouds. Our information on this vendor was obtained from reference customers, from previous direct communications with XtremeData and from the ongoing research of Gartner analysts. XtremeData is based in Schaumburg, Illinois, U.S.
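
As a rough illustration of the document model described in the 10Gen (MongoDB) entry above, the following sketch stores and queries JSON-like documents with the PyMongo driver. The host, database, collection and field names are assumptions.

    # Minimal sketch: storing and querying schemaless documents with PyMongo.
    # The host, database, collection and fields are illustrative placeholders.
    from pymongo import MongoClient

    client = MongoClient("mongodb://mongo.example.com:27017/")
    events = client["analytics"]["click_events"]  # database "analytics", collection "click_events"

    # Documents in a collection need not share a fixed schema.
    events.insert_one({"user": "u42", "page": "/pricing", "ms": 187})
    events.insert_one({"user": "u42", "page": "/docs", "ms": 92, "referrer": "search"})

    # Simple query, then a server-side aggregation (average latency per page).
    for doc in events.find({"user": "u42"}):
        print(doc["page"], doc["ms"])

    pipeline = [{"$group": {"_id": "$page", "avg_ms": {"$avg": "$ms"}}}]
    for row in events.aggregate(pipeline):
        print(row["_id"], row["avg_ms"])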

Inclusion and Exclusion Criteria

To be included in this Magic Quadrant, vendors had to meet the following criteria:

  • Vendors must have DBMS software that has been generally available for licensing or supported download for at least a year (since 10 December 2012, so throughout 2013).
    • We use the most recent release of the software to evaluate each vendor's current technical capabilities. We do not consider beta releases. For existing data warehouses, and direct vendor customer references and reference survey responses, all versions currently used in production are considered. For older versions, we consider whether later releases may have addressed reported issues, but also the rate at which customers refuse to move to newer versions.
    • Product evaluations include technical capabilities, features and functionality present in the product or supported for download through 8:00 p.m. U.S. Eastern Daylight Time on 5 December 2013. Capabilities, product features or functionality released after this date can be included at Gartner's discretion and in a manner Gartner deems appropriate to ensure the quality of our research product on behalf of our nonvendor clients. We also consider how such later releases can reasonably impact the end-user experience.
  • Vendors must have generated revenue from at least 10 verifiable and distinct organizations with data warehouse DBMSs in production that responded to Gartner's approved reference survey questionnaire. Revenue can be from licenses, support and/or maintenance. Gartner may include additional vendors based on undisclosed references in cases of known use for classified but unspecified use cases. For this year's Magic Quadrant, the approved questionnaire was produced in English only.
    • Customers in production must have deployed data warehouses that integrate data from at least two operational source systems for more than one end-user community (such as separate business lines or differing levels of analytics).
  • Support for the included data warehouse DBMS product(s) must be available from the vendor. We also consider products from vendors that control or participate in the engineering of open-source DBMSs and their support. We also include the capability of vendors to coordinate data management and processing from additional sources beyond the DBMS, but continue to require that a DBMS meets Gartner's definition (see Notes 1 to 5 for defining parameters).
  • Vendors participating in the data warehouse DBMS market must demonstrate their ability to deliver the necessary services to support a data warehouse via the establishment and delivery of support processes, professional services and/or committed resources and budget.
  • Products that exclusively support an integrated front-end tool that reads only from the paired data management system do not qualify for this Magic Quadrant.

For details of our research methodology, see Note 6.

Evaluation Criteria

Ability to Execute

Ability to Execute is primarily concerned with the ability and maturity of the product and the vendor. Criteria under this heading also consider the product's portability, its ability to run and scale in different operating environments (giving the customer a range of options), and the plurality of viable offerings answering diverse market demands. Ability to Execute criteria are critical to customers' satisfaction and success with a product; therefore, customer references are weighted heavily throughout.

The product/service criterion represents two sets of market demands — ongoing traditional demand and emerging demand. The largest and most traditional portion of the analytics and data warehouse market is still dominated by the demand to support relational analytical queries over normalized and dimensional models (from simple trend lines through complex dimensional models). From the execution perspective, emerging demand for an LDW has lower emphasis than it does from the vision perspective. Data warehouses are increasingly expected to include repositories, data virtualization and distributed processing. In all cases, the technical attributes of the DBMS, as well as features and functionality built specifically to manage the DBMS when used as a data warehouse platform, are evaluated. Support and management of mixed workloads, high availability/disaster recovery, the speed and scalability of data loading, and support for new hardware and memory models are also included. We also consider the automated management and resources necessary to manage a data warehouse, especially as it scales to accommodate larger and more complex workloads. We compare the delivery of the announced product road map with the persistent demand for new functionality as represented in our overall client base. The use of new storage and hardware models is crucial to this criterion. Users expect a DBMS to become self-tuning, reducing the resources required to optimize the data warehouse, especially as mixed workloads increase.

Overall viability includes corporate aspects such as the skills of the personnel, financial stability, R&D investment, the overall management of an organization and the expected persistence of a technology during merger and acquisition activity. It also covers the company's ability to survive market difficulties (crucial for long-term survival). Vendors are further evaluated on their capability to establish dominance in meeting one or many discrete market demands.

Under sales execution/pricing we examine the price/performance and pricing models of the DBMS, and the ability of the sales force to manage accounts (judged by the feedback from our clients and feedback collected through the reference survey). We also consider the market share of DBMS software. Also included is the diversity and innovative nature of packaging and pricing models, including the ability to promote, sell and support the product within target markets and around the world. Aspects such as vertical-market sales teams and specific vertical-market data models are considered for this criterion.

Market responsiveness and track record is based upon the concept that market demands change over time and track records are established over the lifetime of a provider. The availability of new products, services or licensing in response to more recent market demands and the ability to recognize meaningful trends early in the adoption cycle are particularly important. The diversity of delivery models as demanded by the market is also considered an important part of this criterion (for example, its ability to offer appliances, software solutions, data warehouse as a service offerings or certified configurations).

Marketing execution includes the ability to generate and develop leads, channel development through Internet-enabled trial software delivery, and partnering agreements (including co-seller, co-marketing and co-lead management arrangements). Also considered are the vendor's coordination and delivery of education and marketing events throughout the world and across vertical markets, as well as increasing or decreasing participation in competitive situations.

The customer experience criterion is based primarily on customer reference surveys and discussions with users of Gartner's inquiry service during the previous six quarters. Also considered are the vendor's track record on proofs of concept, customers' perceptions of the product, and customers' loyalty to the vendor (this reflects their tolerance of its practices and can indicate their level of satisfaction). This criterion is sensitive to year-to-year fluctuations, based on customer experience surveys. Additionally, customer input regarding the application of products to limited use cases can be significant, depending on the success or failure of the vendor's approach in the market.

Operations covers the alignment of the vendor's operations, as well as whether and how this enhances its ability to deliver. Aspects considered include field delivery of appliances, manufacturing (including the identification of diverse geographic cost advantages), internationalization of the product (in light of both technical and legal requirements) and adequate staffing. This criterion considers a vendor's ability to support clients throughout the world, around the clock and in many languages. Anticipation of regional and global economic conditions is also considered.

Table 1. Ability to Execute Evaluation Criteria

Evaluation Criteria: Weighting

Product or Service: High
Overall Viability: Low
Sales Execution/Pricing: Medium
Market Responsiveness/Record: High
Marketing Execution: Medium
Customer Experience: High
Operations: Low

Source: Gartner (March 2014)

Completeness of Vision

Completeness of Vision encompasses a vendor's ability to understand the functions needed to develop a product strategy that meets the market's requirements, comprehends overall market trends, and influences or leads the market when necessary. A visionary leadership role is necessary for the long-term viability of both product and company. A vendor's vision is enhanced by its willingness to extend its influence throughout the market by working with independent third-party application software vendors that deliver data-warehouse-driven solutions (for BI, for example). A successful vendor will be able not only to understand the competitive landscape of data warehouses, but also to shape the future of this field with the appropriate focus of its resources for future product development.

Market understanding covers a vendor's ability to understand the market and shape its growth and vision. In addition to examining a vendor's core competencies in this market, we consider awareness of new trends such as the increased demand from end users for mixed data management and access strategies, the growth in data volumes, and the changing concept of the data warehouse and analytics information management (see also "State of Data Warehousing in 2013 and Beyond" and "The Future of Data Management for Analytics Is the Logical Data Warehouse").

Marketing strategy refers to a vendor's marketing messages, product focus, and ability to choose appropriate target markets and third-party software vendor partnerships to enhance the marketability of its products. For example, we consider whether the vendor encourages and supports independent software vendors in its efforts to support the DBMS in native mode (via, for instance, co-marketing or co-advertising with "value-added" partners). This criterion includes the vendor's responses to the market trends identified above and any offers of alternative solutions in its marketing materials and plans.

Sales strategy is an important criterion. It encompasses all channels and partnerships developed to assist with selling, and is especially important for younger organizations as it can enable them to greatly increase their market presence while maintaining lower sales costs (for example, through co-selling or joint advertising). This criterion also covers a vendor's ability to communicate its vision to its field organization and, therefore, to clients and prospective customers. Also included are pricing innovations and strategies, such as new licensing arrangements and the availability of freeware and trial software.

Offering (product) strategy covers the areas of product portability and packaging. Vendors should demonstrate a diverse strategy that enables customers to choose what they need to build a complete data warehouse solution. Also covered are partners' offerings that include technical, marketing, sales and support integration.

Business model covers how a vendor's model of a target market combines with its products and pricing, and whether the vendor can generate profits with this model — judging by its packaging and offerings. Additionally, we consider reviews of publicly announced earnings and forward-looking statements relating to an intended market focus. For private companies, and to augment publicly available information, we use proxies for earnings and new customer growth — such as the number of Gartner clients indicating interest in, or awareness of, a vendor's products during calls to our inquiry service.

Vertical/industry strategy reflects the fact that organizations continue to seek traditional data warehousing solutions aligned to their industries, and this strategy affects a vendor's ability to understand its clients. A measurable level of influence within end-user communities and certification by vertical industry standards bodies are of importance here.

Innovation is a major criterion when evaluating the vision of data warehouse DBMS vendors in developing new functionality, allocating R&D spending and leading the market in new directions. This criterion also covers a vendor's ability to innovate and develop new functionality in its DBMS, specifically for data warehouses. Also addressed here is the maturation of alternative delivery methods, such as infrastructure as a service and cloud infrastructures as well as solutions for hybrid premises-cloud and cloud-to-cloud data management support. Vendors' awareness of new data warehousing methodologies and delivery trends is also considered. Emerging strategies play an important part in this criterion, with growing demands for blending schema-on-write with schema-on-read solutions (for example, relational data with NoSQL) as well as the ability to directly access operational data via the data warehouse platform. Organizations are increasingly demanding data storage strategies that balance cost with performance optimization, so solutions that address aging and temperature of data will become increasingly important.

We evaluate a vendor's worldwide reach and geographic strategy by considering its ability to address customer demands in different global regions using its own resources or in combination with subsidiaries and partners.

Table 2. Completeness of Vision Evaluation Criteria

Evaluation Criteria: Weighting

Market Understanding: High
Marketing Strategy: Medium
Sales Strategy: Medium
Offering (Product) Strategy: Medium
Business Model: Low
Vertical/Industry Strategy: Low
Innovation: High
Geographic Strategy: Medium

Source: Gartner (March 2014)

Quadrant Descriptions

Leaders

The Leaders are finding new ways to disrupt each other, and their competition with each other is important. Some are focusing on following trends once they are established, while others are attempting to set the trends and thus the standard for others to follow.

The LDW, which is evolving into a new analytics data management platform, is being pursued by best-of-breed approaches and stack vendors alike. Leaders can participate in both approaches. Big data is becoming normal, and Leaders have also addressed the big data challenge — usually focusing on social and machine data — but do not limit themselves to specific sources of data. Because the data warehouse market is large in terms of revenue, high-stakes decisions are being made by all of them regarding road maps and market delivery. The emphasis for Leaders is to retain existing traditional customers and help them grow existing warehouses, while expanding their engagement by introducing big data and cloud capabilities and making sure their marketing strategy and messaging creates confidence for both the traditional and emergent ends of the market.

Gartner estimates that (as of the end of 2013) more than 85% of data warehouse demand is still highly traditional. While adding big data to the warehouse is greatly desired and has been pursued and achieved (especially in leading organizations), combining traditional warehousing with big data and/or putting the warehouse in the cloud is more significant in the leading 15% of the market. Thus, traditional data warehouse leaders are executing extremely well in the largest part of the market, while also pushing their own innovations forward.

The current market conditions are a direct result of Gartner's Nexus of Forces (that is, mobility, cloud, social and information), which is driving traditional solutions to expand and allowing the entry of new solutions. As the Leaders respond with new approaches, new messages and new features/functions, the gap between the Leaders and everyone else has become very wide in 2014. This gap represents the Leaders' capacity and commitment in responding to challenges to the status quo from new entrants, while delivering effectively for traditional customers who simply demand efficiency and robust operations.

Revenue alone does not determine a Leader. Almost 30 years of data warehouse experience (dating from before 1989, when the phrase was popularized) has taught the market that there are physical laws to the data universe. Engineering technology advances need to anticipate when those laws will threaten current performance and cost models. All of the Leaders continue to demonstrate this level of anticipation and flexibility, and this has widened the gap between this quadrant and all others with specific regard to execution.

Challengers

During 2013, the Challengers focused on delivering against highly specific demands, but each also demonstrated its own approach to delivery. The Challengers quadrant is composed primarily of alternative approaches to delivering a more traditional style of data warehouse and to dealing with analytics data management issues. This year, clouds and comprehensive analytics stacks have gained in emphasis. Importantly, the Challengers generally have adequate vision for their market approach and for how to expand their penetration.

The Challengers quadrant includes stable vendors with strong, established offerings, but an often singular vision. In 2013, it was possible for very strong traditional vendors to have high scores for Ability to Execute, but lower scores for Completeness of Vision. Challengers have presence in the data warehouse DBMS space, proven products and demonstrable corporate stability. They generally have a highly capable execution model. Ease of implementation, clarity of message and engagement with clients contribute to the success of these vendors.

Visionaries

To qualify as Visionaries, vendors must demonstrate that they have customers in production — in order to prove the value of their functionality and/or architecture. Our requirements for production customers and general availability for at least a year mean that Visionaries must be more than just startups with a good idea. Frequently, Visionaries will drive other vendors and products in this market toward new concepts and engineering enhancements.

It is not correct to think of the 2014 Visionaries as simply smaller companies, because many have demonstrated global capabilities. The demand for traditional solutions contrasts significantly with what our Visionaries offer this year. Visionaries in 2014 will push the market in new directions, but will sometimes have less appeal in the traditional space.

Niche Players

The Niche Players in 2013 saw new entrants challenge them and, in some cases, move swiftly past them. However, the overall broadening of vision in the market, with competing approaches, is actually creating openings for new vendors. Niche Players generally deliver a highly specialized product with limited market appeal. Frequently, a Niche Player provides an exceptional data warehouse DBMS product, but is isolated or limited to a specific end-user community, region or industry. Although the solution itself may sometimes have no limitations, adoption is limited.

This quadrant contains vendors in several categories:

  • Those with data warehouse DBMS products that lack a strong or a large customer base.
  • Those with a data warehouse DBMS that lacks the functionality of those of the Leaders.
  • Those with new data warehouse DBMS products that lack general customer acceptance or the proven functionality to move beyond niche status.

Niche Players typically offer smaller, specialized solutions that are used for specific data warehouse applications, depending on the client's needs.

Context

In 2014, traditional data warehouse vendors continue to face the challenge emerging from new processing techniques such as MapReduce/Hadoop distributions. These new techniques are often referred to as "big data solutions" in the popular press, and the hype positioning new approaches as replacements for traditional solutions has continued. In response, the multibillion-dollar vendors brought forward new capabilities and functionality from R&D efforts that had been under way for as long as the past four years.

Before data warehouse appliances, early data warehouses were deployed on backup servers and supported differently because they were deemed to be non-mission-critical (even as late as the early 2000s). These early warehouses were highly dependent upon personnel and skills. As data warehouses became important to organizations, they were moved to mission-critical status and the demand for more robust systems emerged — including appliances.

During 2013, newly emergent vendors promoting distributed processing on commodity-class hardware clusters faced the challenge of making their solutions enterprise-grade. Early adopters of MapReduce/Hadoop did deploy very large clusters, but maintained them manually, without the assistance of administrative tools and interfaces. Implementing organizations began to realize the true cost of these manually managed and maintained clusters, and it is now becoming clear that this shift was really a return to the earlier personnel- and skills-based model, one with sustainability issues.

As we progress toward 2016, this new generation of analytics developers and their managers will realize their implementations need advanced tools for production management and maintenance. Right now, market solutions are little more than a change in how the total cost of ownership is distributed — from tools to personnel. As the new offerings introduce hardware management, job control and development tools, users will drive big data-only solutions into the Trough of Disillusionment on Gartner's Hype Cycle and this trough will act as a "forge" that burns away inefficient offerings.

Over the next two to three years, the market will witness the fruition of the struggle (previously described in "The State of Data Warehousing in 2011") in which the industry will see all sorts of competing architectures and strategies for addressing data management for analytics. Acquisitions and business failures among new vendors will result in two or three emerging as viable companies in analytics data management.

The contenders will include newly adapted traditional vendors that have acquired or deployed new capabilities on their mature architectures; maturing new vendors (few of which will survive intact past 2016); and wide-area network providers (such as Cisco) that will pioneer new approaches for efficiently managing both data and processing across widely distributed, geographically dispersed information assets.

Market Overview

In 2013, certain observations were readily apparent from direct inquiries with Gartner clients as well as through vendor references.

  • Any traditional expansion of existing warehouses was called into question during upgrade cycles, scale-out demands for new analytics, or simply scale-up requirements as the warehouse grew. This created delays while honest debate raged in organizations regarding the "go forward" strategy. Ultimately, however, warehouse managers and architects determined that optimizing the existing traditional warehouse was the best immediate strategy.
  • Distributed processing on commodity clusters (that is, Hadoop, NoSQL, NewSQL) created more confusion than revenue, with only a fraction of revenue actually going toward newly emergent vendors compared with the value of the overall market. Clearly, while some organizations were shifting to experiment with new approaches, others were simply vacillating amid indecision.
  • Traditional vendors accelerated their pace of adopting new approaches and began to take advantage of their mature platforms to introduce workload management, parallelization, efficient update, load, system administration and infrastructure management systems. At the same time, these vendors began to use alternative file systems for data storage — other than just relational — with many adding HDFS as a storage platform. These same traditional vendors have almost all deployed some form of infrastructure or software "as a service" solution to assure their presence in the cloud and begin maturing their cloud-based revenue models.
  • Some new vendors traded the time needed to build out tools and software capabilities for earlier product releases supported by professional services. Others released somewhat more mature solutions more slowly, but lagged in immediate revenue. Data management for analytics has always had difficulty in finding the proper balance between services revenue and tools/software revenue. With the complication of cloud solutions, whether infrastructure as a service, platform as a service or SaaS, this balance is even more difficult to strike.
  • End users remain determined to exploit specific technology advances (for example, in-memory) to accelerate the performance of existing systems, as well as for new deployments or redeployments.
  • There is a nascent demand for hybrids of analytics and transactions in a single data instance, often epitomized by in-memory technology. While not yet a driver in the data warehouse or analytics data management markets, this hybrid database capability will start to exert its influence — first in applications and then, through trickle-down, in analytics during the next 10 years.

Gartner anticipates this confusion will continue well into the first half of 2014. At that time, the market will begin to strike a balance between the following forces.

  1. Offloading of historical data will demand lower-priced storage tiers, while still requiring that historical data remain accessible for in-depth, longer-term analysis. This will be one role of the new technologies and will retard the demand for bigger data warehouses based solely on storage; a minimal tiering sketch follows this list.
  2. The need to preprocess big datasets will demand architectures that combine data integration, data management for analytics and high-volume batch analysis. Traditional vendors will respond by enhancing their runtime management capabilities to combine disparate engineering approaches, either organically or through acquisition. This will increase the demand on traditional vendors for superior processing management and bigger processing capacity.
  3. New vendors will be pressured by their investors to achieve advantageous exits and ROI — increasing the pressure on private companies to sell. Their management teams will need to provide adequate justification for continued investment or for achieving higher revenue and margins.
  4. The demand for new data in analytics, and new combinations of data, will drive the organic growth of existing solutions and architectures.
  5. The taste for commodity hardware clusters, which provide a low first cost but are supported by long-term skills development, will persist — but will move into retail and healthcare providers and become stronger in government.
  6. Cloud data warehousing options are becoming viable alternatives for organizations, especially for greenfield implementations. They will lower barriers to entry, but do not necessarily guarantee cost savings.
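As a minimal illustration of the storage-tiering force in item 1, the following sketch moves rows older than a retention threshold from a "hot" table to a cheaper "cold" table while keeping all history queryable through one view. SQLite is only a stand-in, and the table names and 90-day threshold are hypothetical assumptions, not any vendor's mechanism.

```python
# Minimal sketch of age-based offloading ("data temperature"): rows older than
# a retention threshold move from a hot table to a cheaper cold tier, while a
# single view keeps all history queryable for longer-term analysis. SQLite is
# a stand-in; table names and the 90-day threshold are hypothetical.
import sqlite3
from datetime import date, timedelta

HOT_RETENTION_DAYS = 90
cutoff = (date.today() - timedelta(days=HOT_RETENTION_DAYS)).isoformat()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders_hot (order_id INTEGER, order_date TEXT, amount REAL)")
conn.execute("CREATE TABLE orders_cold (order_id INTEGER, order_date TEXT, amount REAL)")
conn.executemany("INSERT INTO orders_hot VALUES (?, ?, ?)",
                 [(1, "2013-01-15", 50.0), (2, date.today().isoformat(), 75.0)])

# Offload: copy aged rows to the cold tier, then remove them from the hot tier.
conn.execute("INSERT INTO orders_cold SELECT * FROM orders_hot WHERE order_date < ?", (cutoff,))
conn.execute("DELETE FROM orders_hot WHERE order_date < ?", (cutoff,))

# Longer-term analysis still sees both tiers through one logical view.
conn.execute("CREATE VIEW orders_all AS "
             "SELECT * FROM orders_hot UNION ALL SELECT * FROM orders_cold")
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders_all").fetchone())  # (2, 125.0)
```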

Taken individually, each of these forces appears to drive the market in one direction or another. Taken together, they suggest that the traditional vendors will begin acquiring new technologies — providing relief to investors and wider capabilities to their customers. The warehouse will change into a new and better form: a management environment that integrates all data types and provides adequate processor management to perform different types of processing in parallel.

Traditional vendors with professional services units will form specialized delivery teams that build out low-cost commodity clusters integrated with smaller, but higher-value, solutions based on their existing platforms. Finally, any new vendors that cross the line from commodity deployment to higher-priced solutions (most likely as reference architectures with included services) will see slow adoption as they lose the appeal of being low-cost alternatives under a high-cost-of-delivery model, and will be relegated to niche status in the market.

Acronym Key and Glossary Terms

AWS Amazon Web Services
BI business intelligence
HDFS Hadoop Distributed File System
IMDBMS in-memory database management system
IP intellectual property
LDW logical data warehouse
MPP massively parallel processing

Note 1
Logical Data Warehouse Definition

The LDW is a new data management architecture for analytics, combining the strengths of traditional repository warehouses with alternative data management and access strategies. It has seven major components (a brief sketch of the data virtualization component follows this list):

  • Repository management
  • Data virtualization
  • Distributed processes
  • SLA management
  • Auditing statistics and performance evaluation services
  • Taxonomy and ontology resolution
  • Metadata management
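Of these components, data virtualization can be illustrated with a minimal sketch: one query interface resolves results at runtime from two disparate sources (a relational table and a CSV-style feed) without first copying either into a single repository. SQLite and the in-memory feed are stand-ins for real sources, and all names are hypothetical.

```python
# Minimal data virtualization sketch: a single query interface resolves results
# at runtime from two disparate sources (a relational table and a CSV-style
# feed) without copying either into one repository. All names are hypothetical.
import csv
import io
import sqlite3

# Source 1: a relational repository.
repo = sqlite3.connect(":memory:")
repo.execute("CREATE TABLE customers (customer_id INTEGER, name TEXT)")
repo.execute("INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex')")

# Source 2: an external file-style feed (a stand-in for HDFS files or a document store).
feed = io.StringIO("customer_id,order_total\n1,100.0\n1,40.0\n2,75.0\n")
orders = list(csv.DictReader(feed))

def customer_totals_view():
    """Resolve a 'logical' view at query time by federating both sources."""
    totals = {}
    for row in orders:
        key = int(row["customer_id"])
        totals[key] = totals.get(key, 0.0) + float(row["order_total"])
    for customer_id, name in repo.execute("SELECT customer_id, name FROM customers"):
        yield name, totals.get(customer_id, 0.0)

print(list(customer_totals_view()))  # [('Acme', 140.0), ('Globex', 75.0)]
```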

Note 2
Expected Workloads

For the purposes of this evaluation, the workloads we expect to be managed by a data warehouse include batch/bulk loading, structured query support for reporting, views/cubes/dimensional-model maintenance to support online analytical processing, real-time or continuous data loading, data mining, significant numbers of concurrent access instances (primarily to support application-based queries at the rate of hundreds if not thousands per minute) and management of externally distributed processes.
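Two of these workloads, batch/bulk loading and real-time or continuous loading, can be contrasted with a minimal sketch. SQLite again stands in for the warehouse DBMS; the table name, row counts and commit pattern are illustrative assumptions only.

```python
# Minimal sketch contrasting batch/bulk loading with continuous (trickle)
# loading into the same table. SQLite stands in for the warehouse DBMS; the
# table name, row counts and commit pattern are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_id INTEGER, payload TEXT)")

# Batch/bulk load: many rows in one set-based operation, one transaction.
bulk_rows = [(i, "batch-%d" % i) for i in range(10000)]
with conn:
    conn.executemany("INSERT INTO events VALUES (?, ?)", bulk_rows)

# Continuous load: rows arrive individually and are committed as they arrive,
# trading bulk throughput for data freshness.
def continuous_load(stream):
    for event_id, payload in stream:
        with conn:
            conn.execute("INSERT INTO events VALUES (?, ?)", (event_id, payload))

continuous_load([(10001, "stream-a"), (10002, "stream-b")])
print(conn.execute("SELECT COUNT(*) FROM events").fetchone()[0])  # 10002
```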

Note 3
Gartner's Definition of a Data Warehouse

A data warehouse is a collection of data in which two or more disparate data sources can be brought together in an integrated, time-variant information management strategy. Its logical design includes the flexibility to introduce additional disparate data without significant modification of any existing entity's design.

A data warehouse can be much larger than the volume of data stored in the DBMS, especially in cases of distributed data management. Gartner clients report that 100TB warehouses often hold less than 30TB of actual source data (that is, SSED; see Note 5).

Note 4
Gartner's Definition of a Data Warehouse Appliance

A data warehouse appliance consists of a DBMS mounted on specified server hardware with an included storage subsystem specifically configured for analytics-use-case performance characteristics. In addition, a single point of contact for support of the appliance is available from the vendor, and the pricing for the appliance does not include separate prices for the hardware, storage and DBMS components.

Note 5
Data Warehouse Data Volumes

The data warehouse volume managed in the DBMS can be of any size. For the purpose of measuring the size of a data warehouse database, we define data volume as SSED (source system extracted data), excluding all data-warehouse-design-specific structures (such as indexes, cubes, stars and summary tables). SSED is the actual row/byte count of data extracted from all sources. The sizing definitions of traditional warehouses are as follows (a brief sizing sketch follows the list):

  • Small data warehouse — less than 5TB
  • Midsize data warehouse — 5TB to 40TB
  • Large data warehouse — more than 40TB
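The sketch below illustrates the measurement: only the raw bytes of source-extracted rows count toward the band, while indexes, cubes and summary tables are excluded. The byte figures are hypothetical, chosen only to echo the 100TB-allocated versus sub-30TB-SSED pattern noted in Note 3.

```python
# Brief sketch of the SSED measurement: only the raw bytes of source-extracted
# rows count toward the size band; indexes, cubes and summary tables do not.
# The byte figures are hypothetical, chosen to echo the 100TB-allocated versus
# sub-30TB-SSED pattern mentioned in Note 3.
TB = 1024 ** 4

def warehouse_size_band(ssed_bytes):
    if ssed_bytes < 5 * TB:
        return "small"
    if ssed_bytes <= 40 * TB:
        return "midsize"
    return "large"

allocated = {"source_rows": 28 * TB,   # counts as SSED
             "indexes": 30 * TB,       # excluded from SSED
             "summary_tables": 22 * TB,
             "cubes": 20 * TB}
ssed = allocated["source_rows"]

print(sum(allocated.values()) // TB, "TB allocated on disk")   # 100 TB
print(ssed // TB, "TB SSED ->", warehouse_size_band(ssed))     # 28 TB -> midsize
```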

Note 6
Research Methodology for This Update

Gartner uses multiple inputs to establish the positions and scoring of vendors in our Magic Quadrants. These are adjusted to account for maturity in a given market, market size and other factors. For this update of the Magic Quadrant, the following sources of information were used:

  • Original Gartner published research, often utilizing our market share forecasts to establish the breadth and size of a market.
  • Publicly available data, such as earnings statements, partnership announcements, product announcements and published customer cases.
  • Gartner inquiry data collected from over 16,000 inquiries conducted by the authors and the wider analyst community within Gartner over the previous 20 months, covering inputs such as use cases, issues encountered, license and support pricing, and implementation plans.
  • RFI surveys issued to the vendors, in which they were asked to provide specifics about versions, release dates, customer counts, distribution of customers worldwide and other data points. Vendors could refuse to provide any information in this survey, at their discretion.
  • Customer reference surveys (with almost 300 new responses added this year to four prior years of survey data). Vendors were asked to identify a minimum number of references, and Gartner augmented the vendor-provided population by adding Gartner inquiry client contacts as potential respondents. Responses were voluntary for all participants. These surveys included questions to validate customers as current license holders and (especially in the case of open-source utilization) to validate the size and scope of their implementations. Customers were also asked about issues and software bugs, overall and specific sentiments about their experience of the vendor, the use of other software tools in the environment, the types of data involved, the rate of data refresh or load, and their deployment plans. Historical survey responses were used to identify trending only; current-year responses were used for all commentary.
  • Gartner customer engagements, in which we provide specific support, were aggregated and anonymized to add additional perspective to the other, more expansive research approaches.

It is important to note that this is qualitative research and, as such, forms a cumulative base on which to form the opinions expressed in this Magic Quadrant.

Evaluation Criteria Definitions

Ability to Execute

Product/Service: Core goods and services offered by the vendor for the defined market. This includes current product/service capabilities, quality, feature sets, skills and so on, whether offered natively or through OEM agreements/partnerships as defined in the market definition and detailed in the subcriteria.

Overall Viability: Viability includes an assessment of the overall organization's financial health, the financial and practical success of the business unit, and the likelihood that the individual business unit will continue investing in the product, will continue offering the product and will advance the state of the art within the organization's portfolio of products.

Sales Execution/Pricing: The vendor's capabilities in all presales activities and the structure that supports them. This includes deal management, pricing and negotiation, presales support, and the overall effectiveness of the sales channel.

Market Responsiveness/Record: Ability to respond, change direction, be flexible and achieve competitive success as opportunities develop, competitors act, customer needs evolve and market dynamics change. This criterion also considers the vendor's history of responsiveness.

Marketing Execution: The clarity, quality, creativity and efficacy of programs designed to deliver the organization's message to influence the market, promote the brand and business, increase awareness of the products, and establish a positive identification with the product/brand and organization in the minds of buyers. This "mind share" can be driven by a combination of publicity, promotional initiatives, thought leadership, word of mouth and sales activities.

Customer Experience: Relationships, products and services/programs that enable clients to be successful with the products evaluated. Specifically, this includes the ways customers receive technical support or account support. This can also include ancillary tools, customer support programs (and the quality thereof), availability of user groups, service-level agreements and so on.

Operations: The ability of the organization to meet its goals and commitments. Factors include the quality of the organizational structure, including skills, experiences, programs, systems and other vehicles that enable the organization to operate effectively and efficiently on an ongoing basis.

Completeness of Vision

Market Understanding: Ability of the vendor to understand buyers' wants and needs and to translate those into products and services. Vendors that show the highest degree of vision listen to and understand buyers' wants and needs, and can shape or enhance those with their added vision.

Marketing Strategy: A clear, differentiated set of messages consistently communicated throughout the organization and externalized through the website, advertising, customer programs and positioning statements.

Sales Strategy: The strategy for selling products that uses the appropriate network of direct and indirect sales, marketing, service, and communication affiliates that extend the scope and depth of market reach, skills, expertise, technologies, services and the customer base.

Offering (Product) Strategy: The vendor's approach to product development and delivery that emphasizes differentiation, functionality, methodology and feature sets as they map to current and future requirements.

Business Model: The soundness and logic of the vendor's underlying business proposition.

Vertical/Industry Strategy: The vendor's strategy to direct resources, skills and offerings to meet the specific needs of individual market segments, including vertical markets.

Innovation: Direct, related, complementary and synergistic layouts of resources, expertise or capital for investment, consolidation, defensive or pre-emptive purposes.

Geographic Strategy: The vendor's strategy to direct resources, skills and offerings to meet the specific needs of geographies outside the "home" or native geography, either directly or through partners, channels and subsidiaries as appropriate for that geography and market.