The data brokerage market is highly fragmented. Gartner estimates there may be up to 5,000 data brokers worldwide, plus nearly 10 million open datasets published by government agencies and non-governmental organizations (NGOs).
In most countries, this market is not regulated and not homogeneous; rather, it is composed of small, narrowly segmented submarkets.
It is also a very dynamic market. New companies often enter it, others leave, and mergers and acquisitions occur. Some organizations that are not data brokers per se are recognizing that their own data can bring in revenue, so they are either licensing their raw data or sharing derived insights with their partners, as if they were data brokers.
“All these factors make it challenging for organizations to understand the offerings available in this market,” said Valerie Logan, research director at Gartner, at the Gartner Business Intelligence & Analytics Summit in India this week. “You need to start your evaluation efforts by taking initial steps toward understanding data brokers and what they provide.”
A data broker is a business that aggregates information from a variety of sources; processes it to enrich, cleanse or analyze it; and licenses it to other organizations. Data brokers are sometimes also called information brokers, syndicated data brokers or information product companies.
Gartner predicts that by 2019, 75 percent of analytics solutions will incorporate 10 or more data sources from second-party partners or third-party providers.
A high-level taxonomy
Data brokers can be categorized by the type of data provided. For example, it can be consumer data (about individuals such as name and phone number), commercial data (about companies), scientific and technical data (such as weather, natural events or drug studies), real estate information (about homes, farms or commercial properties) or geolocation data (such as global positioning, streets and traffic conditions).
Data brokers tend to specialize in certain industries in order to gain a competitive advantage, and they often target specific functions within organizations, such as sales, finance or human resources.
Brokers can also be categorized by the level of insight provided:
- Simple data services are the most common. Data brokers collect data from multiple sources and offer it in consolidated, conditioned form; the underlying data would otherwise be fragmented, conflicting and sometimes unreliable.
- Smart data services provide conditioned and calculated data, with analytical rules and calculations applied to derive further insight from the collected data and aid the decision-making process.
- Adaptive data services are the most advanced form. They apply analysis to a customer’s request-specific data combined with data held in a context store.
Pricing models used by data brokers include:
- Free: Everything is accessible without a fee
- Freemium: Part of the offering is free; the rest requires payment
- Paid: Everything is available only after the customer pays for it, either by subscription, pay per use, or a combination of the two models
Gartner predicts extensive consolidation in the data brokerage market by 2020. As the market evolves, organizations’ demands will become more sophisticated, which will lead some providers to enhance their quality and cause others to lose market share.
If your data initiative is long-term, develop a multiyear relationship with your data providers, checking their financial viability and references. If it is short-term, create contingency plans to minimize the impact of a change of provider.
Gartner clients can learn more in the report “Understand the Data Brokerage Market Before Choosing a Provider”.
Analysts discussed data and analytics trends at the Gartner Business Intelligence, Analytics & Information Management Summit 2016 taking place in Mumbai, India this week. You can follow news and updates from the event on Twitter using #GartnerBI.