Trend No. 4: Graph analytics
Business users are asking increasingly complex questions across structured and unstructured data, often blending data from multiple applications and, increasingly, external sources. Analyzing data of this complexity at scale is not practical, and in some cases not possible, with traditional query tools or query languages such as SQL.
Graph analytics is a set of analytic techniques that shows how entities such as people, places and things are related to each other. Applications of the technology range from fraud detection, traffic route optimization and social network analysis to genome research. Gartner predicts that the application of graph processing and graph databases will grow at 100% annually over the next few years to accelerate data preparation and enable more complex and adaptive data science.
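To make the idea concrete, here is a minimal sketch in Python using the open-source networkx library. The accounts, devices and the fraud-style question are hypothetical; the point is the relationship traversal that graph techniques make cheap, which is awkward to express as SQL joins.

```python
import networkx as nx

# Build a small graph of hypothetical accounts and the devices they use.
G = nx.Graph()
G.add_edges_from([
    ("account_A", "device_1"),
    ("account_B", "device_1"),   # shares a device with account_A
    ("account_B", "device_2"),
    ("account_C", "device_3"),
])

# A simple fraud-style question: which accounts are connected to a
# flagged account through any chain of shared devices?
flagged = "account_A"
related = nx.node_connected_component(G, flagged) - {flagged}
print(sorted(n for n in related if n.startswith("account")))
# ['account_B']  -- account_C has no path to the flagged account
```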
Trend No. 5: Commercial AI and machine learning
Open-source platforms currently dominate artificial intelligence (AI) and machine learning and have been the primary source of innovation in algorithms and development environments. Commercial vendors were slow to respond, but now provide connectors into the open-source ecosystem. They also offer enterprise features necessary to scale AI and ML, such as project and model management, reuse, transparency and integration — capabilities that open-source platforms currently lack. Increased use of commercial AI and ML will help to accelerate the deployment of models in production, which will drive business value from these investments.
Trend No. 6: Data fabric
Deriving value from analytics investments depends on having an agile and trusted data fabric. A data fabric is generally a custom-made design that provides reusable data services, pipelines, semantic tiers or APIs via a combination of data integration approaches in an orchestrated fashion. It enables frictionless access and sharing of data in a distributed data environment.
Trend No. 7: Explainable AI
Explainable AI is the set of capabilities that describes a model, highlights its strengths and weaknesses, predicts its likely behavior and identifies potential biases. It increases the transparency and trustworthiness of AI solutions and outcomes, reducing regulatory and reputational risk. Without an acceptable explanation, autogenerated insights or “black-box” approaches to AI can cause concerns about regulation, reputation, accountability and model bias.
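As one illustration, permutation feature importance is a common model-agnostic explainability technique: shuffle one feature at a time and measure how much the model's score degrades. Below is a minimal sketch using scikit-learn on synthetic data; the feature names are hypothetical.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Synthetic stand-in for a real dataset; feature names are hypothetical.
X, y = make_classification(n_samples=500, n_features=4, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)

# Shuffle each feature and measure the drop in accuracy -- a
# model-agnostic view of which inputs the model actually relies on.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, score in zip(["age", "income", "tenure", "region"],
                       result.importances_mean):
    print(f"{name}: {score:.3f}")
```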
Trend No. 8: Blockchain in data and analytics
Blockchain technologies address two challenges in data and analytics. First, blockchain provides the lineage of assets and transactions. Second, it provides transparency for complex networks of participants. However, blockchain is not a stand-alone data store, and it has limited data management capabilities. Because a blockchain-based system cannot serve as a system of record, adopting one entails a huge integration effort across data, applications and business processes. Realistically, the technology has not yet matured to real-world, production-level scalability for use cases beyond cryptocurrency.
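The lineage and transparency properties come from hash chaining: each block commits to the hash of its predecessor, so altering any earlier record breaks every later link. The toy Python sketch below illustrates only that data structure, with hypothetical parties and assets, not a production ledger.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents, including the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# A toy chain recording the lineage of an asset between hypothetical parties.
chain = []
prev = "0" * 64  # genesis placeholder
for tx in [{"asset": "X", "from": "alice", "to": "bob"},
           {"asset": "X", "from": "bob", "to": "carol"}]:
    block = {"tx": tx, "prev_hash": prev}
    prev = block_hash(block)
    chain.append(block)

# Tampering with an earlier transaction breaks every later link.
chain[0]["tx"]["to"] = "mallory"
print(block_hash(chain[0]) == chain[1]["prev_hash"])  # False
```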
Trend No. 9: Continuous intelligence
Organizations have long sought real-time intelligence, and systems exist that deliver it for a limited set of tasks. Now it is finally practical to implement such systems, which Gartner calls continuous intelligence, on a much broader scale because of the cloud, advances in streaming software and the growth of data from sensors in the Internet of Things (IoT). By 2022, more than half of major new business systems will incorporate continuous intelligence that uses real-time context data to improve decisions.
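A minimal Python sketch of the streaming pattern behind continuous intelligence follows: a sliding window over incoming readings feeds a decision rule as data arrives. The sensor feed and threshold are hypothetical; in production the feed would come from a message stream rather than a list.

```python
from collections import deque

def monitor(readings, window=5, threshold=75.0):
    """Maintain a sliding-window average over a stream of sensor
    readings and emit an alert whenever it crosses a threshold."""
    window_buf = deque(maxlen=window)
    for value in readings:
        window_buf.append(value)
        avg = sum(window_buf) / len(window_buf)
        if avg > threshold:
            yield f"alert: rolling average {avg:.1f} exceeds {threshold}"

# Hypothetical temperature feed standing in for a live event stream.
feed = [70, 72, 74, 78, 80, 82, 79]
for event in monitor(feed):
    print(event)
```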
Trend No. 10: Persistent memory servers
Most database management systems (DBMS) make use of in-memory database structures, but with data volumes growing rapidly, memory size can be restrictive. New server workloads are demanding not just faster processor performance, but also massive memory and faster storage. Persistent memory technology will help businesses extract more actionable insights from data. Many DBMS vendors are experimenting with persistent memory, although it may take several years to modify their software to take advantage of it.