Analyst(s): Massimo Pezzini, Donald Feinberg, Nigel Rayner, Roxane Edjlali
Hybrid transaction/analytical processing supports digital strategies by "breaking the wall" between transaction processing and analytics. CTOs should identify which HTAP style (in-process or point-of-decision) fits best with their organization's business goals, use cases, risk profile and skills.
The separation between transaction processing and analytics imposed by traditional application architectures does not support the demands of digital business, in particular the need to respond in real time to business moments.
New technologies such as in-memory computing (IMC) offer the promise of enabling true real-time analytics against transactional data — so-called hybrid transaction/analytical processing (HTAP) — but many organizations are challenged to find business value in initial deployments.
Most traditional approaches deploy analytics outside the transactional process, meaning analytics is not guiding and optimizing process execution in real time.
Endorse HTAP architectures to enable dramatic digital innovation and more informed and "in business real time" decision making.
Be aware of the risks of adopting a new paradigm that could have a disruptive impact on established applications.
Focus initially on building knowledge and delivering tactical "quick win" benefits via point-of-decision HTAP-style systems (where distinct transactional and analytical applications share the same data infrastructure).
Venture into the in-process HTAP style (where transactional processes are guided "in the moment" by real-time analytics) to deliver transformational benefits, but only when you have defined suitable use cases and your IT teams have become familiar with the relevant design patterns and enabling IMC technologies.
Plan for a gradual uptake of the two HTAP styles for custom applications on the basis of your organization's business requirements, risk profile, and available business and technical skills.
By 2018, at least 75% of HTAP projects will adopt the less disruptive "point of decision" instead of the more powerful "in-process" approach.
Traditional application architectures separated transactional and analytical systems. Transactional applications generated data that was later dumped in a data warehouse for analysis — hopefully before it got stale. Complicated queries didn't slow down order entry at the end of the month or quarter, because the analytical processing was offloaded from the transactional system. This architecture created a "wall" between the operational systems (e.g., ERP, core banking systems, travel reservation systems) designed for efficient transaction processing and the analytical environment (data warehouses, data marts and, most recently, data lakes) designed for efficient reporting and analytics. The wall introduced delays in, and limited accuracy of, decision-making processes, because they could only be based on "after the fact" analysis.
Digital business, and the need to respond to business moments, means that using "after the fact" analysis is no longer adequate. Business moments are transient opportunities that must be exploited in real time. If an organization is unable to recognize and/or respond quickly to a business moment by taking fast and well-informed decisions, then some other organization will, resulting in a missed opportunity (or a new business threat).
Hybrid transaction/analytical processing (HTAP) — at times referred to as hybrid online analytical processing (OLAP)/online transaction processing (OLTP), or hybrid OLTP and analytics — is an emerging application architecture that "breaks the wall" between transaction processing and analytics. It enables more informed and "in business real time" decision making. HTAP is defined as "an application architecture whereby concurrent analytical and transaction processing algorithms share the same data (and data infrastructure)" (see Note 1 and "Hybrid Transaction/Analytical Processing Will Foster Opportunities for Dramatic Business Innovation" ). Therefore, HTAP allows advanced analytics to be run in real time on "in flight" transaction data, providing an architecture that empowers users to respond more effectively to business moments.
HTAP has a potentially strategic and transformational impact by redefining the way some business processes are designed and executed, as advanced real-time analytics (e.g., planning, forecasting and what-if analysis) becomes an integral part of the transactional business process itself, rather than a separate activity performed after the fact.
HTAP can also be used to achieve tactical benefits by providing decision makers with a real-time picture of their established business processes (for example, an instant snapshot of their sales pipeline by geography and product line).
Gartner has identified two different styles of HTAP:
Point-of-decision HTAP, whereby the transaction processing and analytics aspects are respectively segregated into distinct, independently designed applications. This allows advanced analytics to be performed on "live" transactional data, something that is very hard to achieve in traditional architectures.
In-process HTAP, whereby real-time analytics and transaction processing techniques are woven together in the same application to guide and optimize the execution of transactional processes.
Although certain types of HTAP applications can be implemented using traditional data management technologies, HTAP is most effective when supported by IMC technologies, such as in-memory DBMSs (IMDBMSs) and, to a lesser extent, in-memory data grids (IMDGs). By removing the performance and scalability limitations that historically prevented advanced analytics from running concurrently with transaction processing, IMC technologies make it possible to implement even the most sophisticated forms of HTAP (see "Market Guide for In-Memory Computing Technologies" ).
However, HTAP adoption has its challenges: it implies a new way of thinking about applications and processes; it requires the adoption of still unfamiliar IMC technologies; and it may also require radical re-engineering, or even redesign, of existing processes and applications.
This research will help CTOs — as well as other IT leaders in charge of defining real-time analytics strategies (enterprise architecture managers, analytics leaders, application development managers and information management leaders) — to identify the key benefits and challenges of the different styles of HTAP and how to use them for new applications to achieve greater business effectiveness.
The research focuses primarily on the key architectural principles and on HTAP for in-house-developed applications. Organizations will also move to HTAP by adopting packaged applications/SaaS based on HTAP principles. Major providers (such as Oracle, SAP and Workday) already have HTAP applications in the market, with more to follow.
CTOs should therefore consider that HTAP adoption via packaged applications may be based on different drivers (for example, vendors' push to adopt new HTAP-based versions of their products, new HTAP-enabled add-ons to established applications, and the emergence of innovative providers that have adopted HTAP architecture) and may follow different paths than those discussed in this research.
In a point-of-decision HTAP architecture, the transaction processing and analytics aspects are respectively segregated into distinct, independently designed applications that do not influence each other. This means that:
The outcome of analytical algorithms does not have a direct impact on the execution of the transactional processes
The transactional processes cannot trigger the execution of analytical algorithms
In practice, the transaction processing and analytical aspects of the HTAP architecture are independently implemented, possibly by different development teams and at different times (see Figure 1).
HTAP = hybrid transaction/analytical processing
Source: Gartner (June 2016)
For example, a sales pipeline analytics application is added to an established ERP application to provide sales managers with real-time visibility into aggregated revenue from opportunities according to sales stages and time periods.
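The point-of-decision pattern in this example can be sketched in a few lines of Python, using SQLite's in-memory mode as a stand-in for an in-memory DBMS. The table, data and query are illustrative assumptions, not taken from any specific product: the transactional side writes opportunities as they are entered, and a separately built analytical query reads the same live data.

```python
import sqlite3

# Shared in-memory data infrastructure (SQLite as an IMDBMS stand-in)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE opportunities (region TEXT, stage TEXT, amount REAL)")

# Transactional side: opportunities recorded as they happen
db.executemany(
    "INSERT INTO opportunities VALUES (?, ?, ?)",
    [("EMEA", "proposal", 120000.0),
     ("EMEA", "closed", 80000.0),
     ("APAC", "proposal", 50000.0)],
)
db.commit()

# Analytical side: a real-time pipeline snapshot over the same live data,
# built independently of the transactional application
pipeline = db.execute(
    "SELECT region, stage, SUM(amount) FROM opportunities "
    "GROUP BY region, stage ORDER BY region, stage"
).fetchall()
print(pipeline)
```

The two sides never call each other: the analytics neither triggers nor is triggered by the transactional writes, which is what distinguishes this style from in-process HTAP.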
Point-of-decision HTAP architectures can improve business leaders' situation awareness in operations, and provide constantly updated forecasts and simulations of future business outcomes.
Often, the business value of point-of-decision HTAP architectures is tactical, as it adds faster decision making on top of the established business processes, typically at the point when a person needs to make a business decision. Sometimes, however, the business value can be strategic because the impact of the decision itself could be at the strategic level (for example, a pricing change that has a significant positive impact on the profitability of the organization in the current quarter). Point-of-decision HTAP can provide real-time visibility into costs, financial positions, shipments, production data, sales opportunities, sales orders, receivables, payables, marketing campaigns, sales promotions, inventory levels, and other key performance indicators (KPIs). In traditional architectures, such information is only available hours, days or even weeks after the event.
The "in the moment" availability of these KPIs gives business leaders greater situation awareness and enables them to promptly take corrective actions should the indicators point in the wrong direction. For example, supply chain planners can more quickly address supply disruptions or unexpected fluctuations in demand for certain products: if they realize that the company is likely to sell fewer products than anticipated, they can cancel orders to suppliers early on, thus preventing excess inventory and limiting order cancellation penalties.
Clearly, getting these KPIs in a matter of seconds or minutes, or receiving real-time notifications of anomalies or undesired behaviors, can deliver huge business benefits via faster and better-informed decision making. However, it doesn't link the decisions directly to the execution of the business process. The sales, procurement, supply chain, loan origination, claim management or service provisioning business processes (essentially driven by transaction processing applications) still require off-application, human intervention to take the impact of the decision into account.
Implementing point-of-decision HTAP is not easy, either architecturally or technologically. Industry experience is limited, the required skills are still hard to find, and it implies the adoption of IMC technologies, which are still primarily used by leading-edge organizations.
At the same time, the point-of-decision HTAP style is the most approachable by mainstream organizations because:
The established transactional applications (both in-house-developed and packaged) usually don't need to be massively re-engineered. In the worst case, they must be migrated on top of an IMDBMS or IMDG. In the best case, it is sufficient to turn on the IMC capabilities of the DBMS they are built on.
Multiple analytical applications can be implemented in an incremental fashion by starting from simple reporting or dashboards, and later moving to more advanced forms of analytics.
Action items for CTOs:
Demonstrate the business value of point-of-decision HTAP by delivering relatively simple, but impactful, analytical applications that give business leaders real-time visibility into KPIs they can act upon.
Work with business leaders and key domain users to identify the business metrics, KPIs or business event notifications that would deliver the greatest value if the latency of their availability was reduced to minutes or seconds.
Address more challenging initiatives as your organization moves along the business and technical learning curve (business leaders, too, need to understand the opportunities associated with "business real time" decision making enabled by HTAP).
In in-process HTAP, analytical and transaction processing techniques are woven together — in the context of a given application process — so that the outcome of real-time analytics directly influences the execution of the transactional business process itself. In-process HTAP applications incorporate various forms of IMC-based analytics, including predictive and prescriptive analytics, to automatically or semiautomatically (by providing options, for example) guide and optimize the execution of transactional processes.
Unlike point-of-decision HTAP, in an in-process HTAP application the transaction and analytical processing dimensions are designed together within the scope of a single project and are delivered to users as a single application, in which transaction processing and analytics functions are designed to interplay (see Figure 2).
HTAP = hybrid transaction/analytical processing
Source: Gartner (June 2016)
Typically, in-process HTAP applications support business users who must perform certain tasks (such as originate loans, fulfill orders, schedule deliveries, reprice products and services, process receivables or payables, close financial books, process claims) that also include making certain decisions.
For example, a fulfillment manager may need to decide, while in the midst of the fulfillment process, how to prioritize the scheduling of a number of purchase orders. An in-process HTAP application would, at the very least, provide some sort of context-aware, automated recommendation based on a real-time analysis of the impact of the different options on inventory levels, profitability and customer satisfaction. Ultimately, in-process HTAP applications could execute the optimal transaction outcome based on the real-time analysis without human intervention.
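A minimal sketch of this inline interplay, in Python. The scoring formula, field names and weights below are illustrative assumptions, not a prescribed design; the point is that the analytical step runs inside the transactional flow and its result directly determines the transaction's outcome (the order priority).

```python
def score(order, inventory):
    # Analytical step: weigh margin and customer value against stock risk,
    # evaluated on live data while the fulfillment transaction is in flight.
    stock_penalty = 0 if inventory.get(order["sku"], 0) >= order["qty"] else 50
    return order["margin"] * order["customer_value"] - stock_penalty

def prioritize(orders, inventory):
    # Transactional step: the analytics outcome drives execution order,
    # with no separate, after-the-fact analytical activity.
    return sorted(orders, key=lambda o: score(o, inventory), reverse=True)

inventory = {"A": 100, "B": 0}
orders = [
    {"id": 1, "sku": "A", "qty": 10, "margin": 0.2, "customer_value": 90},
    {"id": 2, "sku": "B", "qty": 5, "margin": 0.4, "customer_value": 80},
]
ranked = prioritize(orders, inventory)
print([o["id"] for o in ranked])
```

In a semiautomated variant, the ranked list would be surfaced to the fulfillment manager as a recommendation; in a fully automated one, the top-ranked order would be scheduled without human intervention.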
The in-process HTAP architecture provides the same benefits as point-of-decision HTAP in terms of real-time visibility and situation awareness. However, its impact is potentially profound, as injecting real-time advanced analytics into transactional processes can completely transform the way business processes are usually shaped. In-process HTAP makes it possible for business leaders to perform, in the context of operational processes, much more advanced and sophisticated real-time analysis of their business data than with traditional architectures. Large volumes of complex business data can be analyzed as the processes unfold, thus:
Enabling business users to make more informed decisions in real time without requiring distinct analytical activity
Opening the possibility to drive decisions without human intervention by leveraging context brokering, machine learning, artificial intelligence and other advanced techniques
An aggressive use of in-process HTAP architectures may help to establish sustainable competitive advantage (see the Avanza Bank case study below), and in some cases can support the implementation of new business models (see the DeltaDNA case study ).
Despite its potential for dramatic business innovation, in-process HTAP is still in the very early stages of its life cycle. The challenges associated with its adoption include the following:
Industry experience is still limited to the most leading-edge organizations. Best practices have not yet crystallized, and the required skills are unusual (application architects, analysts and developers who master both transaction processing and analytics).
Retrofitting existing applications for in-process HTAP requires massive, expensive and risky re-engineering efforts.
IMC technology to support in-process HTAP is widely available, but it has only been proven in a limited number of large-scale initiatives in a few selected verticals (e.g., financial services, telecommunications, gaming and online advertising).
Action items for CTOs:
Run a proof of concept to familiarize yourself with the in-process HTAP-enabling technologies (e.g., IMC, predictive analytics), design patterns and skills requirements.
Adopt in-process HTAP for "system of innovation" projects focused on achieving business differentiation and competitive advantage.
Explore with business leaders and key domain users opportunities for greater business efficiency and effectiveness through the automated execution of the optimal transaction outcome, driven by real-time analytics.
The two styles of HTAP enable incremental as well as dramatic business innovation. However, IT leaders must carefully plan for their adoption, given the associated architectural, technology, skills and cost challenges.
Not every organization has the risk profile, tolerance to failures, and business and technical skills needed to tackle the powerful but challenging in-process HTAP. A much larger number of organizations are already equipped to take advantage of the less complex, but more tactically oriented, point-of-decision approach (see Note 2). Therefore, many organizations will undertake in-process HTAP only after having successfully delivered business value via a few point-of-decision HTAP projects (see "Predicts 2016: In-Memory Computing-Enabled Hybrid Transaction/Analytical Processing Supports Dramatic Digital Business Innovation" ).
This approach will enable CTOs to build HTAP and IMC competences and skills, better crystallize the pros and cons of the two approaches, define technology recommendations, collect the evidence needed to build the business case for strategic in-process HTAP-enabled initiatives, and develop guidelines for the adoption of the two HTAP flavors on a project-by-project basis. These guidelines should consider the key characteristics of the two styles of HTAP in terms of potential business impact, project tolerance to risk, required technology investments and impact on established applications (see Table 1).
Table 1. Key Characteristics of the Two HTAP Styles

Point-of-decision HTAP
Description: Transaction processing and analytics are respectively segregated into independently designed applications
Potential business impact: Tactical or strategic
Impact on established applications: Minor or major re-engineering

In-process HTAP
Description: Real-time analytics and transaction processing techniques are woven together in the same application to guide and optimize the execution of transactional processes
Impact on established applications: Major re-engineering or full redesign
Source: Gartner (June 2016)
Point-of-decision HTAP is likely to be suitable for projects expected to have tactical business impact, low-to-moderate risk tolerance, moderate technology investments (for example, upgrading to an in-memory-enabled version of the established DBMS environment) and requirements for minor re-engineering of established applications.
In-process HTAP, which typically requires high technology investments (for example, buying brand-new IMC technologies that also require significant retraining or hiring of new skills) and major re-engineering or redesign of established applications, is better justified for projects, often "greenfield" initiatives, with strategic impact and high tolerance to risk.
Action items for CTOs:
Contact innovative technology and service providers that emphasize HTAP in their value proposition to discuss the concrete business outcomes obtained by their clients.
Familiarize yourself with the benefits and challenges associated with HTAP and IMC by implementing a few "low-hanging fruit" point-of-decision HTAP applications.
Capitalize on the evidence of the business value from these initial projects to build the business case for the use of more advanced forms of analytics, and for the most challenging in-process HTAP.
When you have accumulated good knowledge of the relevant enabling technologies, identify an experimental project for in-process HTAP in the context of high-risk/high-reward business initiatives that can tolerate some risk of failure and are expected to pay for themselves rapidly (within two to three years).
Develop guidelines for the adoption of the two HTAP styles on the basis of the requirements of specific initiatives, taking into consideration the characteristics summarized in Table 1 .
In technology and application purchases, favor providers that leverage IMC technologies. They are in a better position to deliver business innovation via HTAP than providers that still use conventional approaches.
Plan for a long-term coexistence of in-process and point-of-decision HTAP with traditional architectures. A traditional approach often satisfactorily meets business requirements, while migration to an HTAP architecture sometimes cannot be justified and is not always desirable or even possible.
Avanza Bank of Sweden has been in business since the early 2000s and serves more than 500,000 clients, to whom it provides retail banking and trading services via web, mobile and contact center channels. As the bank grew its business, its original application architecture increasingly showed limitations in its ability to serve the fast-growing workload and to enable rapid introduction of new services. Thus, in the course of 2011, Avanza decided to redesign its core banking applications on the basis of an all-in-memory, scale-out approach.
The new application system was completely developed in-house by a team of 35 Java developers and went live in the second half of 2013. Powered by GigaSpaces' XAP IMDG-based platform, the application is deployed on a private cloud of commodity x86 servers replicated in two data centers for disaster recovery. As a benchmark, the new architecture demonstrated the ability to support at least 100,000 transactions per second (tps). Moreover, thanks to the new applications' highly modular architecture, and to the adoption of agile application development methods, the bank has dramatically accelerated delivery of innovation to its clients — a new version of the application is deployed every eight working days.
Innovations for clients include several HTAP-enabled services, such as customer self-service analytics and a real-time risk management capability that enables Avanza to provide its clients with highly customized terms and conditions. These IMC-enabled HTAP applications have helped Avanza differentiate itself from its competitors, win new clients and establish competitive advantage, as no other Swedish bank has been able to match its innovative services thus far.
DeltaDNA is a company that provides a deep data analytics and real-time marketing platform for mobile, console and PC games. The platform allows game developers and publishers to understand their players, seeing how they interact with the game, and to engage with them live as they play. The ability to provide analytical capabilities for game developers and publishers as players interact with the game constitutes a point-of-decision HTAP scenario. Data gathered can be used in conjunction with a range of personalization and targeted marketing tools to create gaming experiences that are tailored to each user.
DeltaDNA, founded five years ago, has over 2,000 clients composed of software companies that develop mobile and online games, social gaming and real money gambling products. The platform tracks more than 30,000 events per second. DeltaDNA provides developers with big data business intelligence that delivers actionable analytics through analysis of player behavior, adapting the game parameters for player segments and offering incentives in real time.
Effectively improving key game metrics like retention, engagement and monetization first requires the collection of vast amounts of data, enabling marketers, data scientists and developers to act upon the insights subsequently generated. The platform can take the data generated by the player and respond in milliseconds via in-game messages or game balancing, depending on the parameters set by the user.
The deltaDNA platform is cloud-based and offers over 900 metrics to track player behavior. Real-time analysis can also be used for A/B testing, leveraging standard and custom events to create funnels and reporting.
Detailed event data from the games on the platform is stored for historical analysis and segmentation of players. This data can then be used to set up real-time triggers to customize the gaming experience. Engage calls can also use, for example, predictive models to propose the next-best action to keep the player engaged, recommending a mission or inventory in the store. The ability to automate the gaming experience based on real-time gaming data constitutes an in-process HTAP use case.
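The real-time trigger idea described above can be sketched generically in Python. All names below (segments, actions, event types) are hypothetical illustrations, not deltaDNA's actual API: incoming game events are matched against player segments fed back from historical analysis, and a trigger returns an engagement action while the player is still in the session.

```python
# Next-best actions per segment (illustrative values, not a real catalog)
SEGMENT_ACTIONS = {
    "at_risk": "offer_bonus_mission",
    "big_spender": "show_store_inventory",
}

# Player segmentation fed back from historical pattern analysis
player_segments = {"p1": "at_risk", "p2": "big_spender"}

def on_event(player_id, event):
    # Real-time path: decide the next-best action in-session,
    # in milliseconds, while the player is playing
    if event == "session_start":
        segment = player_segments.get(player_id)
        return SEGMENT_ACTIONS.get(segment, "no_action")
    return "no_action"

print(on_event("p1", "session_start"))
print(on_event("p3", "session_start"))
```

The historical, columnar analysis that produces the segments runs separately; only its result (the segmentation table) sits on the real-time path, which keeps the in-session decision fast.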
Real-time data collection and interactive analysis of the data are performed on VoltDB, an in-memory DBMS. Historical analysis of data uses HPE Vertica, a data warehouse DBMS technology. User segments, resulting from the pattern analysis of the gaming data, are fed back to VoltDB by loading a player segmentation table. MongoDB, a NoSQL DBMS, is used as the granular data persistency store of all data behind HPE Vertica. Data stored in MongoDB is used for historical analysis over longer time periods.
In the free-to-play economy, it is vital to maximize player retention and lifetime value. Using deltaDNA, game publishers can make environments that are responsive to all types of players, creating great experiences for everyone.
In an HTAP-style system, the transactional and analytical dimensions may work on a centralized or distributed set of data. For technical reasons, they may actually process different copies of this data that must be synchronized in real time. For example, changes in operational data in an "on disk" row store are replicated in real time into an in-memory column store, as supported by traditional DBMSs such as IBM DB2 with BLU Acceleration or the Oracle Database In-Memory option.
The essential point is that, for all practical purposes, every change in the data performed by the transactional dimension has an immediate impact on the outcome of the analytical side of the system, as if the two sides of the HTAP systems shared the same data infrastructure.
Of course, to perform analytics, HTAP applications may need access to data other than that shared with the transactional environment; for example, to gather more context or to add a historical perspective. This external data can be in either a different transactional data store or an analytical data store. The essential point is that the primary data for analysis is readily available, as it is shared with the transactional environment.
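The synchronization idea in Note 1 can be sketched as follows. The row store and column store below are toy in-process stand-ins (an assumption for illustration, not a vendor implementation): every transactional change to the row store is applied to the columnar copy within the same operation, so a columnar scan always reflects the latest committed transaction.

```python
rows = []  # "on disk" row store stand-in
columns = {"product": [], "qty": []}  # in-memory column store stand-in

def insert(product, qty):
    # Transactional write into the row store...
    rows.append({"product": product, "qty": qty})
    # ...replicated into the column store as part of the same operation,
    # so analytics sees every change immediately
    columns["product"].append(product)
    columns["qty"].append(qty)

insert("widget", 3)
insert("gadget", 5)
insert("widget", 2)

# Columnar scan over "in flight" transactional data
total_widgets = sum(q for p, q in zip(columns["product"], columns["qty"])
                    if p == "widget")
print(total_widgets)
```

For all practical purposes the two stores behave as one shared data infrastructure, which is the defining property of HTAP regardless of how many physical copies exist.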
HTAP-enabled, real-time analytics works on fast-changing data "at rest" (e.g., sales orders being entered in CRM systems, deposits/withdrawals in core banking systems, flight tickets being booked) rather than on data "on the move" (e.g., social feeds, web clickstreams, market data feeds and sensor data) like in event stream processing. HTAP, stream processing and other techniques complement each other to assist decision making.
Most organizations will likely prefer to initially focus on delivering incremental, tactical business value by first implementing relatively simple, but high-business-impact, point-of-decision HTAP applications. This will typically require retrofitting existing transactional applications for IMC as a preliminary step (for example, by migrating a packaged application to an IMC-enabled version). However, such retrofitting requires effort and cost that often must be justified above and beyond the benefits that could potentially be delivered by (still to be developed) point-of-decision HTAP applications. Those additional benefits may include faster execution of established long-running business processes (for example, MRP or financial closures) or greater scalability for externally focused applications such as e-commerce and travel reservation.
By now, a significant number of organizations worldwide have migrated their (typically transactional) SAP Business Suite deployments on top of the SAP Hana IMDBMS. They can therefore leverage the SAP Hana Live technology to implement analytics applications against the operational ERP data stored in their SAP environment, according to a point-of-decision HTAP approach.
Moreover, Oracle has released several point-of-decision HTAP add-on applications for a variety of the company's packaged applications (Oracle E-Business Suite, Financials Cloud, JD Edwards, Siebel and PeopleSoft). Organizations using these applications are in an optimal position to experience the benefits of point-of-decision HTAP. They should bear in mind, however, that these add-ons run optimally on the Oracle Database 12c In-Memory option, so an upgrade of their Oracle applications' DBMS environment may be required.