CIOs need to understand the disruptive power of quantum computing and potential applications in AI, machine learning, and data science.
What happens when a computer can break your encryption algorithms nearly as fast as you can create new ones? This is just one of the fears about quantum computing. Is it real, or is it hype?
Quantum computing is currently rising on the Gartner Hype Cycle and could become one of the most impactful disruptions of the modern era. Most people’s eyes glaze over when you try to explain the concept, and yet those in the know fear its power. How can something so misunderstood be poised to affect so many different things across the spectrum of science and computing?
For CIOs, it’s critical to understand the reality of the quantum computing disruption and how it might become a practical platform for computing, machine learning (ML) and artificial intelligence (AI) in the digital age.
“Quantum computing is heavily hyped and evolving at different rates, but it should not be ignored,” says Matthew Brisse, research vice president at Gartner. “It holds great promise, especially in the areas of ML, AI and cryptography. Today’s data scientists, focused on ML, AI and big data analytics, simply cannot address some difficult and complex problems because of the compute limitations of classic computer architectures.
“Some of these problems,” he says, “may take today’s fastest supercomputers months or even years to run through a series of permutations, making it impractical to attempt. Quantum computers have the potential to run massive amounts of calculations in parallel in seconds. This potential for compute acceleration, as well as the ability to address difficult and complex problems, is what is driving so much interest from CEOs in a variety of industries.”
What is quantum computing?
Quantum computing is a type of nonclassical computing that is based on the quantum state of subatomic particles. It is fundamentally different from classic computing, which operates using binary bits: each bit is either 0 or 1, true or false, positive or negative. In quantum computing, the bit is referred to as a quantum bit, or qubit. Unlike the strictly binary bits of classic computing, qubits can represent a 1, a 0, or a superposition of both (partly 0 and partly 1) at the same time.
Superposition is what gives quantum computers speed and parallelism, meaning that these computers could theoretically work on millions of computations at once. Further, qubits can be linked with other qubits in a process called entanglement. When combined with superposition, quantum computers could process a massive number of possible outcomes at the same time.
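To make superposition and entanglement concrete, here is a toy sketch (not from the Gartner report) that models qubit states as lists of complex amplitudes, the standard textbook representation. Measurement probabilities are the squared magnitudes of the amplitudes, and in the entangled Bell state only the agreeing outcomes 00 and 11 are possible:

```python
import math

# A single qubit in equal superposition: amplitudes for |0> and |1>.
qubit = [1 / math.sqrt(2), 1 / math.sqrt(2)]

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = [abs(a) ** 2 for a in qubit]   # roughly [0.5, 0.5]

# Two entangled qubits (a Bell state): amplitudes for |00>, |01>, |10>, |11>.
# Only |00> and |11> have nonzero amplitude, so the measured bits always agree.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
bell_probs = [abs(a) ** 2 for a in bell]  # roughly [0.5, 0, 0, 0.5]
```

The key point for the discussion above: an n-qubit register carries 2ⁿ such amplitudes at once, which is the source of the parallelism quantum algorithms exploit.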
The number of qubits necessary to make a quantum computer viable depends on the problem. For example, Google engineers have said a 49-qubit quantum computer would be able to solve a problem that the world’s largest supercomputer cannot solve today. The ability of a quantum computer to outperform a classical computer is called “quantum supremacy.” While it may sound like a sci-fi dream, experts believe that for a few computing problems, quantum supremacy will be a reality in a matter of years.
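A rough back-of-the-envelope calculation (my illustration, not Google’s figure) shows why roughly 49 qubits marks the edge of classical simulation: each added qubit doubles the number of complex amplitudes a classical machine must track.

```python
# n qubits require tracking 2**n complex amplitudes on a classical simulator.
def amplitudes(n_qubits: int) -> int:
    return 2 ** n_qubits

# At 16 bytes per complex amplitude (two 64-bit floats), fully simulating
# a 49-qubit register would take about 8 pebibytes of memory.
bytes_needed = amplitudes(49) * 16
pebibytes = bytes_needed / 2 ** 50   # 8.0 PiB
```

Adding a 50th qubit doubles that requirement again, which is why exhaustive classical simulation stops being practical around this scale.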
Is encryption at risk?
For example, some scientists have speculated that quantum computing would kill, or at least significantly weaken, cryptography. If true, this would jeopardize any business that relies on encryption. If a sufficiently powerful quantum computer becomes available within 10 or so years, any data that has been published or intercepted is subject to cryptanalysis by a future quantum computer. Most security professionals speculate that quantum computing will eventually render RSA cryptography and ECC useless but will not be able to effectively counter hash, code, lattice-based or multivariate-quadratic-equations cryptography until it has matured. Symmetric key cryptographic systems like Advanced Encryption Standard (AES), SNOW 3G, 3GPP and Kerberos are resistant to a quantum computing attack if they use a large enough key size. The problem is, we don’t exactly know how large a key will be needed in the future.
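The reason symmetric ciphers fare better than RSA and ECC is a standard rule of thumb: Shor’s algorithm factors large numbers efficiently and so breaks RSA/ECC outright, while Grover’s algorithm only speeds up brute-force key search quadratically, effectively halving a symmetric key’s strength. A minimal sketch of that rule (the function name is my own, for illustration):

```python
# Grover's algorithm searches an unstructured space of 2**n keys in roughly
# 2**(n/2) steps, so an n-bit symmetric key offers about n/2 bits of
# post-quantum security. This is why "use a large enough key" is the advice
# for AES and similar ciphers.
def effective_bits_under_grover(key_bits: int) -> int:
    return key_bits // 2

for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}: ~{effective_bits_under_grover(key_bits)} bits of quantum security")
```

Under this heuristic, AES-256 retains roughly 128-bit security against a quantum adversary, which is why larger symmetric keys are the common hedge while post-quantum public-key schemes mature.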
However, it’s important to note that quantum computers will never replace classic computers for general-purpose computing. They are probabilistic rather than deterministic, and they suit only a narrow set of algorithms, including optimization for ML and AI and factoring large numbers for security purposes, for which they could be well suited in the future.
What are the potential applications and impacts?
Despite the hype, today this technology is experimental and nascent. On the Gartner Hype Cycle for Emerging Technologies 2017, quantum computing is climbing the Innovation Trigger phase. It currently offers limited business applications and can only run very specific quantum algorithms. Further, the equipment is expensive and fragile and lacks standardization, with materials and designs varying wildly.
Gartner predicts that quantum computing as a service (QCaaS) will be the predominant method data scientists use to mitigate this risk. Gartner recommends that organizations focus on QCaaS to gain experience with quantum algorithms as they apply to business solutions. Because it’s a new field, everything must be built from the ground up, and it’s difficult to even comprehend the potential of the technology or the problems QCaaS could eventually solve.
However, the potential that quantum computing has for solving problems in ML, AI and big data, where classic computing limits potential, is driving a lot of innovation and growth among data scientists. Investors are putting millions of dollars toward the technology, and more than 50 companies, universities and research companies are working on development.
What are the applications?
According to Gartner, both current and future applications for quantum computing will be narrow and focused. General-purpose quantum computing will never be realized. However, the technology does hold the potential to revolutionize certain industries, including AI, cryptography, and even weather prediction.
For example, billions of IoT devices are producing petabytes of data per second, most of which is discarded because of the storage requirements. A weather prediction model might draw on millions of IoT devices, sensors and external feeds such as satellite imagery and radar, all transmitting continuous data that ideally could be analyzed instantaneously. To do this, all of the information would have to be loaded directly into quantum memory for immediate analysis. This continuous analysis could provide meteorologists with more accurate weather forecasting.
- Machine learning: Improved ML through faster structured prediction. Examples include Boltzmann machines, quantum Boltzmann machines, semi-supervised learning, unsupervised learning and deep learning.
- Artificial intelligence: Faster calculations could improve perception, comprehension, self-awareness and circuit fault diagnosis/binary classifiers.
- Finance: Quantum computing could enable faster, more complex Monte Carlo simulations, for example, in trading, trajectory optimization, market instability, price optimization and hedging strategies.
- Healthcare: DNA gene sequencing and related tasks, such as radiotherapy treatment optimization or brain tumor detection, could be performed in seconds instead of hours or weeks.
- Computer science: Faster multidimensional search functions, for example, query optimization, mathematics and simulations.
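To see why Monte Carlo workloads appear on this list, here is a toy classical baseline (my illustration, not from the report). Classical Monte Carlo error shrinks like 1/√N in the number of samples; quantum amplitude estimation promises the same accuracy with roughly √N samples, a quadratic speedup.

```python
import random

# Classical Monte Carlo: estimate the probability of a simple event by
# sampling. Accuracy improves like 1/sqrt(N), so each extra digit of
# precision costs 100x more samples -- the cost quantum amplitude
# estimation would reduce quadratically.
def estimate_mean(n_samples: int, seed: int = 0) -> float:
    rng = random.Random(seed)
    # Toy "payoff": 1 if a uniform draw exceeds 0.7, else 0 (true mean 0.3).
    hits = sum(1 for _ in range(n_samples) if rng.random() > 0.7)
    return hits / n_samples

estimate = estimate_mean(100_000)   # close to the true value 0.3
```

Real financial models replace the toy payoff with a priced instrument, but the sampling structure, and therefore the potential quantum speedup, is the same.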
Not every CIO needs to worry about quantum computing, but for now, those looking to explore the technology should focus their data scientists on the advancement of quantum algorithms and how they can be applied to solve practical business problems. Quantum programming will require a significant learning curve. Gartner recommends getting ahead of the curve by leveraging QCaaS, GitHub tools and SDKs. Applying quantum algorithms to real-world problems will provide the greatest competitive advantage well into the future.
Gartner clients can read more in the full research report Quantum Computing: A Research Project or a Practical Computing Architecture for Machine Learning? by Matthew Brisse, et al.