
Quantum Intelligence: The dawn of a new computational age

By Simon Forrest, Principal Analyst, Core Technology, Futuresource Consulting

At Futuresource, we’re continually looking towards that distant horizon, forecasting trends, and seeking out the first signs of market opportunity or disruption. While the timing is difficult to predict, it is becoming increasingly probable that the next revolution in computing will occur in the 2030s. Indeed, there is strong expectation that a convergence between artificial intelligence (AI) and quantum computing – often referred to as Quantum AI, or Quantum Intelligence – may reshape entire industries, economies, and even the way we understand intelligence itself. What began as independent experimental research into quantum algorithms and machine learning is likely to mature into a symbiotic relationship: AI guiding quantum systems, and quantum architectures helping to accelerate AI.

The development of artificial intelligence

Research in artificial intelligence began in the 1950s, with early systems focused on symbolic reasoning and rule-based logic. These approaches demonstrated limited scalability and adaptability to real-world complexity.

The period between the late 1970s and early 1990s became known as the “AI winter”, a term first used in 1984 to describe the gap between expectations for AI and the technology’s shortcomings.

Subsequent advances in machine learning (ML) during the 1990s and deep learning in the 2010s significantly expanded AI capabilities, enabling progress in speech recognition and voice synthesis, computer vision, and natural language processing, and more recently in large language models, computational photography, and video and music generation. These advances were largely driven by increased data availability and improvements in classical hardware, particularly graphics processing units (GPUs), which enabled large-scale model training.

Despite these successes, several fundamental challenges persist:

  • Compute resource and sustainability: Training state-of-the-art AI models requires substantial computational capability and energy consumption.
  • Interpretability: AI models often lack transparency, operating as a “black box”, which limits trust and explainability in high-stakes applications.
  • Optimisation complexity: AI relies heavily on solving high-dimensional optimisation problems to train neural networks, and these problems often scale poorly with model size.
  • Limits of classical computing: Certain problem classes, such as combinatorial optimisation and high-dimensional simulation, remain intractable for even the most powerful classical supercomputers today.

These types of limitations are motivating exploration of alternative computational paradigms for AI, including quantum computing.

Progress towards “quantum advantage”

Quantum computing is grounded in quantum mechanics, a field of physics that investigates the behaviour of particles at subatomic scale, leveraging complex phenomena such as superposition, entanglement, and interference to process information.

Unlike classical computer bits, which can be either 0 or 1, qubits (quantum bits) can exist in a state of superposition, being both 0 and 1 at the same time. Through entanglement, qubits can be correlated in ways that have no classical equivalent, allowing a quantum computer to represent and manipulate an exponentially large space of states. As a result, quantum computers have the potential to solve certain classes of complex problems far faster than classical computers.
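
To make these concepts concrete, the short sketch below is our own illustration (simulated classically with NumPy rather than run on quantum hardware): a Hadamard gate places one qubit into superposition, and a CNOT gate then entangles it with a second qubit, producing the Bell state in which the pair is measured as 00 or 11 with equal probability.

# Minimal sketch: superposition and entanglement on a simulated two-qubit register.
import numpy as np

# Single-qubit basis state |0>
ket0 = np.array([1.0, 0.0])

# Hadamard gate: puts a qubit into an equal superposition of 0 and 1
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

# CNOT gate: flips the second qubit when the first qubit is 1
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# Start both qubits in |00>, apply H to the first qubit, then entangle with CNOT
state = np.kron(H @ ket0, ket0)   # first qubit now in superposition
state = CNOT @ state              # the two qubits are now entangled

print(state)                      # [0.707, 0, 0, 0.707] -> the Bell state
print(np.abs(state) ** 2)         # 50/50 chance of measuring 00 or 11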

The theoretical foundations of quantum computing were established in the 1980s, but practical development has accelerated only in the past decade. Current quantum devices are classified as noisy intermediate-scale quantum (NISQ) systems, characterised by limited qubit counts and significant error rates, which constrain their capability. However, companies including IBM, Google, Microsoft and Amazon, alongside specialists such as Quantinuum, D-Wave Systems and Xanadu, are now building practical quantum computers with increasing numbers of stable qubits.

A central milestone in the field is the demonstration of quantum advantage, defined as the ability of a quantum computer to outperform classical systems on a practically relevant task. To date, this milestone has not been conclusively achieved for commercially viable applications. However, steady progress in qubit quality, control systems, and error mitigation suggests that quantum processors may become increasingly useful as accelerators within hybrid classical–quantum workflows.

The intersection of quantum computing with AI

Research into Quantum AI focuses on whether quantum algorithms can address some of the computational bottlenecks faced by classical machine learning.  Proposed avenues include:

  • Faster AI training: Quantum computers could accelerate complex matrix calculations, reducing the time needed to train AI models.
  • More efficient data processing: Quantum machine learning techniques could classify and analyse large datasets more effectively than classical approaches.
  • Improved model optimisation: New quantum algorithms could enhance tasks such as tuning machine learning models.
  • Enhanced natural language processing: Quantum-enhanced language models promise to process language more efficiently, leading to better translation systems and AI assistants.
  • Breaking classical limits: Quantum computing allows AI to address problems that are difficult for classical hardware, including diverse fields such as computational chemistry and materials science.

It is important to note that many of these approaches remain theoretical or limited to small-scale demonstrations, and evidence of quantum advantage for AI tasks has yet to be established.
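
One reason the classical bottleneck is hard to escape can be shown directly: representing the state of an n-qubit system on classical hardware requires 2^n amplitudes, so the matrix operations involved grow exponentially with qubit count. The sketch below is our own NumPy illustration of that scaling, not drawn from any specific quantum machine learning result.

# Illustrative sketch: why classically simulating quantum linear algebra gets
# expensive. An n-qubit state holds 2**n amplitudes, so every extra qubit
# doubles the memory and the cost of applying a gate (a matrix multiply).
import numpy as np

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """Apply a Hadamard to every qubit of |00...0>, giving 2**n equal amplitudes."""
    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
    plus = H @ np.array([1.0, 0.0])      # single qubit in superposition
    full = plus
    for _ in range(n_qubits - 1):
        full = np.kron(full, plus)       # tensor product grows the register
    return full

for n in (2, 10, 20):
    state = uniform_superposition(n)
    print(f"{n:>2} qubits -> {state.size:>9,} amplitudes, "
          f"{state.nbytes / 1e6:.1f} MB as float64")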

The influence of AI on quantum computing

The relationship between AI and quantum computing is bidirectional.  AI techniques can increasingly be applied to improve the performance, usability, and scalability of quantum systems.

  • Semiconductors and silicon chip design: AI methods can inform processor design choices across both classical and quantum computing architectures.  This indirectly creates a link between hardware development and algorithmic efficiency.
  • Error correction and mitigation: Quantum systems are highly susceptible to noise.  AI-based decoders are emerging as leading candidates for fault-tolerant quantum computing, enabling more accurate identification and correction of errors, which in turn reduces the number of physical qubits required (a simplified illustration follows this list).
  • Automated calibration: Machine learning methods optimise calibration routines for quantum hardware, reducing manual intervention and improving system stability.  This provides benefits for operational efficiency and reproducibility.
  • Algorithm creation: AI-driven tools can optimise how quantum algorithms are mapped onto specific hardware architectures, improving performance.
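
To give a flavour of the decoding task described above, the toy sketch below (our own simplification, not a production decoder) “learns” a syndrome-to-correction table for the three-qubit bit-flip repetition code from simulated noisy samples. A genuine AI decoder is a neural network trained on far larger codes, but the underlying idea of learning which correction each error signature most likely needs is the same.

# Toy sketch of a data-driven decoder for the 3-qubit bit-flip repetition code.
# We "learn" the most likely correction for each error syndrome from samples,
# the kind of mapping a neural-network decoder learns at far larger scale.
import numpy as np
from collections import Counter, defaultdict

rng = np.random.default_rng(0)
p = 0.1                                   # independent bit-flip probability per qubit

def sample_error():
    return tuple(rng.random(3) < p)       # which of the 3 qubits flipped

def syndrome(err):
    # Parity checks on qubit pairs reveal where flips occurred
    # without disturbing the encoded logical bit itself.
    return (err[0] ^ err[1], err[1] ^ err[2])

# "Training": record which error most often produces each syndrome
counts = defaultdict(Counter)
for _ in range(20000):
    e = sample_error()
    counts[syndrome(e)][e] += 1
decoder = {s: c.most_common(1)[0][0] for s, c in counts.items()}

# "Testing": apply the learned correction and check the logical bit survives
trials, failures = 20000, 0
for _ in range(trials):
    e = sample_error()
    guess = decoder.get(syndrome(e), (False, False, False))
    residual = tuple(a ^ b for a, b in zip(e, guess))
    failures += sum(residual) >= 2        # majority of qubits wrong -> logical error
print(f"logical error rate: {failures / trials:.4f} (physical error rate {p})")

With these settings the learned decoder pushes the logical error rate well below the physical bit-flip rate, which is the behaviour real AI decoders aim to deliver on much larger, noisier codes.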

Inflexion point

The intersection of quantum computing and AI will depend upon several technological advancements.  Improved quantum hardware is crucial, not just for AI but to gain quantum advantage more widely.  This means developing computers with more qubits and, importantly, improved stability of those qubits to reduce error rates and enable processing of ever larger datasets.  Meanwhile, there is advantage in hybrid architectures that combine classical and quantum computation to deliver the strengths of both approaches.
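
As an indicative sketch of what such a hybrid workflow looks like (with the quantum step simulated classically in NumPy rather than dispatched to real hardware), the loop below uses a classical gradient-descent optimiser to tune the single parameter of a one-qubit circuit until its measured expectation value reaches a target. This is the basic pattern behind variational quantum algorithms: the quantum processor evaluates the circuit, and the classical computer does the optimisation.

# Minimal hybrid classical-quantum loop (quantum part simulated with NumPy).
# A classical optimiser tunes the rotation angle of a one-qubit circuit so the
# expected measurement <Z> matches a target value.
import numpy as np

def expectation_z(theta: float) -> float:
    """Prepare RY(theta)|0> and return the expectation value of Z (equals cos(theta))."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    Z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state @ Z @ state)

target = -0.5                      # value we want the measurement to reach
theta, lr = 0.1, 0.4               # initial parameter and learning rate

for step in range(100):
    # Parameter-shift rule: exact gradient from two extra circuit evaluations
    grad = 0.5 * (expectation_z(theta + np.pi / 2) - expectation_z(theta - np.pi / 2))
    loss_grad = 2 * (expectation_z(theta) - target) * grad
    theta -= lr * loss_grad        # classical gradient-descent update

print(f"theta = {theta:.3f}, <Z> = {expectation_z(theta):.3f}, target = {target}")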

And, of course, this must all be commercially viable, providing real-world value to early adopters of the technology, which in turn helps finance future development.  This is already being achieved by offering organisations and educational establishments access to quantum computing via the cloud which, rather coincidentally, reflects how compute time was allocated on mainframes in the 1950s when AI research was first becoming established.

Just as the rapid expansion of AI has been enabled by widespread access to high-performance GPU and NPU (neural processing unit) computing, Quantum AI looks likely to experience a similar inflexion point once scalable quantum systems become widely available.  Hence this represents a promising research domain at the intersection of two truly transformative technologies, both of which Futuresource are closely monitoring.

For more information visit https://www.futuresource-consulting.com

Press Contact: 

Nicola Finn, Marketing Manager, Futuresource Consulting, nicola.finn@futuresource-hq.com
