AWS Quantum Technologies is highlighting new research demonstrating the power of Rydberg-atom quantum computers in machine learning. QuEra Computing researchers and collaborators have implemented quantum reservoir computing (QRC) algorithms on Amazon Braket to tackle challenges in areas such as image classification and time series prediction. The approach leverages inherent properties of quantum mechanics to potentially overcome limitations of traditional machine learning methods, especially when dealing with small datasets. According to a February 9, 2026 blog post, the team observed “robust QRC performance on small datasets relevant to pharmaceutical research,” suggesting a path toward more efficient analysis in critical areas. The research, detailed in a recent post, offers a glimpse into the future of quantum machine learning and its potential to accelerate discovery.
Quantum reservoir computing using Rydberg atoms
This approach, detailed in a recent study, moves beyond theoretical proposal toward near-term practical implementation on quantum hardware. The core of QRC lies in its ability to map input data into a high-dimensional space, or “reservoir configuration space,” before a readout layer interprets the results. Unlike many other quantum machine learning (QML) techniques, QRC minimizes training demands by keeping the reservoir parameters fixed. This is especially beneficial for machine learning scaling challenges, since, as the researchers put it, “ML continues to suffer from increasing scale and complexity.” The team’s research builds on established principles of classical reservoir computing and adapts them to quantum systems. As the authors explain, the reservoir “can be thought of as a programmable analog computer that is used as a subroutine to perform specific machine learning tasks.”
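The fixed-reservoir principle described above can be sketched in a few lines: a random, untrained nonlinear map expands inputs into a high-dimensional space where a simple linear readout suffices. This is a minimal classical illustration of the paradigm, not the paper's spin-based reservoir; all sizes and parameters here are arbitrary assumptions.

```python
# Minimal sketch of reservoir computing: a FIXED random nonlinear map
# expands inputs into a high-dimensional space; only the linear readout
# is trained. Illustrative only -- a random feature map stands in for
# the physical reservoir used in the research.
import numpy as np

rng = np.random.default_rng(0)

# XOR is not linearly separable in its raw 2-D form.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

# Fixed "reservoir": random projection + tanh nonlinearity (never trained).
W_res = rng.normal(size=(2, 50))
b_res = rng.normal(size=50)
H = np.tanh(X @ W_res + b_res)              # 4 x 50 embedding matrix

# Train only the readout layer via least squares on the embeddings.
A = np.c_[H, np.ones(4)]
w, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = (A @ w > 0.5).astype(float)
print(pred)
```

In the high-dimensional embedding space the XOR labels become linearly separable, so the cheap linear readout recovers them, which is the training-cost advantage the authors emphasize.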
The experimental implementation uses Rydberg atoms: two-level systems with tunable positions and local detunings. Encoding the input data into these parameters and letting the quantum system evolve produces a data embedding vector. Although this process mirrors the classical approach, there are crucial differences: “QRC shares the same workflow as CRC, but using a quantum system as a reservoir allows access to state space beyond the product states, enabling long-range quantum correlations not available classically.” In one demonstration, the QRC algorithm achieved a test accuracy of 83.5% on a 3/8-MNIST binary classification task using a chain of nine atoms. This performance is comparable to classical techniques such as feedforward neural networks, but could potentially scale better. Further experiments classified tomato diseases from leaf images using up to 108 atoms. Although the current results do not conclusively outperform state-of-the-art classical machine learning, the team observed that QRC “exhibits better scaling in terms of the number of atoms used in the encoding.” Notably, the work extends beyond image classification to time series prediction, using the dynamics of one physical system to simulate another. “Since the computational power of reservoir computing comes from the temporal dynamics of physical systems, it is natural to apply this framework to problems such as time series forecasting,” the researchers note.
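The time-series angle mentioned above can be sketched with a classical echo-state reservoir standing in for the quantum one: the input signal drives fixed recurrent dynamics, and only a linear readout is trained to predict the next value. The reservoir size, spectral radius, and toy sine signal are illustrative assumptions, not parameters from the study.

```python
# Toy next-step forecasting with a fixed echo-state reservoir
# (classical stand-in for the quantum reservoir; all parameters are
# illustrative assumptions).
import numpy as np

rng = np.random.default_rng(1)
T, N = 300, 100                          # sequence length, reservoir size

u = np.sin(0.1 * np.arange(T + 1))       # toy signal: predict u[t+1]

W_in = rng.normal(scale=0.5, size=N)
W = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

# Drive the fixed reservoir with the input sequence.
states = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Train only the readout: map the state at time t to u[t+1].
warm = 50                                # discard the initial transient
A = np.c_[states[warm:], np.ones(T - warm)]
w, *_ = np.linalg.lstsq(A, u[warm + 1:T + 1], rcond=None)
err = np.sqrt(np.mean((A @ w - u[warm + 1:T + 1]) ** 2))
print(f"train RMSE: {err:.6f}")
```

The readout exploits the reservoir's memory of past inputs, which is exactly the "temporal dynamics of physical systems" the researchers point to as the source of forecasting power.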
Classical reservoir computing for MNIST images
The pursuit of more powerful machine learning algorithms is driving exploration beyond traditional architectures, with reservoir computing emerging as an attractive alternative. This paradigm, unlike many deep learning approaches, minimizes training demands by exploiting the inherent dynamics of fixed nonlinear systems (“reservoirs”) to process information. Initially studied using classical systems, the potential of quantum mechanics to power reservoir computing has attracted significant interest, but classical implementations remain important for creating benchmarks and understanding the core principles.
An important example is in tackling image classification challenges, particularly the widely used modified National Institute of Standards and Technology (MNIST) handwritten-digits dataset. QuEra Computing researchers and collaborators have demonstrated a classical reservoir computing (CRC) approach that uses chains of Nq classical spins to classify these images. The process begins by converting each image into an Nq-dimensional feature vector and then setting the site-dependent longitudinal magnetic field to Δi = Δmax·xi. This establishes a reservoir whose behavior is directly influenced by the image features.
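The encoding step above can be sketched as follows: an image is reduced to an Nq-dimensional feature vector x, and each site's field is set to Δi = Δmax·xi. The block-averaging downsampling and normalization used here are illustrative assumptions; the study does not specify this exact reduction.

```python
# Sketch of the CRC encoding step: image -> Nq-dim feature vector x,
# then field_i = delta_max * x_i. The downsampling (block averaging)
# and [0, 1] normalisation are assumptions for illustration.
import numpy as np

def encode_fields(image: np.ndarray, nq: int, delta_max: float) -> np.ndarray:
    """Map an image to Nq site-dependent longitudinal field strengths."""
    flat = image.astype(float).ravel()
    # Block-average the pixels down to nq features (illustrative choice).
    blocks = np.array_split(flat, nq)
    x = np.array([b.mean() for b in blocks])
    x /= x.max() if x.max() > 0 else 1.0    # normalise features to [0, 1]
    return delta_max * x                     # delta_i = delta_max * x_i

# Example: an 8x8 test pattern encoded onto a 9-spin chain.
fields = encode_fields(np.arange(64).reshape(8, 8), nq=9, delta_max=2.0)
print(fields.shape, fields.max())
```

Each spin's local field now carries one coarse image feature, so the reservoir's dynamics, and hence the measured embedding, depend directly on the input image.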
Measurable properties from the reservoir evolution, such as the Z-component of each spin and the correlations between them, are then compiled into a data embedding vector. This embedding vector is used to train the final readout layer (usually a simple linear regression model), which maps the reservoir’s internal state to the desired output classification. The advantage, as highlighted by the researchers, is that “training costs are low because the reservoir parameters remain fixed and only the readout layer requires training.” This is in clear contrast to the intensive parameter tuning required in many traditional neural networks. Importantly, the team compared QRC’s performance to other methods such as linear support vector machines (SVMs).
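The embedding-plus-readout step above can be sketched concretely: single-site Z values and pairwise correlations are concatenated into one vector per sample, and a least-squares readout is fit on top. The "measurements" below are synthetic stand-ins generated for illustration, not data from the study.

```python
# Sketch of the readout step: concatenate <Z_i> and pairwise <Z_i Z_j>
# into an embedding vector, then fit a least-squares linear readout.
# The observable values are synthetic stand-ins, not measured data.
import numpy as np

rng = np.random.default_rng(2)

def embed(z: np.ndarray, zz: np.ndarray) -> np.ndarray:
    """Concatenate <Z_i> and upper-triangular <Z_i Z_j> into one vector."""
    iu = np.triu_indices(len(z), k=1)
    return np.concatenate([z, zz[iu]])

# Synthetic "measurements" for 40 samples of a 9-spin reservoir.
n_samples, nq = 40, 9
labels = rng.integers(0, 2, n_samples).astype(float)
X = []
for y in labels:
    z = rng.normal(loc=2 * y - 1, scale=0.3, size=nq)   # class-dependent
    zz = np.outer(z, z)
    X.append(embed(z, zz))
X = np.array(X)                     # 40 x (9 + 36) = 40 x 45 embeddings

# Train only the linear readout (the "reservoir" stays fixed).
A = np.c_[X, np.ones(n_samples)]
w, *_ = np.linalg.lstsq(A, labels, rcond=None)
acc = np.mean((A @ w > 0.5) == labels)
print(f"train accuracy: {acc:.2f}")
```

For a 9-spin chain the embedding has 9 single-site terms plus 36 pair correlations, and only the 46 readout weights are ever trained, which is the low-training-cost contrast the researchers draw with conventional neural networks.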
Rydberg atomic interactions and positional encoding
By exploiting the unusual properties of Rydberg atoms, researchers at QuEra Computing are pioneering new approaches to machine learning, with a particular focus on quantum reservoir computing (QRC). Each atom is a two-level system with a tunable position and experiences a local detuning analogous to the longitudinal magnetic field acting on a classical spin. This setup allows image features to be encoded directly into the spatial arrangement of the atoms. As the system evolves, the researchers measure Pauli Z observables for each atom, along with the correlations between them, and compile the results into a data embedding vector.
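The evolve-and-measure step can be illustrated with an exact simulation of a toy four-atom chain. The Rydberg-like Hamiltonian form, coupling strength, detunings, and evolution time below are all illustrative assumptions, not the experimental parameters used by the team.

```python
# Toy 4-atom "Rydberg-like" chain, simulated exactly with dense
# matrices: encode detunings, evolve, then measure <Z_i> and <Z_i Z_j>.
# Hamiltonian form and all parameters are illustrative assumptions.
import numpy as np
from scipy.linalg import expm

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
N_OP = (np.eye(2) - Z) / 2          # projector onto |1> (assumed |r>)

def op_on(site_ops: dict, n: int) -> np.ndarray:
    """Tensor product placing single-site operators on an n-atom chain."""
    out = np.array([[1.0 + 0j]])
    for i in range(n):
        out = np.kron(out, site_ops.get(i, I2))
    return out

n = 4
deltas = np.array([0.5, 1.0, 1.5, 2.0])   # encoded local detunings (toy)
omega, v, t = 1.0, 2.0, 1.0               # drive, interaction, time (toy)

H = sum(0.5 * omega * op_on({i: X}, n) for i in range(n))
H += sum(deltas[i] * op_on({i: N_OP}, n) for i in range(n))
H += sum(v / abs(i - j) ** 6 * op_on({i: N_OP, j: N_OP}, n)
         for i in range(n) for j in range(i + 1, n))

psi0 = np.zeros(2 ** n, dtype=complex)
psi0[0] = 1.0                              # all atoms in the ground state
psi = expm(-1j * H * t) @ psi0             # unitary time evolution

# Measured observables: <Z_i> per atom and pairwise <Z_i Z_j> correlations.
z = np.array([np.real(psi.conj() @ op_on({i: Z}, n) @ psi) for i in range(n)])
zz = np.array([np.real(psi.conj() @ op_on({i: Z, j: Z}, n) @ psi)
               for i in range(n) for j in range(i + 1, n)])
embedding = np.concatenate([z, zz])
print(embedding.shape)
```

On hardware these expectation values would come from repeated shots rather than exact state-vector arithmetic, but the resulting embedding vector plays the same role as the readout-layer input described above.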
Beyond digit recognition, the team applied QRC to a more complex task: classifying tomato diseases from images of leaves. This required scaling up to 108 atoms, each representing a pixel in the downscaled image. QRC achieved accuracy comparable to a four-layer neural network with approximately 20,000 hidden parameters, using up to 108 atoms and 400 shots per data point. The researchers highlight that QRC’s scaling behavior as system size increases is promising, while acknowledging that “known classical methods perform better than the linear SVM and four-layer NN used in the benchmark.”
Quantum reservoir computing (QRC) has demonstrated potential in machine learning, especially when dealing with limited datasets, a common challenge in fields such as early-stage pharmaceutical research. A recent study by QuEra Computing and collaborators implemented QRC on a Rydberg atomic quantum computer and achieved a test accuracy of 83.5% on the challenging 3/8-MNIST binary classification task. Although the results do not outperform established classical methods overall, they highlight important strengths of QRC: its performance is comparable to techniques such as feedforward neural networks, especially when training data is scarce.
The team’s experiments used up to 108 atoms in simulation and revealed performance levels similar to classical reservoir computing (CRC) and the aforementioned neural networks. Unlike many traditional machine learning algorithms, only the final readout layer requires training; the reservoir itself remains fixed, which significantly reduces computational cost. To achieve this with Rydberg atoms, the researchers encoded an image into the atoms’ positions, tailoring the interactions between atoms to reflect the features of the image. The system then evolves over time, and measurements of Pauli Z observables produce the data embedding vectors used for classification. Overall, this work provides a valuable proof of concept, demonstrating the feasibility of near-term QRC implementation on quantum hardware and paving the way for further exploration of its potential in specialized applications.
This study shows that quantum reservoir computing on Rydberg atomic systems can match classical methods for certain ML tasks, with promising scaling, especially when training data is limited.
Machine learning algorithms are increasingly mainstream in areas such as image recognition and financial modeling, but applications in specialized fields such as pharmaceutical research often hit a wall when data is limited. The goal is not to replace established methods outright, but to provide a viable alternative where classical approaches reach an impasse. The research, detailed in a recent post, builds on the principles of reservoir computing, a machine learning paradigm in which a fixed nonlinear system (a “reservoir”) maps input data into a high-dimensional space. The team applied this concept to a quantum system of Rydberg atoms, effectively creating an analog quantum computer. The QRC workflow begins by converting the input data into a feature vector, which is then encoded into the Rydberg system via tunable parameters.
