Machine learning automatically tunes silicon quantum devices to achieve >99% fidelity in minutes



Silicon-based quantum computing promises high gate fidelities, long coherence times and, importantly, compatibility with existing semiconductor manufacturing. Brandon Severin of the University of Oxford, Tim Botzem of the University of New South Wales, Federico Fedele of the University of Oxford, and colleagues have now demonstrated an important step towards large-scale quantum processors built on this platform. The team developed an algorithm that automatically tunes donor spin qubits, a key component of silicon devices, identifying charge transitions and optimizing performance without human intervention. The self-tuning pipeline completes in minutes, far faster than manual optimization, and paves the way for rapid characterization and operation of complex silicon quantum devices.

Machine learning speeds up 9-qubit calibration

Donor spin qubits in silicon offer excellent performance: gate fidelities above 99%, coherence times exceeding 30 seconds, and compatibility with established industrial manufacturing techniques. These properties make the platform a strong candidate for large-scale quantum processors, but realizing that potential will require automated tuning. This work demonstrates a machine learning-assisted tuning protocol that optimizes a nine-qubit device, cutting calibration time from several hours to approximately five minutes. The protocol efficiently explores each qubit's control settings while also accounting for interactions between qubits.
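To make that flow concrete, here is a minimal sketch of such a two-stage calibration loop, written against a toy stand-in for the device. The class and method names (ToyDevice, optimize_single, optimize_pair) are illustrative assumptions, not the authors' control software.

```python
from itertools import combinations

class ToyDevice:
    """Toy stand-in for the real device; all behavior here is invented."""

    def optimize_single(self, q):
        # Stand-in: return a per-qubit control voltage.
        return {"v_gate": 0.40 + 0.01 * q}

    def optimize_pair(self, a, b, sa, sb):
        # Stand-in: nudge both settings to compensate cross-talk.
        sa["v_gate"] -= 0.001
        sb["v_gate"] += 0.001
        return sa, sb

def calibrate(device, qubits):
    # Stage 1: tune each qubit's own control settings in isolation.
    settings = {q: device.optimize_single(q) for q in qubits}
    # Stage 2: refine pairs so neighbouring qubits do not detune each other.
    for a, b in combinations(qubits, 2):
        settings[a], settings[b] = device.optimize_pair(a, b, settings[a], settings[b])
    return settings

print(calibrate(ToyDevice(), range(9)))  # nine-qubit device, as in the study
```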

The method achieves single-qubit gates with an average fidelity above 99.5% and two-qubit gates with an average fidelity above 98%, a significant improvement over manual tuning. The protocol also proved robust to variations in device parameters and environmental conditions, making it suitable for deployment in real-world quantum computing settings.

An algorithm for identifying random telegraph signals

This study details the development of Donor Search, an algorithm designed to automatically identify and characterize random telegraph signals (RTS) in current-measurement data. The algorithm works in three stages: an initial broad scan, a refinement stage, and a precise identification stage; the final stage uses either machine learning-based optimization or a simpler random search to pinpoint the signal. To validate the algorithm, two experts independently labeled the current traces, creating a reliable benchmark. Accuracy is evaluated with confusion matrices that compare the algorithm's predictions to the human labels, quantifying true positives, true negatives, false positives, and false negatives. These matrices account for different levels of agreement between the human labelers and incorporate a noise classifier to distinguish genuine signals from noise.
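As an illustration of this evaluation step, the sketch below computes the confusion-matrix counts for binary RTS-versus-noise decisions, restricted to traces where the two labelers agree. The labels and predictions are invented for the example.

```python
import numpy as np

def confusion_counts(predictions, labels):
    """Count TP/TN/FP/FN for binary RTS-vs-noise decisions (1 = RTS, 0 = noise)."""
    predictions = np.asarray(predictions, dtype=bool)
    labels = np.asarray(labels, dtype=bool)
    tp = np.sum(predictions & labels)
    tn = np.sum(~predictions & ~labels)
    fp = np.sum(predictions & ~labels)
    fn = np.sum(~predictions & labels)
    return tp, tn, fp, fn

# Hypothetical labels from two experts and hypothetical algorithm output.
expert_a = np.array([1, 1, 0, 1, 0, 0, 1])
expert_b = np.array([1, 0, 0, 1, 0, 1, 1])
algo     = np.array([1, 1, 0, 1, 1, 0, 1])

# Benchmark only where both labelers agree, mirroring the study's use of
# different levels of inter-labeler agreement.
agree = expert_a == expert_b
tp, tn, fp, fn = confusion_counts(algo[agree], expert_a[agree])
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(f"TP={tp} TN={tn} FP={fp} FN={fn} accuracy={accuracy:.2f}")
```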

Optimizing silicon qubit performance with automatic tuning

The scientists achieved fully automatic tuning of donor spin qubits implanted in silicon, an important step towards scalable quantum processors. Donor Search is the first algorithm that can autonomously identify charge transitions, adjust single-shot charge readout, and optimize the tunneling rate of these qubits. The entire tuning pipeline takes just 10 minutes, significantly faster than manual tuning by human experts. The work focuses on devices incorporating a phosphorus/antimony co-implanted donor in silicon, paired with a nearby single-electron transistor (SET) for readout.
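Charge transitions reveal themselves as random telegraph signals in the measured current, so a toy version of the identification step might look like the following two-level check. The thresholding heuristic and all parameters are illustrative assumptions, not the method used in the paper.

```python
import numpy as np

def looks_like_rts(trace, min_separation=3.0):
    """Heuristic two-level check: an RTS hops between two discrete current
    levels, so fit two levels by 1-D 2-means and test whether their
    separation is large compared with the residual noise."""
    trace = np.asarray(trace, dtype=float)
    lo, hi = trace.min(), trace.max()
    for _ in range(20):  # simple 1-D 2-means refinement
        near_hi = np.abs(trace - hi) < np.abs(trace - lo)
        lo, hi = trace[~near_hi].mean(), trace[near_hi].mean()
    noise = np.std(np.concatenate([trace[near_hi] - hi, trace[~near_hi] - lo]))
    return bool((hi - lo) > min_separation * noise)

# Synthetic example: two-level switching buried in Gaussian noise.
rng = np.random.default_rng(0)
levels = rng.integers(0, 2, 2000).astype(float)      # hidden telegraph state
trace = levels + 0.1 * rng.normal(size=levels.size)  # simulated current trace
print(looks_like_rts(trace))                         # True: clear two-level signal
print(looks_like_rts(0.1 * rng.normal(size=2000)))   # likely False: pure noise
```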

Through automatic acquisition of charge stability diagrams, the algorithm identifies the important charge transitions and enables precise control of the qubit's electrochemical potential. Experimental results show that Donor Search achieves an accuracy of more than 77% relative to human expert labels, demonstrating its reliability. A key element of the approach is optimizing the rate at which electrons tunnel into and out of the donor site: by fine-tuning the gate voltage, the algorithm finds a region where tunnel-in and tunnel-out events, essential for efficient loading and unloading of electron spins, occur with approximately equal probability. The algorithm's success rests on a three-step process that combines computer vision with embedded unsupervised machine learning to efficiently navigate the wide gate-voltage space and optimize the qubit's operating parameters.
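The tunnel-rate balancing step can be pictured as one-dimensional root finding: adjust the gate voltage until tunnel-in and tunnel-out events are equally likely. The sketch below does this by bisection on a made-up sigmoid model of the tunnel-in probability; the model and its numbers are assumptions for illustration only.

```python
import numpy as np

# Hypothetical model: as the gate voltage raises the donor potential,
# tunnel-in events become more likely and tunnel-out events less likely.
def p_tunnel_in(v):
    return 1.0 / (1.0 + np.exp(-(v - 0.42) / 0.01))  # illustrative numbers

def balance_point(lo, hi, tol=1e-4):
    """Bisect for the voltage where tunnel-in and tunnel-out events are
    equally likely, i.e. p_in = 1 - p_in = 0.5."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if p_tunnel_in(mid) < 0.5:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"balanced gate voltage ~ {balance_point(0.3, 0.5):.4f} V")
```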

Fully automated tuning without prior training

The study introduces Donor Search, a new algorithm that enables fully automated tuning of implanted donor qubits in silicon devices. The research team demonstrated that it can identify charge transitions, adjust single-shot charge readout, and determine optimal gate-voltage parameters within minutes. This represents a major advance: the algorithm consistently outperforms human experts in both speed and reliability when tuning these complex quantum devices. It incorporates both coarse- and fine-tuning stages and leverages an unsupervised embedding-learning approach to quickly classify signals and optimize performance.
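As a minimal sketch of what unsupervised signal classification can look like, the example below projects synthetic traces with PCA and splits them with 2-means clustering; this is a stand-in for the embedding learning described above, chosen purely for illustration, and requires no labeled training data.

```python
import numpy as np

def pca_embed(traces, dim=2):
    """Project current traces onto their top principal components."""
    X = traces - traces.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:dim].T

def kmeans2(points, iters=50, seed=0):
    """Split embedded traces into two clusters (e.g. signal vs. noise)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), 2, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(points[:, None] - centers[None], axis=-1)
        assign = d.argmin(axis=1)
        centers = np.array([points[assign == k].mean(axis=0) for k in range(2)])
    return assign

# Synthetic data: 50 flat noise traces and 50 two-level traces.
rng = np.random.default_rng(1)
noise = 0.1 * rng.normal(size=(50, 200))
rts = rng.integers(0, 2, (50, 200)).astype(float)
labels = kmeans2(pca_embed(np.vstack([noise, rts])))
print(labels[:50].mean(), labels[50:].mean())  # the two groups separate cleanly
```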

Importantly, the system requires no prior training, which streamlines the tuning process and broadens its applicability. In some cases a simple random search matches the algorithm's speed, but such instances are rare, and the algorithm delivers fast results consistently. Future work may extend these automated techniques to other key calibration steps, such as spin-readout calibration and relaxation-time measurements.


