Breakthrough Accelerates Quantum AI | EurekAlert!

Machine Learning


LOS ALAMOS, N.M., June 26, 2023 — A breakthrough theoretical proof shows that a technique called overparameterization improves the performance of quantum machine learning for applications that stymie classical computers.

“We believe our results will be useful in using machine learning to learn the properties of quantum data, such as classifying phases of matter in quantum materials research, which is very difficult on classical computers,” said Diego García-Martín, a postdoctoral researcher at Los Alamos National Laboratory. He is a co-author of a new paper on the technique by a Los Alamos team, published in Nature Computational Science.

García-Martín, then a graduate student at the Autonomous University of Madrid, worked on the research during the Laboratory’s Quantum Computing Summer School in 2021.

Machine learning, or artificial intelligence, usually involves training neural networks to learn how to process information (data) and solve specific tasks. In a nutshell, a neural network can be thought of as a box with knobs or parameters that takes data as input and produces an output depending on how the knobs are configured.
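To make the “box with knobs” picture concrete, here is a minimal Python sketch of our own (not code from the paper): a tiny model whose output changes as its two knobs are turned.

```python
import numpy as np

def tiny_network(x, params):
    """A 'box with knobs': the output depends on the input x
    and on how the two knobs (weight w, bias b) are set."""
    w, b = params
    return np.tanh(w * x + b)

x = 0.5                             # an input data point
print(tiny_network(x, (1.0, 0.0)))  # one knob setting
print(tiny_network(x, (3.0, 1.0)))  # another setting, a different output
```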

“During the training phase, the algorithm updates these parameters as it learns, trying to find their optimal settings,” García-Martín said. “Once the optimal parameters are determined, the neural network should be able to extrapolate what it learned from the training instances to new, previously unseen data points.”
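As a rough illustration of that training phase (a toy example of ours, not the authors’ algorithm), gradient descent repeatedly nudges a parameter toward the setting that best fits the data:

```python
import numpy as np

# Toy data: learn the rule y = 2x from four examples.
xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = 2.0 * xs

w = 0.0    # a single "knob," initialized at zero
lr = 0.05  # learning rate (step size)
for step in range(100):
    grad = 2 * np.mean((w * xs - ys) * xs)  # gradient of mean squared error
    w -= lr * grad                          # update the knob
print(w)  # converges toward 2.0, the setting that fits the examples
```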

Classical and quantum machine learning share a common challenge when training the parameters: the algorithm can reach a suboptimal configuration during training and stall.

Breakthrough Performance

Overparameterization, a well-known concept in classical machine learning, avoids those stalls by adding more and more parameters.

The implications of overparameterization in quantum machine learning models were, until now, poorly understood. In the new paper, the Los Alamos team establishes a theoretical framework for predicting the critical number of parameters at which a quantum machine learning model becomes overparameterized. At a certain tipping point, adding parameters prompts a leap in network performance and makes the model significantly easier to train.
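The paper’s framework concerns quantum neural networks, but the flavor of the phenomenon can be suggested with a classical toy experiment of our own (assuming, for illustration, that a wider network stands in for a more heavily parameterized model): the same task trained with few versus many parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
xs = np.linspace(-2, 2, 64).reshape(-1, 1)
ys = np.sin(3 * xs)  # target function to learn

def train(hidden, steps=3000, lr=0.05):
    """One hidden layer of tanh units, trained by full-batch gradient descent."""
    W1 = rng.normal(0, 1, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1, (hidden, 1)); b2 = np.zeros(1)
    n = len(xs)
    for _ in range(steps):
        h = np.tanh(xs @ W1 + b1)  # forward pass
        pred = h @ W2 + b2
        err = pred - ys
        gW2 = h.T @ err / n; gb2 = err.mean(0)  # backpropagate the error
        dh = (err @ W2.T) * (1 - h**2)
        gW1 = xs.T @ dh / n; gb1 = dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return float(np.mean((pred - ys) ** 2))

print("2 hidden units: ", train(2))   # tends to stall at a higher loss
print("50 hidden units:", train(50))  # typically trains to a much lower loss
```

This sketch says nothing about the quantum critical threshold the paper derives; it only illustrates the general pattern that extra parameters can make training more reliable.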

“By establishing the theory of overparameterization in quantum neural networks, our research paves the way for optimizing the training process and achieving enhanced performance in practical quantum applications,” explained Martín Larocca, lead author of the manuscript and a postdoctoral researcher at Los Alamos.

By exploiting aspects of quantum mechanics such as entanglement and superposition, quantum machine learning promises to be significantly faster than machine learning on classical computers, an edge known as quantum advantage.

Avoiding Traps in Machine Learning Landscapes

To illustrate the Los Alamos team’s findings, Marco Cerezo, the senior scientist on the paper and a quantum theorist at the Laboratory, described a thought experiment in which a hiker searching for the tallest mountain in a dark landscape represents the training process. The hiker can travel only in certain directions and assesses progress by measuring altitude with a limited GPS system.

In this analogy, the number of parameters in the model corresponds to the directions available to the hiker, Cerezo said. “One parameter allows movement back and forth, two parameters allow lateral movement as well, and so on,” he said. Unlike our hypothetical hiker’s world, a data landscape would likely have more than three dimensions.

With too few parameters, the hiker can’t explore thoroughly and might mistake a small hill for the tallest mountain, or get stuck in a flat region where any step seems futile. As the number of parameters grows, however, the hiker can move in more directions in higher dimensions. What initially appeared to be a local hill might turn out to be an elevated valley between peaks. With the additional parameters, the hiker avoids getting trapped and finds the true peak, the solution to the problem.
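The “elevated valley” image can be made concrete with a small numerical sketch (our own construction, not from the paper; note the hiker maximizes altitude, while optimizers conventionally minimize a loss, so the sketch uses valleys instead of peaks). A point that is a genuine local minimum when only one direction is available becomes a saddle point, and therefore escapable, once a second direction is added.

```python
def f(x, y):
    # Along the line y = 0, this landscape has a shallow local minimum
    # near x ~ 0.93 and a deeper minimum near x ~ -1.05.
    return x**4 - 2*x**2 + 0.5*x + (0.5 - x)*y**2

x0 = 0.93  # the one-dimensional local minimum

# Restricted to one direction (x only), every nearby step goes uphill:
print(f(x0 - 0.1, 0.0), f(x0, 0.0), f(x0 + 0.1, 0.0))

# With a second parameter available, the same point is a saddle: stepping
# sideways in y goes downhill, so the "hiker" is no longer trapped.
print(f(x0, -0.1), f(x0, 0.0), f(x0, 0.1))
```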

Paper: “Theory of overparametrization in quantum neural networks,” Martín Larocca, Nathan Ju, Diego García-Martín, Patrick J. Coles, Marco Cerezo. Nature Computational Science. DOI: 10.1038/s43588-023-00467-6

Funding: Los Alamos National Laboratory’s Information Science and Technology Institute and the Laboratory Directed Research and Development program.

-30-

LA-UR-23-26756


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.


