Simple Data Gets the Most Out of Quantum Machine Learning



Newswise — Los Alamos, N.M., July 5, 2023 — A new theoretical study proves that machine learning on quantum computers requires far simpler data than previously believed. The finding paves the way for getting the most out of today's noisy, intermediate-scale quantum computers, both for simulating quantum systems and other tasks better than classical digital computers can, and for optimizing quantum sensors.

“We have shown that a small amount of surprisingly simple data is sufficient to train a quantum neural network,” said Lukasz Cincio, a quantum theorist at Los Alamos National Laboratory and a co-author of the paper presenting the proof, published in Nature Communications. “This work takes another step toward making quantum machine learning easier, more accessible and more near-term.”

The new paper is the result of a collaboration between the Los Alamos team, lead author Matthias Caro of Freie Universität Berlin and other researchers in the United States, the United Kingdom and Switzerland. The group argues that while industry works to improve the quality and size of quantum computers, it is equally important to develop more efficient algorithms, particularly for quantum machine learning, that exploit the capabilities of these noisy machines; the new result lays a foundation for doing so.

The new paper builds on previous work by Los Alamos National Laboratory and its collaborators demonstrating that training a quantum neural network requires only a small amount of data. Taken together, these theoretical breakthroughs prove that training on a very small number of very simple states offers a practical approach for getting real work done quickly on today's limited quantum computers, achieving more than is possible on conventional classical computers.

“Previous studies considered the amount of training data in quantum machine learning; here we consider the type of training data,” Caro said. “We prove that a few training data points suffice even when we restrict ourselves to a simple type of data.”

“Practically speaking, this means you can train a neural network not only on a few pictures of cats, for example, but also on very simple pictures,” Cincio said. “For quantum simulation, it means we can train on quantumly simple states.”

“These states are easy to prepare, which makes the entire learning algorithm much easier to run on near-term quantum computers,” said co-author Zoe Holmes, a physics professor at École Polytechnique Fédérale de Lausanne and former Los Alamos postdoctoral researcher.
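The core idea, that simple unentangled (product) training states can pin down dynamics that then generalize to entangled inputs, can be sketched with a toy linear-algebra example. This is an illustration only, not the paper's quantum-circuit method: the "training" here is a plain least-squares fit, and every name in the snippet is ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target dynamics: a random two-qubit unitary (via QR decomposition).
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
Q, R = np.linalg.qr(A)
U_target = Q * (np.diag(R) / np.abs(np.diag(R)))  # phase fix keeps it unitary

# Training set: only simple PRODUCT states |a> (x) |b>, no entanglement.
zero, one = np.array([1, 0], complex), np.array([0, 1], complex)
singles = [zero, one, (zero + one) / np.sqrt(2), (zero + 1j * one) / np.sqrt(2)]
X = np.column_stack([np.kron(a, b) for a in singles for b in singles])  # 4 x 16
Y = U_target @ X  # observed outputs on the product training states

# "Learning": least-squares fit of a linear map from input/output pairs.
U_learned = Y @ np.linalg.pinv(X)

# Out-of-distribution test: an ENTANGLED Bell state never seen in training.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
fidelity = abs(np.vdot(U_target @ bell, U_learned @ bell)) ** 2
print(round(fidelity, 6))  # → 1.0
```

Because the product states span the full four-dimensional state space, the map learned from them reproduces the target dynamics even on entangled inputs, which is the flavor of generalization the paper proves rigorously for quantum neural networks.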

Near-term applications of quantum computers

Noise, in the form of interactions between quantum bits (qubits) and their surrounding environment, causes errors that limit the processing power of current quantum computer technology. Despite the noise, quantum computers excel at certain tasks, such as simulating quantum systems in materials science and classifying quantum states with machine learning.

“If you’re classifying quantum data, there’s a certain amount of noise that you can tolerate and still get the right answer,” says Cincio. “That’s why quantum machine learning could be a good application in the short term.”

Paper co-author Andrew T. Sornborger said that because tasks such as classification, a cornerstone of machine learning, do not require 100% accuracy to produce useful results, quantum machine learning can tolerate more noise than other types of algorithms. Sornborger is the Quantum Algorithms and Simulation lead at the Quantum Science Center. Led by Oak Ridge National Laboratory, the center is a collaboration of national laboratories, including Los Alamos, universities and industry.

The new paper shows that with simpler data, a less complex quantum circuit suffices to prepare a given quantum state on the computer, such as the states arising in quantum-chemistry simulations of an evolving molecular system. A simple circuit is easy to implement and introduces little noise, so the computation can run to completion. The Nature Communications paper also shows how these easy-to-prepare states can be used to compile quantum machine learning algorithms.

Offloading to classical computers

Complex quantum algorithms exceed the processing power of even very large classical computers. The team found, however, that because the new approach simplifies algorithm development, compiling a quantum algorithm can be offloaded to a classical computer; the compiled algorithm then runs successfully on a quantum computer. This approach lets programmers reserve quantum resources for tasks that only quantum computers can perform, such as simulating quantum systems, while avoiding the errors that noise introduces in long circuits.
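The division of labor described above can be illustrated with a minimal sketch, again ours rather than the paper's algorithm: the "compilation" step, here a textbook ZYZ Euler-angle decomposition of a single-qubit unitary, runs entirely on a classical computer, and only the resulting short circuit would ever need to run on quantum hardware.

```python
import numpy as np

def rz(t):
    # Rotation about the Z axis by angle t.
    return np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])

def ry(t):
    # Rotation about the Y axis by angle t.
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -s], [s, c]], complex)

def compile_zyz(U):
    """Classically find alpha, a, b, c with U = e^{i alpha} Rz(a) Ry(b) Rz(c)."""
    alpha = np.angle(np.linalg.det(U)) / 2
    V = U * np.exp(-1j * alpha)                    # special-unitary part
    b = 2 * np.arccos(np.clip(abs(V[0, 0]), 0, 1))
    a = np.angle(V[1, 0]) - np.angle(V[0, 0])
    c = -np.angle(V[1, 0]) - np.angle(V[0, 0])
    return alpha, a, b, c

# A random target unitary (via QR of a random complex matrix).
rng = np.random.default_rng(1)
M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
U, _ = np.linalg.qr(M)

alpha, a, b, c = compile_zyz(U)
U_compiled = np.exp(1j * alpha) * rz(a) @ ry(b) @ rz(c)
print(np.allclose(U, U_compiled))  # prints True: classical compilation is exact
```

The expensive search for the circuit happens on the classical side; the quantum processor would only execute the short three-rotation sequence, keeping circuits shallow and noise low.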

The team’s research also has applications in the developing field of quantum sensing. Certain principles of quantum mechanics make it possible to create extremely sensitive devices for measuring gravitational or magnetic fields, for example.

“Quantum sensing techniques are simple and theoretically well understood in the absence of noise, but the situation becomes much more complicated when noise is taken into account,” Sornborger said. “Adding quantum machine learning to quantum sensing protocols makes it possible to apply the method when the encoding mechanism is unknown or when hardware noise affects the quantum probe.” These applications are being investigated in a Department of Energy-sponsored project led by Cincio and Marco Cerezo, also of Los Alamos.

Paper: “Out-of-distribution generalization for learning quantum dynamics.” Authors: Matthias C. Caro, Hsin-Yuan Huang, Nicholas Ezzell, Joe Gibbs, Andrew T. Sornborger, Lukasz Cincio, Patrick J. Coles and Zoe Holmes. Nature Communications. DOI: 10.1038/s41467-023-39381-w

Funding: The Los Alamos National Laboratory research was funded by the Laboratory Directed Research and Development program, the Los Alamos Beyond Moore’s Law project, and the Department of Energy Office of Science, Advanced Scientific Computing Research quantum computing program.

-30-

LA-UR-23-26249




