A new frontier of autonomous control


Advanced machine learning algorithms have shown promise for efficiently controlling complex systems, pointing to significant improvements in autonomous technology and digital infrastructure.

Recent research has demonstrated advanced machine learning algorithms that can efficiently control complex systems. These new algorithms, tested on digital twins of chaotic electronic circuits, not only effectively predict and control those systems, but also significantly reduce power consumption and computational demands.

Systems controlled by next-generation computing algorithms have the potential to produce better and more efficient machine learning products, new research suggests.

Researchers used machine learning tools to create digital twins, or virtual copies, of electronic circuits that exhibit chaotic behavior. They found that the twins could predict how the circuits would behave and use that information to successfully control them.

Limitations of linear controllers

Many everyday devices, such as thermostats and cruise controls, utilize linear controllers that use simple rules to guide the system to a desired value. For example, a thermostat uses such rules to decide how much to heat or cool a space based on the difference between the current temperature and the desired temperature.
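The thermostat rule described above can be sketched as a minimal proportional controller. The gain and temperature values here are illustrative assumptions, not details from the study.

```python
def proportional_control(current_temp: float, setpoint: float, gain: float = 0.5) -> float:
    """Linear (proportional) control: the output is proportional to the error."""
    error = setpoint - current_temp
    # Positive output means "heat", negative means "cool".
    return gain * error

# Example: room at 18 degrees, target 22 degrees -> apply 2.0 units of heating.
print(proportional_control(18.0, 22.0))
```

A full thermostat or cruise control would typically add integral and derivative terms (a PID controller), but this proportionality to the error is the "simple rule" the article refers to.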

However, these controllers are too simple to manage systems that exhibit complex behavior, such as chaos.
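To see why, note that chaotic systems amplify tiny differences in state exponentially fast, so a fixed rule tuned for one trajectory quickly becomes wrong. A standard textbook illustration of this sensitivity (not the circuit from the study) is the logistic map:

```python
def logistic_map(x: float, r: float = 4.0) -> float:
    """One step of the logistic map; at r = 4 its dynamics are chaotic."""
    return r * x * (1.0 - x)

# Two starting states that differ by only one part in a million.
a, b = 0.400000, 0.400001
max_sep = 0.0
for _ in range(50):
    a, b = logistic_map(a), logistic_map(b)
    max_sep = max(max_sep, abs(a - b))

# The tiny initial difference grows to an order-one separation within a few dozen steps.
print(max_sep)
```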

As a result, advanced devices such as self-driving cars and aircraft often rely on machine learning-based controllers, which use complex networks to learn the control algorithms needed for optimal operation. However, these algorithms have significant drawbacks, the most serious being that they can be very difficult and computationally expensive to implement.

The impact of an efficient digital twin

Having access to efficient digital twins is likely to have a major impact on how scientists develop future autonomous technologies, said lead author Robert Kent, a graduate student in physics at The Ohio State University.

“The problem with most machine learning-based controllers is that they consume a lot of energy or power and take a long time to evaluate,” Kent says. “It was also difficult to develop traditional controllers because chaotic systems are very sensitive to small changes.”

These issues are important in situations where milliseconds can mean the difference between life and death, such as when a self-driving car must decide to brake to prevent an accident, he said.

The research was recently published in Nature Communications.

Advances in machine learning architectures

The team's digital twin is compact enough to fit on an inexpensive computer chip that can be balanced on a fingertip and run without an internet connection. It was built to optimize the controller's efficiency and performance, which the researchers found also reduces power consumption. The twin was trained primarily with a machine learning approach called reservoir computing, which makes this straightforward to accomplish.

“The great thing about the machine learning architecture we used is that it's very good at learning the behavior of a system as it evolves over time,” Kent says. “It was inspired by how connections are created in the human brain.”
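In reservoir computing, the recurrent network (the "reservoir") is fixed and random, and only a linear readout layer is trained, which is why training is so cheap compared with deep networks. The sketch below is a generic echo state network in NumPy, offered as an illustration of the technique rather than the team's actual implementation; the sizes, scalings, and sine-wave task are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 100, 1

# Fixed random weights: only the readout W_out below is ever trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()  # keep spectral radius below 1

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Train the linear readout by ridge regression to predict the next input value.
u = np.sin(np.linspace(0.0, 20.0, 500))
S = run_reservoir(u[:-1])          # reservoir states for each input
target = u[1:]                     # one-step-ahead targets
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ target)

pred = S @ W_out                   # one-step-ahead predictions of the sine wave
```

Because only `W_out` is learned, training reduces to a single linear solve rather than iterative gradient descent, which is the efficiency property the article highlights.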

Practical application and future direction

Computer chips of a similar size are already used in devices such as smart refrigerators, the researchers note, but this new computing power makes the model especially well equipped to handle dynamic systems such as self-driving cars and heart monitors, which must adapt quickly to a patient's heart rate.

“Large machine learning models have to consume a lot of power to process data and derive the right parameters, but our model and training are so simple that we can learn the system on the fly,” he said.

To test this, the researchers asked the model to complete a complex control task and compared its results with those of previous control techniques. The study revealed that their approach achieved higher accuracy on the task than its linear counterpart, and had significantly lower computational complexity than previous machine learning-based controllers.

“In some cases, the improvement in accuracy was quite noticeable,” Kent says. Results showed that although their algorithm requires more energy to operate than a linear controller, this tradeoff makes it significantly more efficient than currently available machine learning-based controllers.

“Because of how efficient it is, people will put it to good use,” Kent says. “It can be implemented on almost any platform, and it is very easy to understand.” The algorithm was recently made available to scientists.

Economic and environmental considerations

Beyond stimulating potential advances in engineering, there are equally important economic and environmental incentives to create more energy-efficient algorithms, Kent said.

As society becomes more reliant on computers and AI in nearly every aspect of daily life, demand for data centers is skyrocketing, and many experts are concerned about the enormous power demands of digital systems and what industry will need to do to keep up.

And because the construction of these data centers, along with large-scale computing experiments, can generate large carbon footprints, scientists are looking for ways to curb carbon emissions from this technology.

To build on these results, future research will likely focus on training the model for other applications, such as quantum information processing, Kent said. In the meantime, he expects these new insights to spread widely through the scientific community.

“Not enough people in industry and engineering know about these kinds of algorithms, and one of the big goals of this project is to get more people to know about them,” Kent said. “This work is a great first step toward reaching that potential.”

Reference: “Controlling Chaos with Edge Computing Hardware” by Robert M. Kent, Wendson A. S. Barbosa, and Daniel J. Gauthier, 8 May 2024, Nature Communications.
DOI: 10.1038/s41467-024-48133-3

This research was supported by the U.S. Air Force Office of Scientific Research. Other co-authors from Ohio State University include Wendson AS Barbosa and Daniel J. Gauthier.




