In the vast skies once ruled by birds, new aviators are taking flight. These air pioneers are not creatures, but drones, the products of deliberate innovation. But these aren’t your typical flying bots humming like mechanical bees. Rather, they are soaring bird-inspired wonders, guided by liquid neural networks to navigate ever-changing unseen environments with precision and ease.
Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), inspired by the adaptability of the biological brain, have introduced a method that enables a robust flight-navigation agent to master vision-based fly-to-target tasks in complex, unfamiliar environments. Liquid neural networks, which can continuously adapt to new data inputs, showed an ability to make reliable decisions in unknown domains such as forests, cityscapes, and environments with added noise, rotation, and occlusion. These adaptable models outperformed many state-of-the-art counterparts in navigation tasks and could enable real-world drone applications such as search and rescue, delivery, and wildlife monitoring.
The researchers’ recent work, published today in Science Robotics, details how this new class of agents can adapt to significant distribution shifts, a long-standing challenge in the field. The team’s new class of machine-learning algorithms captures the causal structure of tasks from high-dimensional, unstructured data, such as pixel inputs from a drone-mounted camera. These networks can extract the crucial aspects of a task (that is, understand the task at hand) and ignore irrelevant features, allowing acquired navigation skills to transfer seamlessly to new environments.
Drones use liquid neural networks to navigate unseen environments.
“We are excited about the immense potential of our learning-based approach to robot control, which lays the groundwork for the future,” says Daniela Rus, director of CSAIL and the Andrew (1956) and Erna Viterbi Professor of Electrical Engineering and Computer Science at MIT. “In our experiments, we can effectively train a drone to locate objects in a forest during the summer and then deploy the model in winter, in vastly different surroundings, or even in an urban setting, with varied tasks such as seek and follow. This adaptability is made possible by the causal underpinnings of our solution. These flexible algorithms could help with decision-making based on data streams that change over time, such as medical diagnosis and autonomous driving applications.”
Difficult challenges were at the forefront. When flying a drone toward an unlabeled object, does the machine-learning system actually understand the task implied by the data? And can the learned skills and tasks transfer to a new environment where the scenery changes drastically, such as flying from a forest to an urban landscape? Moreover, unlike the remarkable abilities of our biological brains, deep-learning systems struggle to capture causality, frequently overfit their training data, and fail to adapt to new environments or changing conditions. This is especially troubling for embedded systems with limited resources, such as aerial drones, which need to traverse varied environments and react to obstacles instantaneously.
In contrast, liquid networks offer promising preliminary indications of their capacity to address this critical weakness of deep-learning systems. The team’s system was first trained on data collected by a human pilot, to see how the acquired navigation skills would transfer to new environments under drastic changes in scenery and conditions. Unlike traditional neural networks, which only learn during the training phase, the parameters of a liquid neural network can change over time, making it not only interpretable but also more resilient to unexpected or noisy data.
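The core idea behind that time-varying behavior is the liquid time-constant (LTC) neuron, whose decay rate is coupled to the current input, so the dynamics themselves adapt as the data stream changes. As an illustrative sketch only — the function, shapes, and gating form below are simplified assumptions for exposition, not the authors’ implementation — a forward-Euler step of one such layer might look like:

```python
import numpy as np

def ltc_step(x, I, dt, W, A, tau, b):
    """One Euler step of a simplified liquid time-constant cell.

    The effective time constant depends on the input via a learned
    gate f, so the state dynamics adapt to the incoming data:
        dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A
    """
    f = np.tanh(W @ np.concatenate([x, I]) + b)  # input-dependent gate
    dxdt = -(1.0 / tau + f) * x + f * A          # state-dependent time constant
    return x + dt * dxdt

# Toy usage: 4 hidden neurons driven by a 2-dimensional input stream.
rng = np.random.default_rng(0)
n, m = 4, 2
W = rng.normal(scale=0.5, size=(n, n + m))  # hypothetical untrained weights
A = rng.normal(size=n)                      # per-neuron equilibrium targets
tau = np.ones(n)                            # base time constants
b = np.zeros(n)

x = np.zeros(n)
for t in range(100):
    I = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])  # changing input
    x = ltc_step(x, I, dt=0.05, W=W, A=A, tau=tau, b=b)
```

Because `tanh` keeps the gate in (-1, 1) and `tau` is positive, the decay term stays positive and the state remains bounded; in a trained network the weights `W`, `A`, and `b` would be learned from flight data rather than drawn at random.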
In a series of quadrotor closed-loop control experiments, the drones underwent range tests, stress tests, target rotation and occlusion, hiking with adversaries, triangular loops between objects, and dynamic target tracking. They tracked moving targets and executed multi-step loops between objects in never-before-seen environments, surpassing other state-of-the-art counterparts.
The team believes that the ability to learn from limited expert data, understand a given task, and generalize to new environments could make autonomous drone deployment more efficient, cost-effective, and reliable. Liquid neural networks, they noted, could enable autonomous aerial drones for environmental monitoring, package delivery, self-driving vehicles, and robotic assistants.
“For the more complex reasoning challenges of AI systems in autonomous navigation applications, there is still plenty of room for future research and development, and these systems need to be tested before they can be safely deployed in society,” says Ramin Hasani, an MIT CSAIL research affiliate.
“Robust learning and performance in out-of-distribution tasks and scenarios are some of the key problems that machine learning and autonomous robotic systems have to conquer to make further inroads in society-critical applications,” says Alessio Lomuscio, a professor in the Department of Computing at Imperial College London. “In this context, the performance of liquid neural networks, a novel brain-inspired paradigm developed by the authors at MIT, reported in this study is remarkable. If confirmed in further experiments, the paradigm developed here will contribute to making AI and robotic systems more reliable, robust, and efficient.”
Clearly, the sky is no longer the limit, but a vast playground for endless possibilities for these flying wonders.
Hasani and PhD student Makram Chahine; Patrick Kao ’22, MEng ’22; and PhD student Aaron Ray SM ’21 wrote the paper with Ryan Shubert ’20, MEng ’22; MIT postdocs Matthias Lechner and Alexander Amini; and Rus.
This work was supported in part by Schmidt Futures, the U.S. Air Force Research Laboratory, the U.S. Air Force Artificial Intelligence Accelerator, and Boeing.
