Neuromorphic AI powers efficient autonomous drone flight

AI News


Summary: Researchers have developed an autonomous drone that uses neuromorphic vision and control inspired by animal brains. This method significantly increases data-processing speed and energy efficiency compared with traditional GPUs.

This study highlights the potential for small, agile drones to be used in a variety of applications. The neuromorphic approach allows drones to process data up to 64 times faster while consuming three times less energy.

Key facts:

  1. Efficient processing: Neuromorphic AI processes data up to 64x faster than GPUs.
  2. Energy saving: The system consumes one third of the energy of traditional methods.
  3. Real-world applications: Possible uses include crop monitoring and warehouse management.

Source: Delft University of Technology

A research team at Delft University of Technology has developed an autonomous drone using neuromorphic imaging and control based on how the animal brain works.

Animal brains use less data and energy compared to current deep neural networks running on GPUs (graphics chips). Neuromorphic processors are therefore very suitable for small drones, as they do not require heavy and bulky hardware or batteries.

The results are astonishing: During flight, the drone's deep neural network processes data up to 64 times faster than when running on GPUs, consuming three times less energy. Further developments in this technology could allow drones to make the leap to becoming as small, agile, and smart as flying insects and birds.

The image shows the drone. Credit: Neuroscience News

The results of this study were recently published in Science Robotics.

Learning from animal brains: spiking neural networks

Artificial intelligence has great potential to provide autonomous robots with the intelligence needed for real-world applications. However, current AI relies on deep neural networks, which require significant computing power.

The processors built to run deep neural networks (graphics processing units, GPUs) consume significant amounts of energy. This is a problem, especially for small robots like flying drones, which can only carry very limited resources in terms of sensing and computing.

Animal brains process information very differently than neural networks running on GPUs.

Biological neurons process information asynchronously and communicate primarily through electrical pulses called spikes. Sending these spikes costs energy, so the brain minimizes spiking and processes information sparsely.

Inspired by these properties of animal brains, scientists and technology companies are developing new neuromorphic processors.

These new processors will enable the execution of spiking neural networks, which are expected to be faster and more energy efficient.

“The computations performed by a spiking neural network are much simpler than those of a standard deep neural network,” says Jesse Hagenaars, a doctoral candidate and one of the article's authors. “Whereas a standard deep neural network multiplies and adds floating-point numbers, a spiking neural network mostly just adds.

“This allows spiking neural networks to be faster and more energy efficient. To understand why, think of how much easier it is for humans to compute 5 + 8 than 6.25 x 3.45 + 4.05 x 3.45.”
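The contrast described in the quote above can be sketched with a toy leaky integrate-and-fire (LIF) neuron, the basic unit of spiking networks. This is only an illustrative sketch, not the network from the study; the leak and threshold values are arbitrary:

```python
# Toy leaky integrate-and-fire (LIF) neuron: incoming spikes trigger
# additions only, unlike the multiply-accumulate of a dense layer.

def lif_step(potential, spikes, weights, leak=0.9, threshold=1.0):
    """One time step of a LIF neuron.

    potential: current membrane potential (float)
    spikes:    list of 0/1 inputs for this step
    weights:   per-input synaptic weights
    """
    # Only add weights for inputs that actually spiked (sparse, add-only).
    potential = leak * potential + sum(w for s, w in zip(spikes, weights) if s)
    if potential >= threshold:
        return 0.0, 1   # reset the potential and emit an output spike
    return potential, 0

# Two steps of input gradually drive the neuron over its threshold.
v, out1 = lif_step(0.0, [1, 0, 1], [0.4, 0.9, 0.3])   # below threshold
v, out2 = lif_step(v, [1, 1, 0], [0.4, 0.9, 0.3])     # crosses threshold
```

Because a quiet input contributes nothing, computation scales with the number of spikes rather than the number of connections.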

This energy efficiency is further improved when a neuromorphic processor is used in conjunction with a neuromorphic sensor, such as a neuromorphic camera. Such cameras do not produce images at regular time intervals. Instead, each pixel only sends a signal when it gets brighter or darker.

The advantages of such cameras are that they can recognize movement more quickly, are more energy efficient, and work well in both dark and bright environments.

Furthermore, the signal from the neuromorphic camera can be directly input into a spiking neural network running on the neuromorphic processor. Together, they can form a huge enabler for autonomous robots, especially small and agile robots like flying drones.
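The per-pixel behavior of such a camera can be illustrated with a toy event generator. The contrast threshold and frame format below are arbitrary choices for illustration, not the specification of any real event camera:

```python
# Toy event generation: a pixel emits an event only when its brightness
# changes by more than a contrast threshold, as in an event camera.

def frames_to_events(frames, threshold=0.2):
    """frames: list of 2D lists of brightness values in [0, 1].

    Returns events as (time step, x, y, polarity) tuples, where
    polarity is +1 for brightening and -1 for darkening.
    """
    events = []
    ref = [row[:] for row in frames[0]]          # per-pixel reference level
    for t, frame in enumerate(frames[1:], start=1):
        for y, row in enumerate(frame):
            for x, val in enumerate(row):
                diff = val - ref[y][x]
                if abs(diff) >= threshold:
                    events.append((t, x, y, 1 if diff > 0 else -1))
                    ref[y][x] = val              # update the reference
    return events

# A static pixel produces no events; only the changing pixel fires.
frames = [[[0.5, 0.5]], [[0.5, 0.9]], [[0.5, 0.9]]]
print(frames_to_events(frames))  # [(1, 1, 0, 1)]
```

Static parts of the scene generate no data at all, which is where the speed and energy advantages come from.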

First neuromorphic vision and control of flying drones

In an article published in Science Robotics on May 15, 2024, researchers at Delft University of Technology in the Netherlands demonstrated for the first time a drone that uses neuromorphic vision and control for autonomous flight.

Specifically, they developed a spiking neural network that processes signals from a neuromorphic camera and outputs control commands that determine the drone's attitude and thrust.

They deployed this network on Intel's Loihi neuromorphic research chip, a neuromorphic processor carried on board the drone. Thanks to the network, the drone can perceive and control its own motion in all directions.

“We faced many challenges,” says Federico Paredes-Vallés, one of the researchers involved in the study, “but the hardest was imagining how we could train a spiking neural network so that training would be fast enough and the trained network would function well on the real robot. In the end, we designed a network consisting of two modules.

“The first module learns to visually recognize motion from the signals of a moving neuromorphic camera. This is similar to how animals learn to perceive the world on their own.

“The second module learns to map the estimated motion to control commands in a simulator. This learning relied on artificial evolution in simulation, in which networks that were better at controlling the drone were more likely to produce offspring.

“Over generations of artificial evolution, the spiking neural networks became better and better at control, eventually being able to fly in any direction at different speeds.

“We trained both modules and developed a method that allowed them to be merged. We were pleased to see that the combined network worked well on real robots right away.”
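The evolutionary training described above can be sketched in miniature. The sketch below evolves plain weight vectors against a stand-in fitness function; the actual study evolved spiking controllers inside a drone simulator, so everything here is a simplified assumption:

```python
import random

# Minimal evolutionary loop in the spirit of the control-module training:
# candidates that score better are more likely to produce mutated offspring.
# The fitness function is a toy stand-in, not the paper's drone simulator.

def fitness(weights):
    # Toy objective: prefer weights close to a hidden target controller.
    target = [0.5, -0.2, 0.1]
    return -sum((w - t) ** 2 for w, t in zip(weights, target))

def evolve(pop_size=20, generations=50, sigma=0.1, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 4]           # truncation selection
        pop = [
            [w + rng.gauss(0, sigma) for w in rng.choice(parents)]
            for _ in range(pop_size)
        ]
    return max(pop, key=fitness)

best = evolve()  # converges toward the hidden target controller
```

No gradients are needed, which is one reason evolution is attractive for spiking networks, whose discrete spikes make backpropagation awkward.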

Equipped with neuromorphic vision and control, the drone can fly at different speeds under varying light conditions, from dark to bright. It can even fly under flickering lights, which cause the neuromorphic camera's pixels to send large numbers of signals to the network that are unrelated to motion.

Improving energy efficiency and speed with neuromorphic AI

“Importantly, our measurements confirm the potential of neuromorphic AI. On the neuromorphic chip, the network runs on average between 274 and 1,600 times per second. Running the same network on a small embedded GPU averages only 25 times per second, a difference of a factor of 10 to 64.

“Intel's Loihi neuromorphic research chip consumes 1.007 watts, of which 1 watt is idle power drawn whenever the chip is powered on. Running the network itself costs only 7 milliwatts. By comparison, the embedded GPU running the same network consumes 3 watts, of which 1 watt is idle power and 2 watts are spent running the network.
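Combining the quoted figures gives a rough sense of the energy gap per network evaluation (network-only power divided by execution frequency, taking the lower bound of the neuromorphic frequency range):

```python
# Back-of-the-envelope energy per network evaluation from the figures
# quoted above. These are derived numbers, not measurements from the paper.

loihi_power_w, loihi_hz = 0.007, 274     # 7 mW network cost, lower-bound rate
gpu_power_w, gpu_hz = 2.0, 25            # 2 W network cost at 25 Hz

loihi_energy_uj = loihi_power_w / loihi_hz * 1e6   # microjoules per run
gpu_energy_uj = gpu_power_w / gpu_hz * 1e6

print(f"Loihi: {loihi_energy_uj:.1f} uJ/run, GPU: {gpu_energy_uj:.1f} uJ/run")
```

By this rough estimate, each network evaluation on the neuromorphic chip costs on the order of a thousandth of the energy of the same evaluation on the embedded GPU.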

“The neuromorphic approach allows AI to run faster and more efficiently, enabling deployment on much smaller autonomous robots,” says Stein Stroobants, a PhD candidate working on neuromorphic drones.

Future applications of neuromorphic AI for small robots

“Neuromorphic AI will allow all autonomous robots to become more intelligent,” says Guido de Croon, professor of bio-inspired drones.

“At Delft University of Technology's Department of Aerospace Engineering, we are working on the development of small autonomous drones that can be used for a variety of applications, from monitoring crops in greenhouses to tracking inventory in warehouses.

“The advantage of small drones is that they are very safe and can navigate in tight environments, such as between tomato plants. Moreover, they are very cheap, so they can be deployed in swarms.

“As we have shown in exploration and gas source locating settings, this helps cover areas more quickly.”

“The current work is a great step in this direction. However, realizing these applications will depend on whether neuromorphic hardware can be scaled down further and its capabilities extended to more complex tasks, such as navigation.”

About this AI/Robotics research news

Author: Mark Kuhl
Source: Delft University of Technology
Contact: Mark Kuhl – Delft University of Technology
Image: The image is credited to Neuroscience News

Original research: Closed access.
“Fully neuromorphic vision and control for autonomous drone flight” by Guido de Croon et al. Science Robotics


Abstract

Fully neuromorphic vision and control for autonomous drone flight

Biological sensing and processing are asynchronous and sparse, resulting in low-latency and energy-efficient perception and action.

In robotics, neuromorphic hardware for event-based vision and spiking neural networks is expected to exhibit similar properties.

However, the network size limitations of current embedded neuromorphic processors and the difficulty of training spiking neural networks have limited robotic implementations to basic tasks with low-dimensional sensory inputs and motor actions.

Here we present a fully neuromorphic vision-to-control pipeline for controlling flying drones. Specifically, we trained a spiking neural network that takes raw event-based camera data and outputs low-level control actions to perform autonomous vision-based flight.

The visual part of the network consists of five layers and 28,800 neurons that map incoming raw events to ego motion estimates and was trained with self-supervised learning on real event data.

The control part consists of a single decoding layer and was trained with an evolutionary algorithm in a drone simulator. Robotic experiments demonstrate successful simulation-to-realistic transfer of a fully trained neuromorphic pipeline.

The drone could accurately control its ego-motion, allowing it to hover, land, and maneuver sideways, even while yawing at the same time.

The neuromorphic pipeline runs on Intel's Loihi neuromorphic processor at an execution frequency of 200 hertz, consuming 0.94 watts of idle power and only an additional 7 to 12 milliwatts while running the network.

These results demonstrate the potential of neuromorphic sensing and processing to realize insect-sized intelligent robots.


