Army-funded research develops AI that uses bat-like echolocation to help drones “see” in darkness.

US Army-backed research has produced a synthetic echolocation system that blends biology-inspired engineering with artificial intelligence (AI). The new sensor system allows a machine to “see” in complete darkness without a camera, radar, or lidar.

By taking cues from bats and dolphins, this cutting-edge technology could change how military drones, autonomous vehicles, and robots navigate and identify objects in challenging environments where traditional sensors fail.

“Based on inspiration from the biological phenomenon of echolocation, ultrasound perception has great potential across engineering domains ranging from advanced imaging to accurate navigation,” the researchers wrote. “Despite advances in sensor development and signal processing, current methodologies struggle to match the remarkable perceptual abilities of echolocating animals when deciphering real ultrasound echoes.”

The new system, developed at the University of Michigan with funding from the Army Research Office and the DEVCOM Ground Vehicle Systems Center, employs a neural network trained entirely on simulated data.

This approach allows the AI to classify objects based solely on how they scatter ultrasound pulses, much as bats identify prey in the dark. The study, published in the December 2025 edition of the Journal of Sound and Vibration, marks a breakthrough in artificial perception and opens the door to robust navigation and detection in visually degraded conditions such as fog, smoke, or cluttered battlefields.

Echolocation, a biological sonar used by animals such as bats and whales, has long fascinated scientists. These creatures emit high-frequency pulses and analyze the returning echoes to build a detailed mental map of their surroundings. For decades, engineers have tried to replicate this remarkable ability in machines. Yet despite advances in sensors and signal processing, no artificial system has matched the perceptual accuracy of nature’s sonar masters.

The research team tackled one of the hardest challenges in artificial echolocation: teaching machines to understand and classify real-world echoes without requiring large amounts of experimental data.

Instead of relying on real-world recordings to train neural networks, the team used sophisticated numerical simulations. They created virtual echoes by modeling how objects of various shapes (cubes, spheres, cylinders) scatter ultrasound waves in a digital 3D environment.
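To make the idea concrete, here is a minimal sketch of one way such virtual echoes could be generated: approximating an object as a cloud of point scatterers and summing delayed, attenuated copies of the emitted pulse. The parameter values and the crude scattering model are illustrative assumptions, not the authors’ actual simulation code, which the paper describes only as sophisticated numerical simulation.

```python
# Illustrative sketch only: approximate an object as point scatterers and
# build an echo as the sum of delayed, attenuated copies of the emitted pulse.
# All parameter values and the scattering model are assumptions.
import numpy as np

FS = 1_000_000        # sample rate, Hz (assumed)
C = 343.0             # speed of sound in air, m/s
F0 = 40_000           # 40 kHz carrier, a common airborne-ultrasound frequency

def emitted_pulse(duration=2e-4):
    """Short Gaussian-windowed tone burst standing in for the sonar ping."""
    t = np.arange(0, duration, 1 / FS)
    window = np.exp(-0.5 * ((t - duration / 2) / (duration / 8)) ** 2)
    return window * np.sin(2 * np.pi * F0 * t)

def sphere_scatterers(center, radius, n=200, seed=0):
    """Sample point scatterers on the sensor-facing hemisphere of a sphere."""
    rng = np.random.default_rng(seed)
    dirs = rng.normal(size=(n, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    dirs[:, 2] = -np.abs(dirs[:, 2])      # keep the side facing the sensor at z=0
    return center + radius * dirs

def simulate_echo(scatterers, sensor=np.zeros(3), n_samples=4096):
    """Superpose round-trip-delayed pulse copies with spherical spreading loss."""
    pulse = emitted_pulse()
    echo = np.zeros(n_samples)
    for p in scatterers:
        r = np.linalg.norm(p - sensor)        # one-way distance, m
        delay = int(round(2 * r / C * FS))    # round-trip delay in samples
        gain = 1.0 / max(r, 1e-3) ** 2        # spherical spreading loss
        if delay < n_samples:
            end = min(delay + pulse.size, n_samples)
            echo[delay:end] += gain * pulse[: end - delay]
    return echo

# A 10 cm sphere half a metre in front of the sensor:
echo = simulate_echo(sphere_scatterers(center=np.array([0.0, 0.0, 0.5]), radius=0.05))
```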

These simulated echoes were then fed into an ensemble of convolutional neural networks (CNNs), with each network fine-tuned to detect one particular shape. The researchers further enhanced the synthetic data by adding realistic distortions: variations in amplitude, phase shifts, and the kind of background noise that occurs under real conditions. This prevents the AI models from being thrown off by the messy, unpredictable nature of actual acoustic environments.
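The augmentation step might look something like the following sketch, again an assumption about implementation details rather than the published code: a random gain, a small timing jitter standing in for phase shifts, and additive noise at a chosen signal-to-noise ratio.

```python
# Assumed illustration of the described augmentation: amplitude variation,
# timing/phase jitter, and additive background noise on a clean simulated echo.
import numpy as np

def augment_echo(echo, max_shift=50, snr_db=20.0, seed=None):
    rng = np.random.default_rng(seed)
    out = echo * rng.uniform(0.7, 1.3)                                # amplitude variation
    out = np.roll(out, int(rng.integers(-max_shift, max_shift + 1)))  # timing jitter
    signal_power = np.mean(out ** 2)
    noise_power = signal_power / 10 ** (snr_db / 10)                  # set SNR in dB
    out = out + rng.normal(scale=np.sqrt(noise_power), size=out.shape)  # background noise
    return out

noisy_echo = augment_echo(echo)   # 'echo' from the simulation sketch above
```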

When tested against physical objects in laboratory experiments, the AI used echolocation to correctly classify shapes with impressive accuracy, even when the objects produced echoes that would appear nearly identical to a human observer or a conventional computer.

For example, the system consistently distinguished between spheres and cylinders, even though their similarly curved surfaces produce overlapping acoustic patterns.

The research shows that the technique is resilient to changes in object orientation and distance, as well as to minor manufacturing imperfections. The neural networks performed consistently well in tests where objects were rotated or offset, highlighting their potential for real-world deployment.

This development is particularly promising for defense applications because it relies on sound rather than light or electromagnetic waves. Camera and lidar systems are vulnerable to visual obstructions such as darkness, smoke, and dust, conditions common on battlefields and in disaster zones.

In contrast, ultrasound waves penetrate these barriers, providing a reliable means of perception when other sensors go blind. The Army’s interest in the study points to potential applications in autonomous ground vehicles, aerial drones, and even underwater systems, where GPS and optical sensors are unreliable or ineffective.

The neural network ensemble also offers a modular, scalable solution. The architecture allows new shapes or object types to be added by training additional specialized networks, without overhauling the entire system.
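As a rough sketch of how such a modular ensemble could be organized (the architecture and names here are our assumptions, not the published model), each shape gets its own small 1D CNN that scores how well an echo matches that shape, and classification simply picks the most confident detector:

```python
# Hedged sketch of the modular-ensemble idea: one small 1D CNN per shape.
# Adding a new object class means training one more detector; the existing
# detectors are untouched. Architecture details are illustrative assumptions.
import torch
import torch.nn as nn

class ShapeDetector(nn.Module):
    """Binary detector: raw echo waveform in, one confidence score out."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=64, stride=4), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=16, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, 1),
        )

    def forward(self, x):                        # x: (batch, n_samples)
        return torch.sigmoid(self.net(x.unsqueeze(1))).squeeze(-1)

# The ensemble is just a dictionary of independent detectors.
ensemble = {shape: ShapeDetector() for shape in ("sphere", "cylinder", "cube")}
ensemble["cone"] = ShapeDetector()               # new class = one new detector

def classify(echo_batch, ensemble):
    """Pick the shape whose detector is most confident."""
    names = list(ensemble)
    scores = torch.stack([ensemble[s](echo_batch) for s in names], dim=1)
    return [names[i] for i in scores.argmax(dim=1)]
```

In this organization, adding “cone” to the dictionary requires training only one new network; the sphere, cylinder, and cube detectors are left untouched, which is the modularity the researchers describe.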

This flexibility mirrors the way animals such as bats gradually learn to recognize new types of prey and obstacles in their environment. Such adaptability could prove essential for autonomous military systems expected to operate in dynamic, unpredictable settings.


The significance of this breakthrough goes beyond military use. The researchers envision a wide range of applications: medical imaging, search and rescue operations, industrial inspections, underwater exploration, and more. Essentially, this technology could be a game changer wherever a machine needs to sense its surroundings without relying on vision.

In particular, the system’s ability to work well using only synthetic training data could dramatically reduce the development time and cost of future ultrasound-based technologies.

However, challenges remain. The system struggled to classify objects with lower symmetry, such as cubes facing the source head-on, when their echoes closely resembled those of other shapes.

The researchers note that future models could benefit from training on more diverse object orientations and from additional data augmentation to simulate extreme conditions. They also suggest that, like echolocating animals, machines equipped with this technology could emit pulses from multiple angles and synthesize the results to improve accuracy.
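Fusing pulses from multiple angles could be as simple as averaging each detector’s confidence across views before choosing a shape, as in this sketch built on the hypothetical ensemble above (our assumed implementation, not the researchers’ proposal in detail):

```python
# Assumed sketch of multi-angle fusion: average each shape detector's
# confidence over echoes captured from several pulse angles, then classify.
import torch

def classify_multi_view(echoes_by_view, ensemble):
    """echoes_by_view: list of (batch, n_samples) tensors, one per pulse angle."""
    names = list(ensemble)
    per_view = [
        torch.stack([ensemble[s](e) for s in names], dim=1)   # (batch, n_shapes)
        for e in echoes_by_view
    ]
    fused = torch.stack(per_view).mean(dim=0)                 # average over angles
    return [names[i] for i in fused.argmax(dim=1)]
```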

Finally, the study offers the latest example of bioinspired engineering aimed at fielding more intelligent and adaptive autonomous systems.

As the US military continues to explore new technologies for next-generation warfare and logistics, innovations like artificial echolocation could provide a strategic edge where traditional sensors fall short.

“Overall, our framework provides a platform for advancing ultrasound perception by drawing inspiration from strategies employed by echolocating animals,” the researchers concluded. “By aligning artificial perception models with principles observed in biological systems, this work contributes to narrowing the gap between engineered and biological perception.”

Tim McMillan is a retired law enforcement executive, investigative reporter, and co-founder of The Debrief. His writing typically focuses on topics related to defense, national security, the intelligence community, and psychology. You can follow Tim on Twitter: @lttimmcmillan. Tim can be contacted by email: tim@thedebrief.org or encrypted email: lttimmcmillan@protonmail.com




