Bat-Inspired AI Gives Drones Ultrasonic Sight
Researchers supported by the U.S. Army have developed a synthetic echolocation system that blends biology-inspired engineering with artificial intelligence (AI). The new sensor system allows machines to “see” in complete darkness without cameras, radar, or LIDAR.
By taking cues from bats and dolphins, this state-of-the-art technology has the potential to transform how military drones, autonomous vehicles, and robots navigate and identify objects in challenging environments where traditional sensors falter.
“Drawing inspiration from the biological phenomenon of echolocation, ultrasound perception holds immense potential across various engineering domains, spanning from advanced imaging to precise navigation,” researchers write. “Despite advances in sensor development and signal processing, current methodologies struggle to match the remarkable perceptual acuity of echolocating animals when deciphering real-world ultrasound echoes.”
The new system, developed at the University of Michigan with funding from the Army Research Office and the DEVCOM Ground Vehicle Systems Center, employs neural networks trained entirely on simulated data.
This approach allows the AI to classify objects based solely on how they scatter ultrasonic pulses, much like how a bat identifies prey in the dead of night. The study, accepted for publication in the December 2025 edition of the Journal of Sound and Vibration, marks a breakthrough in artificial perception, opening doors for robust navigation and detection in low-visibility conditions such as fog, smoke, or cluttered battlefields.
Echolocation, the biological sonar used by animals like bats and whales, has long fascinated scientists. These creatures emit high-frequency pulses and analyze the returning echoes to build detailed mental maps of their surroundings. For decades, engineers have tried to replicate this remarkable ability in machines. But despite advances in sensors and signal processing, no artificial system has come close to matching the perceptual precision of nature’s sonar masters—until now.
The research team tackled one of the thorniest challenges in artificial echolocation: teaching machines to understand and classify real-world echoes without requiring vast amounts of experimental data, which are often expensive, time-consuming, and logistically challenging to collect.
Instead of relying on real-world recordings to train their neural networks, the team used sophisticated numerical simulations. They created virtual echoes by modeling how ultrasonic waves scatter off objects of various shapes—cubes, spheres, and cylinders—in a digital 3D environment.
These simulated echoes were then fed into an ensemble of convolutional neural networks (CNNs), with each network fine-tuned to detect one specific shape. The researchers further enhanced the synthetic data by adding realistic distortions: variations in amplitude, phase shifts, and background noise that occur in real-world conditions. This ensured that the AI models would not be thrown off by the messy, unpredictable nature of actual acoustic environments.
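To make that pipeline concrete, the sketch below (in Python, using NumPy and PyTorch) shows what such a setup could look like: a simulated echo is perturbed with amplitude, phase, and noise distortions, then scored by a small ensemble of one-detector-per-shape convolutional networks. Everything here is illustrative, with assumed signal lengths, augmentation ranges, and network sizes; it is not the authors’ actual code or architecture.

```python
import numpy as np
import torch
import torch.nn as nn

SHAPES = ["cube", "sphere", "cylinder"]   # object classes described in the study
ECHO_LEN = 1024                           # samples per echo (illustrative value)

def augment_echo(echo, rng):
    """Apply the kinds of distortions the article describes: random amplitude
    scaling, a small phase (arrival-time) shift, and additive background noise.
    All ranges here are illustrative guesses, not the authors' values."""
    echo = echo * rng.uniform(0.7, 1.3)                     # amplitude variation
    echo = np.roll(echo, rng.integers(-20, 21))             # crude phase/time shift
    echo = echo + rng.normal(0.0, 0.05, size=echo.shape)    # background noise
    return echo.astype(np.float32)

class ShapeDetector(nn.Module):
    """A small 1-D CNN that scores whether an echo came from one specific shape."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=9, stride=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=9, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):                                   # x: (batch, 1, ECHO_LEN)
        return self.head(self.features(x).squeeze(-1))      # raw logit per echo

# One specialist detector per shape, mirroring the ensemble-of-specialists design.
ensemble = {shape: ShapeDetector() for shape in SHAPES}

def classify(echo_np):
    """Run every detector and pick the shape whose network is most confident."""
    x = torch.from_numpy(echo_np).view(1, 1, -1)
    with torch.no_grad():
        scores = {s: torch.sigmoid(net(x)).item() for s, net in ensemble.items()}
    return max(scores, key=scores.get), scores

# Example: augment one simulated echo and classify it (the weights are untrained
# here, so the label is meaningless; this only illustrates the data flow).
rng = np.random.default_rng(0)
simulated_echo = np.sin(np.linspace(0, 40 * np.pi, ECHO_LEN)) * np.exp(-np.linspace(0, 6, ECHO_LEN))
label, scores = classify(augment_echo(simulated_echo, rng))
print(label, scores)
```

In this kind of design, each detector outputs an independent confidence and the shape whose specialist scores highest wins, which is also what makes the ensemble easy to extend later.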
When tested on physical objects in laboratory experiments, the AI correctly classified shapes with impressive accuracy, even when those objects produced echoes that would sound nearly identical to a human listener or a conventional signal-processing algorithm.
For instance, the system consistently distinguished between spheres and cylinders despite their similarly curved surfaces generating overlapping acoustic patterns.
According to the study, the technology proved robust to changes in object orientation and distance, as well as to minor manufacturing imperfections. The neural networks consistently performed well in tests where objects were rotated or offset, highlighting their potential for real-world deployment.
This development’s reliance on sound rather than light or electromagnetic waves makes it particularly promising for defense applications. Cameras and LIDAR systems are vulnerable to visual obstructions like darkness, smoke, or dust—conditions common on the battlefield or in disaster zones.
By contrast, ultrasonic waves can penetrate these barriers, offering a reliable means of perception when other sensors go blind. The Army’s interest in this research points to potential uses in autonomous ground vehicles, aerial drones, or even underwater systems where GPS and optical sensors are unreliable or ineffective.
Neural networks also offer a modular, scalable solution. The architecture allows new shapes or object types to be added by training additional specialized networks without overhauling the entire system.
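As a hypothetical continuation of the earlier sketch, supporting a new object type could be as simple as instantiating and training one more specialist detector while leaving the existing cube, sphere, and cylinder networks untouched. The new class name, training loop, and hyperparameters below are illustrative assumptions, not details from the study, and the snippet reuses the `ensemble` and `ShapeDetector` definitions from the sketch above.

```python
import torch
import torch.nn as nn

new_shape = "cone"                      # illustrative new class, not from the paper
ensemble[new_shape] = ShapeDetector()   # reuse the same small CNN architecture

optimizer = torch.optim.Adam(ensemble[new_shape].parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

def train_new_detector(echo_batches, label_batches, epochs=10):
    """Train only the new detector on simulated 'new shape vs. everything else'
    echoes. echo_batches yields (batch, 1, ECHO_LEN) tensors; label_batches
    yields matching 0/1 targets. The rest of the ensemble is never touched."""
    for _ in range(epochs):
        for echoes, labels in zip(echo_batches, label_batches):
            optimizer.zero_grad()
            loss = loss_fn(ensemble[new_shape](echoes), labels.float().view(-1, 1))
            loss.backward()
            optimizer.step()
```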
This flexibility mirrors how animals like bats gradually learn to recognize new types of prey or obstacles in their environment. Such adaptability could prove vital for autonomous military systems expected to operate in dynamic and unpredictable settings.

The implications of this breakthrough extend beyond military uses. The researchers envision a wide range of applications, including medical imaging, search and rescue operations, industrial inspection, and underwater exploration. Essentially, anywhere machines need to sense their surroundings without relying on vision, this technology could be a game-changer.
In particular, the method’s ability to perform well using only synthetic training data could dramatically reduce development time and cost for future ultrasound-based technologies.
Yet, challenges remain. The system struggled to classify objects with lower symmetry—such as cubes aligned directly toward the source—when their echoes closely resembled those of other shapes.
The researchers noted that future models could benefit from training on a more diverse set of object orientations and from additional data augmentation to simulate extreme conditions. They also suggest that, like echolocating animals, machines equipped with this technology could emit pulses from multiple angles and synthesize the results to improve accuracy.
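One plausible way to realize that multi-angle idea, again building on the earlier sketch rather than the authors’ published method, is to average each specialist detector’s confidence across echoes recorded from several emission angles before committing to a shape:

```python
import numpy as np
import torch

def classify_multi_angle(echoes_from_angles):
    """Fuse evidence from pulses emitted at several angles by averaging each
    detector's confidence over all echoes; a simple illustrative stand-in for
    the multi-view synthesis the researchers suggest. Uses the `ensemble`
    defined in the earlier sketch."""
    totals = {shape: 0.0 for shape in ensemble}
    for echo_np in echoes_from_angles:
        x = torch.from_numpy(np.asarray(echo_np, dtype=np.float32)).view(1, 1, -1)
        with torch.no_grad():
            for shape, net in ensemble.items():
                totals[shape] += torch.sigmoid(net(x)).item()
    avg = {s: v / len(echoes_from_angles) for s, v in totals.items()}
    return max(avg, key=avg.get), avg
```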
Ultimately, the study is the latest example of bioinspired engineering aimed at fielding more intelligent, adaptable autonomous systems.
As the U.S. Army continues to explore new technologies for next-generation warfare and logistics, innovations like artificial echolocation could provide a strategic edge where traditional sensors fall short.
“Overall, our framework provides a platform for advancing ultrasound perception by drawing inspiration from the strategies employed by echolocating animals,” researchers concluded. “By aligning artificial perception models with principles observed in biological systems, this work contributes to narrowing the gap between engineered and biological perception.”
Source: The Debrief