My colleagues and I have developed an ultrasonic sensing system, inspired by bat echolocation, that helps aerial robots navigate in low-visibility environments.
Robots today rely on cameras, light detection and ranging, known as lidar, or both. Neither sensor works well in conditions that are hard to see through, such as smoke, fog or snow.
My job is to develop bio-inspired robots, so my research team studied bats, which are experts at navigating in poor visibility.
Bats thrive in damp, dark caves, where they use echolocation to detect objects as fine as a human hair. They emit ultrasonic sound waves and then listen for the weak reflections bouncing back from objects.
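The ranging principle behind echolocation can be sketched in a few lines of Python. The numbers here are illustrative, not taken from our system:

```python
# Time-of-flight ranging, the principle behind echolocation:
# emit a pulse, time how long its reflection takes to return,
# and convert that round trip into a distance.

SPEED_OF_SOUND = 343.0  # meters per second in air at about 20 C

def distance_from_echo(round_trip_s: float) -> float:
    """Range to a reflector, given the echo's round-trip delay in seconds."""
    # Halve the total path length, since the pulse travels out and back.
    return SPEED_OF_SOUND * round_trip_s / 2.0

# An echo arriving 10 milliseconds after emission puts the object ~1.7 m away.
print(distance_from_echo(0.010))  # 1.715
```

The hard part on a drone is not this arithmetic but detecting the faint echo at all.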
Replicating this on a drone is hard because the propellers make a great deal of noise, drowning out the faint echoes. It is like trying to listen to a friend while a jet engine takes off right next to you.
We developed two ideas to overcome this problem. First, a physical shield inspired by the cartilage of bats' ears reduces propeller noise around the sensors that act as the robot's ears. Second, an artificial neural network we call Saranga recovers the weak echo signals buried in the noisy measurements by recognizing patterns it has learned over time, an approach inspired by how bats process sound.
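Saranga's internals are beyond the scope of this brief, but the underlying idea of pulling a known pulse shape out of heavy noise can be illustrated with a classical matched filter. Everything below, including the signal parameters, is a simplified illustration rather than our actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100_000                              # 100 kHz sampling rate
n_pulse = 100
t = np.arange(n_pulse) / fs
pulse = np.sin(2 * np.pi * 40_000 * t)    # short 40 kHz ultrasonic ping

# Bury a weak echo, with amplitude equal to the noise level, at sample 150.
echo_at = 150
clean = np.zeros(400)
clean[echo_at:echo_at + n_pulse] = 0.3 * pulse
noisy = clean + 0.3 * rng.standard_normal(400)

# Cross-correlating with the known pulse shape concentrates the echo's
# energy into a sharp peak, lifting it well above the noise floor.
corr = np.correlate(noisy, pulse, mode="valid")
delay = int(np.argmax(np.abs(corr)))
# delay lands at (or within a couple of samples of) 150
```

A fixed filter like this assumes the noise is unstructured; a learned network can additionally exploit the structure of propeller noise.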
Using these sensors, the robot can estimate the locations of obstacles in 3D and navigate safely with milliwatt-level sensing power.
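One standard way to turn echo ranges into a 3D position is multilateration: each receiver's echo delay gives a range, and the ranges are intersected. The sensor layout and obstacle position below are hypothetical, chosen only to make the geometry concrete; our actual sensor geometry and algorithm may differ:

```python
import numpy as np

# Hypothetical positions of four ultrasonic receivers on the drone (meters).
sensors = np.array([[0.0, 0.0, 0.0],
                    [0.1, 0.0, 0.0],
                    [0.0, 0.1, 0.0],
                    [0.0, 0.0, 0.1]])

# Ground-truth obstacle, used only to synthesize the range measurements
# that echo delays would provide (range = speed_of_sound * delay / 2).
obstacle = np.array([0.8, -0.3, 0.5])
ranges = np.linalg.norm(sensors - obstacle, axis=1)

# Each range defines a sphere |x - s_i|^2 = r_i^2. Subtracting the first
# sphere's equation from the others cancels |x|^2, leaving a linear system.
A = 2.0 * (sensors[1:] - sensors[0])
b = (ranges[0] ** 2 - ranges[1:] ** 2
     + np.sum(sensors[1:] ** 2, axis=1) - np.sum(sensors[0] ** 2))
estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
# estimate recovers the obstacle position [0.8, -0.3, 0.5]
```

With more than four receivers the same least-squares formulation averages out range noise rather than solving exactly.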
In simulated snowfall, the drone maneuvers around an obstacle. Nitin Sanket
What it means
Aerial robots are useful in search and rescue operations, particularly in confined and dynamic environments. These operations often take place in areas with poor visibility, like forest fires, collapsed structures, caves or dusty conditions. In such scenarios, sensors like lidar and cameras become less reliable.
Echolocation, not vision alone, is the primary way bats perceive their environment. Ultrasonic sensing does not depend on lighting and works in darkness, smoke and dust.
Our research shows it’s possible to achieve this feat with aerial robots, despite the loud propeller noise.
Sonar enhanced by noise shielding, machine learning and other technologies will enable small, low-cost robotic systems to operate where existing technology fails.
The research could lead to highly autonomous, capable tiny drones for humanitarian purposes such as cave exploration, search and rescue, and combating poaching. AI-enabled navigation using sonar could yield safer, more efficient and more cost-effective robots for urgent operations where human access or helicopters are limited. It is an important step toward deploying swarms of aerial robots to search for survivors and explore dangerous environments.
Thanks to breakthroughs in neural networks, mathematical modeling and sensor characterization, these drones will also be able to take on other low-power applications, such as environmental monitoring. Compared with current solutions, we can cut power 1,000-fold, weight 10-fold and cost 100-fold.
Other research being conducted
Most aerial navigation systems are based on cameras, depth sensors or lidar, all of which degrade in low-visibility conditions. Radar is effective in these conditions but requires too much power for smaller drones. Previous work explored ultrasonic sensors mainly for ground robots; applying them to aerial robots was difficult because of propeller noise.
Next steps
Our team is working to increase our system's flying speed and sensing range while shrinking its size.
We are also exploring further bio-inspired designs and combining ultrasound with other forms of sensing.
Our ultimate goal is low-power aerial robots that operate reliably and efficiently in dynamic environments, enabling real-world deployment for search and rescue.
Research Briefs are concise summaries of interesting academic research.


