2026-03-25 Worcester Polytechnic Institute (WPI)

An aerial robot developed in the lab of Nitin J. Sanket navigates past trees.
<Related information>
- https://www.wpi.edu/news/bats-inspire-advance-aerial-robots
- https://www.science.org/doi/10.1126/scirobotics.adz9609
Milliwatt ultrasound for navigation in visually degraded environments on palm-sized aerial robots
Manoj Velmurugan, Phillip Brush, Colin Balfour, Richard J. Przybyla, and Nitin J. Sanket
Science Robotics, Published: 25 Mar 2026
DOI: https://doi.org/10.1126/scirobotics.adz9609
Abstract
Tiny palm-sized aerial robots have exceptional agility and cost-effectiveness in navigating confined and cluttered environments. However, their limited payload capacity directly constrains the sensing suite onboard the robot, thereby limiting critical navigational tasks in Global Positioning System (GPS)–denied wild scenes. Common methods for obstacle avoidance use cameras and light detection and ranging (LIDAR), which become ineffective under visually degraded conditions such as low visibility, dust, fog, or darkness. Other sensors, such as radio detection and ranging (RADAR), have high power consumption, making them unsuitable for tiny aerial robots. Inspired by bats, we propose Saranga, a low-power, ultrasound-based perception stack that localizes obstacles using a dual sonar array. We present two key solutions to combat the low peak signal-to-noise ratio of −4.9 decibels: physical noise reduction and a deep learning–based denoising method. First, we present a practical way to block propeller-induced ultrasound noise on the weak echoes. The second solution is to train a neural network to use the long horizon of ultrasound echoes for finding signal patterns under high amounts of uncorrelated noise where classical methods were insufficient. We generalized to the real world by using a synthetic data generation pipeline augmented with limited real noise data for training. We enabled a palm-sized aerial robot to navigate under visually degraded conditions of dense fog, darkness, and snow in a cluttered environment with thin and transparent obstacles using only onboard sensing and computation. We provide extensive real-world results to demonstrate the efficacy of our approach.
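The abstract contrasts the authors' learned denoiser against classical echo-detection methods that break down at the reported peak signal-to-noise ratio of −4.9 dB. As a rough illustration of that classical baseline (not the paper's actual code; the sample rate, pulse shape, and 40 kHz carrier are assumptions typical of small ultrasonic transducers), the sketch below estimates obstacle range by matched filtering a noisy echo:

```python
import numpy as np

# Hypothetical illustration of classical sonar ranging: matched filtering
# (cross-correlation with the known transmitted pulse). Parameters are
# assumptions, not values from the paper.

FS = 200_000   # ADC sample rate, Hz (assumed)
C = 343.0      # speed of sound in air, m/s
F0 = 40_000    # transducer carrier frequency, Hz (typical 40 kHz sonar)

def tone_burst(duration=1e-3):
    """Short transmitted pulse (a pure tone burst, for simplicity)."""
    t = np.arange(int(duration * FS)) / FS
    return np.sin(2 * np.pi * F0 * t)

def simulate_echo(range_m, snr_db):
    """Delay the pulse by its round-trip time and bury it in Gaussian noise."""
    pulse = tone_burst()
    delay = int(2 * range_m / C * FS)  # round-trip delay in samples
    echo = np.zeros(delay + len(pulse) + 1000)
    echo[delay:delay + len(pulse)] += pulse
    noise_power = np.mean(pulse**2) / (10 ** (snr_db / 10))
    rng = np.random.default_rng(0)
    echo += rng.normal(0.0, np.sqrt(noise_power), len(echo))
    return echo

def estimate_range(echo):
    """Cross-correlate with the known pulse; the peak lag gives the delay."""
    pulse = tone_burst()
    corr = np.correlate(echo, pulse, mode="valid")
    delay = np.argmax(np.abs(corr))
    return delay * C / (2 * FS)

# At the paper's reported peak SNR of -4.9 dB, the matched filter still
# recovers a single clean echo here; the hard cases the paper targets are
# long-horizon echoes with structured, uncorrelated propeller noise.
echo = simulate_echo(range_m=1.5, snr_db=-4.9)
print(f"estimated range: {estimate_range(echo):.2f} m")
```

This toy setup is much easier than the paper's regime (one isolated echo, white noise), which is why a plain matched filter suffices here while the authors needed physical noise blocking plus a learned denoiser in practice.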


