Bat-Inspired Advance to Aerial Robots

2026-03-25 Worcester Polytechnic Institute (WPI)

A research team at Worcester Polytechnic Institute has developed a new aerial-robot technology inspired by the flight mechanisms of bats. By analyzing bats' flexible wing structures and flapping motion and mimicking them, the team's robots achieve greater maneuverability and stability than conventional fixed-wing or rotary-wing drones. Flight performance is improved particularly in narrow spaces and complex environments, with anticipated applications such as disaster-site search and surveillance. The work also demonstrates the effectiveness of biomimetics, the application of biological motion principles to engineering design, and points to a new direction for next-generation robot development.

An aerial robot developed in the lab of Nitin J. Sanket navigates past trees.

<Related Information>

Milliwatt ultrasound for navigation in visually degraded environments on palm-sized aerial robots

Manoj Velmurugan, Phillip Brush, Colin Balfour, Richard J. Przybyla, and Nitin J. Sanket
Science Robotics  Published: 25 Mar 2026
DOI: https://doi.org/10.1126/scirobotics.adz9609

Abstract

Tiny palm-sized aerial robots have exceptional agility and cost-effectiveness in navigating confined and cluttered environments. However, their limited payload capacity directly constrains the sensing suite onboard the robot, thereby limiting critical navigational tasks in Global Positioning System (GPS)–denied wild scenes. Common methods for obstacle avoidance use cameras and light detection and ranging (LIDAR), which become ineffective under visually degraded conditions such as low visibility, dust, fog, or darkness. Other sensors, such as radio detection and ranging (RADAR), have high power consumption, making them unsuitable for tiny aerial robots. Inspired by bats, we propose Saranga, a low-power, ultrasound-based perception stack that localizes obstacles using a dual sonar array. We present two key solutions to combat the low peak signal-to-noise ratio of −4.9 decibels: physical noise reduction and a deep learning–based denoising method. First, we present a practical way to block propeller-induced ultrasound noise on the weak echoes. The second solution is to train a neural network to use the long horizon of ultrasound echoes for finding signal patterns under high amounts of uncorrelated noise where classical methods were insufficient. We generalized to the real world by using a synthetic data generation pipeline augmented with limited real noise data for training. We enabled a palm-sized aerial robot to navigate under visually degraded conditions of dense fog, darkness, and snow in a cluttered environment with thin and transparent obstacles using only onboard sensing and computation. We provide extensive real-world results to demonstrate the efficacy of our approach.
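The abstract's dual sonar array localizes obstacles from echo timing. The paper's actual processing pipeline is not given here, but the basic geometry can be sketched as pulse-echo trilateration with two transceivers on a short baseline; the sensor layout, function names, and numbers below are illustrative assumptions, not the authors' implementation.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (assumed constant)


def range_from_echo(round_trip_s: float) -> float:
    """Pulse-echo ranging: the chirp travels out and back, so halve the path."""
    return SPEED_OF_SOUND * round_trip_s / 2.0


def localize_2d(t_left: float, t_right: float, baseline: float):
    """Trilaterate an obstacle in the sensor plane from two transceivers
    assumed to sit at (-baseline/2, 0) and (+baseline/2, 0).

    Intersecting the two range circles gives a closed-form solution:
        (x + b/2)^2 + y^2 = r1^2
        (x - b/2)^2 + y^2 = r2^2
    Subtracting eliminates y and yields x; y follows by substitution.
    """
    r1 = range_from_echo(t_left)
    r2 = range_from_echo(t_right)
    x = (r1**2 - r2**2) / (2.0 * baseline)
    y_sq = r1**2 - (x + baseline / 2.0) ** 2
    if y_sq < 0.0:
        raise ValueError("inconsistent ranges: circles do not intersect")
    return x, math.sqrt(y_sq)  # y > 0: obstacle ahead of the array
```

In practice the hard part, per the abstract, is not this geometry but recovering the echo arrival times at a peak SNR of -4.9 dB, which is what the physical noise blocking and the learned denoiser address.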
