2024-07-17 Delft University of Technology (TU Delft), Netherlands
<Related information>
- https://www.tudelft.nl/en/2024/lr/ant-insights-lead-to-robot-navigation-breakthrough
- https://www.science.org/doi/10.1126/scirobotics.adk0310
Visual route following for tiny autonomous robots
Tom van Dijk, Christophe de Wagter, and Guido C. H. E. de Croon
Science Robotics, Published: 17 Jul 2024
DOI: https://doi.org/10.1126/scirobotics.adk0310
Editor’s summary
For navigation, robots often rely either on external infrastructure or on mapping-based navigation algorithms that are generally associated with high computational demands. van Dijk et al. have now developed an insect-inspired approach for visual navigation of resource-constrained miniature drones by exploiting both visual homing and odometry. The framework was tested on a 56-g drone, which was shown to be capable of following routes in indoor environments over long distances with minimal memory requirements. This approach could substantially improve the autonomous navigation of aerial robots with constraints in computational power, energy demands, and weight. —Amos Matsiko
Abstract
Navigation is an essential capability for autonomous robots. In particular, visual navigation has been a major research topic in robotics because cameras are lightweight, power-efficient sensors that provide rich information on the environment. However, the main challenge of visual navigation is that it requires substantial computational power and memory for visual processing and storage of the results. Until now, this has precluded its use on small, extremely resource-constrained robots such as lightweight drones. Inspired by the parsimony of natural intelligence, we propose an insect-inspired approach to visual navigation that is specifically aimed at extremely resource-restricted robots. It is a route-following approach in which a robot's outbound trajectory is stored as a collection of highly compressed panoramic images together with their spatial relationships as measured with odometry. During the inbound journey, the robot uses a combination of odometry and visual homing to return to the stored locations, with visual homing preventing the buildup of odometric drift. A main advancement of the proposed strategy is that the number of stored compressed images is minimized by spacing them as far apart as the accuracy of odometry allows. To demonstrate the suitability for small systems, we implemented the strategy on a tiny 56-gram drone. The drone could successfully follow routes of up to 100 meters with a trajectory representation that consumed less than 20 bytes per meter. The presented method forms a substantial step toward the autonomous visual navigation of tiny robots, facilitating their more widespread application.
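The mechanism the abstract describes can be illustrated with a short simulation: store sparse waypoints along the outbound path (stand-ins for the paper's compressed panoramic snapshots), spaced only as far apart as odometry drift allows, then return by dead-reckoning to each stored waypoint and letting "visual homing" cancel the drift accumulated en route. This is a minimal sketch only; all names, parameter values, and the drift model below are illustrative assumptions, not taken from the paper.

```python
# Sketch of insect-inspired route following: sparse snapshot waypoints
# plus odometry on the way out, odometry plus visual homing on the way
# back. Parameters (drift rate, homing catchment radius) are assumed.
import numpy as np

rng = np.random.default_rng(0)

DRIFT_PER_M = 0.05    # assumed odometry error growth: 5% of distance flown
HOMING_RADIUS = 1.0   # assumed catchment radius of visual homing (m)
STEP = 0.25           # control step length (m)


def record_outbound(path):
    """Keep a waypoint only when drift since the last stored one could push
    the robot outside the homing catchment area; this sparse spacing is
    what keeps the stored route representation tiny."""
    route = [path[0]]
    travelled = 0.0
    for prev, cur in zip(path, path[1:]):
        travelled += np.linalg.norm(cur - prev)
        if travelled * DRIFT_PER_M > 0.5 * HOMING_RADIUS:
            route.append(cur)
            travelled = 0.0
    route.append(path[-1])
    return route


def follow_inbound(route, start):
    """Return home by visiting the stored waypoints in reverse order."""
    pos = start.copy()     # true position (unknown to the robot)
    belief = start.copy()  # odometry estimate; drifts as the robot moves
    for target in reversed(route):
        while np.linalg.norm(belief - target) > 1e-3:
            to_go = target - belief
            step = to_go * min(1.0, STEP / np.linalg.norm(to_go))
            pos += step + rng.normal(0.0, DRIFT_PER_M * STEP, 2)  # real motion drifts
            belief += step                                        # odometry sees ideal motion
        # Visual homing: servoing on the stored snapshot drives the robot
        # onto the waypoint, resetting the drift accumulated en route.
        if np.linalg.norm(pos - target) < HOMING_RADIUS:
            pos = target.copy()
            belief = target.copy()
    return pos


outbound = np.cumsum(rng.normal(0.0, 0.5, (400, 2)), axis=0)  # wandering outbound path
route = record_outbound(outbound)
length = np.sum(np.linalg.norm(np.diff(outbound, axis=0), axis=1))
home = follow_inbound(route, outbound[-1].copy())
print(f"{len(route)} waypoints stored for a {length:.0f} m route; "
      f"final error {np.linalg.norm(home - outbound[0]):.2f} m")
```

The design point the sketch tries to make explicit is the trade-off named in the abstract: the better the odometry, the farther apart the snapshots can be placed without the robot ever leaving a snapshot's homing catchment area, and hence the less memory the route consumes.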