Animal brain inspired AI game changer for autonomous robots


2024-05-15 Delft University of Technology (TU Delft), the Netherlands

Photo of the "neuromorphic drone" flying over a flower pattern. The corners illustrate the visual inputs the drone receives from the neuromorphic camera: red indicates pixels getting darker, green indicates pixels getting brighter.

Artificial intelligence (AI) holds the promise of giving autonomous robots the intelligence they need, but current AI relies on deep neural networks that demand large amounts of computing power. Animal brains, by contrast, process information asynchronously and communicate via electrical pulses called spikes, which keeps their energy consumption low. Inspired by this, scientists have developed neuromorphic processors that can run spiking neural networks (SNNs) quickly and with very little energy. Researchers at Delft University of Technology have now developed a drone equipped with a neuromorphic camera and an SNN and demonstrated autonomous flight with it. Because the neuromorphic camera only sends signals when the brightness of a pixel changes, it is highly energy efficient and performs well in both dark and bright conditions. This technology could make small autonomous robots usable in a wide range of applications.
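
To make the sensing principle concrete, the short Python sketch below models an event camera in a simplified, frame-based way: only pixels whose log-brightness changes beyond a threshold emit an event, with a polarity indicating brightening or darkening (matching the green/red visualization in the photo). This is an illustration only; real neuromorphic cameras report events asynchronously per pixel, and the threshold value here is an arbitrary assumption.

import numpy as np

def events_from_frames(prev_frame, curr_frame, threshold=0.15):
    # Toy, frame-based event-camera model: emit an event only where the
    # log-brightness changes by more than a threshold.
    # Polarity +1 = pixel got brighter, -1 = pixel got darker.
    eps = 1e-6
    delta = np.log(curr_frame + eps) - np.log(prev_frame + eps)
    events = np.zeros_like(delta, dtype=np.int8)
    events[delta > threshold] = 1    # brightening ("green" in the photo)
    events[delta < -threshold] = -1  # darkening ("red" in the photo)
    return events

# A static scene produces no events; only changing pixels do.
prev = np.random.rand(64, 64)
curr = prev.copy()
curr[10:20, 10:20] *= 1.5  # a small patch gets brighter
print(np.count_nonzero(events_from_frames(prev, curr)))  # 100 events, all other pixels stay silent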

<Related Information>

Fully neuromorphic vision and control for autonomous drone flight

F. Paredes-Vallés, J. J. Hagenaars, J. Dupeyroux, S. Stroobants, […], and G. C. H. E. de Croon
Science Robotics, Published: 15 May 2024
DOI: https://doi.org/10.1126/scirobotics.adi0591

Editor’s summary

Despite the ability of visual processing enabled by artificial neural networks, the associated hardware and large power consumption limit deployment on small flying drones. Neuromorphic hardware offers a promising alternative, but the accompanying spiking neural networks are difficult to train, and the current hardware only supports a limited number of neurons. Paredes-Vallés et al. now present a neuromorphic pipeline to control drone flight. They trained a five-layer spiking neural network to process the raw inputs from an event camera. The network first estimated ego-motion and subsequently determined low-level control commands. Real-world experiments demonstrated that the drone could control its ego-motion to land, hover, and maneuver sideways, with minimal power consumption. —Melisa Yashinski
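
To make the pipeline structure easier to picture, here is a minimal, hypothetical sketch of a stack of spiking (leaky integrate-and-fire) layers in Python. It is not the paper's trained five-layer network: the layer sizes, random weights, and neuron model are placeholders chosen only to show how sparse event input can propagate through spiking layers down to a few output values standing in for ego-motion estimates or low-level commands.

import numpy as np

class LIFLayer:
    # Minimal leaky integrate-and-fire layer: neurons integrate weighted input
    # spikes, leak over time, and emit a binary spike when the membrane
    # potential crosses a threshold. Weights are random, not trained.
    def __init__(self, n_in, n_out, leak=0.9, threshold=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 0.5, size=(n_in, n_out))
        self.v = np.zeros(n_out)
        self.leak = leak
        self.threshold = threshold

    def step(self, in_spikes):
        self.v = self.leak * self.v + in_spikes @ self.w
        out = (self.v >= self.threshold).astype(float)
        self.v[out > 0.0] = 0.0  # reset neurons that fired
        return out

# Hypothetical stack in the spirit of the vision-to-control pipeline:
# sparse event input -> spiking layers -> a few output neurons.
layers = [LIFLayer(1024, 256, seed=1), LIFLayer(256, 64, seed=2), LIFLayer(64, 3, seed=3)]
spikes = (np.random.rand(1024) < 0.05).astype(float)  # sparse "event frame"
for layer in layers:
    spikes = layer.step(spikes)
print(spikes)  # binary activity of the 3 output neurons at this time step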

Abstract

Biological sensing and processing is asynchronous and sparse, leading to low-latency and energy-efficient perception and action. In robotics, neuromorphic hardware for event-based vision and spiking neural networks promises to exhibit similar characteristics. However, robotic implementations have been limited to basic tasks with low-dimensional sensory inputs and motor actions because of the restricted network size in current embedded neuromorphic processors and the difficulties of training spiking neural networks. Here, we present a fully neuromorphic vision-to-control pipeline for controlling a flying drone. Specifically, we trained a spiking neural network that accepts raw event-based camera data and outputs low-level control actions for performing autonomous vision-based flight. The vision part of the network, consisting of five layers and 28,800 neurons, maps incoming raw events to ego-motion estimates and was trained with self-supervised learning on real event data. The control part consists of a single decoding layer and was learned with an evolutionary algorithm in a drone simulator. Robotic experiments show a successful sim-to-real transfer of the fully learned neuromorphic pipeline. The drone could accurately control its ego-motion, allowing for hovering, landing, and maneuvering sideways—even while yawing at the same time. The neuromorphic pipeline runs on board on Intel’s Loihi neuromorphic processor with an execution frequency of 200 hertz, consuming 0.94 watt of idle power and a mere additional 7 to 12 milliwatts when running the network. These results illustrate the potential of neuromorphic sensing and processing for enabling insect-sized intelligent robots.
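
The abstract notes that the control part of the network was learned with an evolutionary algorithm in a drone simulator before being transferred to the real vehicle. The toy sketch below shows the general idea under strong simplifying assumptions: a hypothetical one-dimensional landing simulator and a two-parameter linear decoder, optimized by a simple mutate-and-select loop. None of the dynamics, cost terms, or hyperparameters come from the paper.

import numpy as np

def simulate_landing(decoder, steps=200, dt=0.02):
    # Hypothetical 1-D landing simulator (not the paper's): the decoder maps a
    # divergence-like observation to a thrust command; the cost penalizes
    # staying high and using excessive thrust.
    z, vz, cost = 5.0, 0.0, 0.0  # altitude [m], vertical velocity [m/s]
    for _ in range(steps):
        divergence = vz / max(z, 0.1)  # optic-flow-style observation
        thrust = float(np.tanh(decoder @ np.array([divergence, 1.0])))
        vz += (3.0 * thrust - 1.0) * dt  # crude dynamics, gravity partly compensated
        z = max(z + vz * dt, 0.0)
        cost += (z ** 2 + 0.1 * thrust ** 2) * dt
    return cost

def evolve(pop_size=32, generations=50, sigma=0.3, seed=0):
    # Simple mutate-and-select loop over the two decoder parameters, standing
    # in for "learned with an evolutionary algorithm in a drone simulator".
    rng = np.random.default_rng(seed)
    best = rng.normal(size=2)
    for _ in range(generations):
        candidates = best + sigma * rng.normal(size=(pop_size, 2))
        costs = [simulate_landing(c) for c in candidates]
        best = candidates[int(np.argmin(costs))]
    return best

print(evolve())  # decoder parameters that bring the toy drone down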
