2026-02-18 NASA

This panorama from Perseverance is composed of five stereo pairs of navigation camera images that the rover matched to orbital imagery in order to pinpoint its position on Feb. 2, 2026, using a technology called Mars Global Localization. (Image credit: NASA/JPL-Caltech)
<Related information>
- https://www.nasa.gov/missions/mars-2020-perseverance/perseverance-rover/nasas-perseverance-now-autonomously-pinpoints-its-location-on-mars/
- https://www-robotics.jpl.nasa.gov/media/documents/2024_Global_Localization_ICRA.pdf
- https://ieeexplore.ieee.org/document/10611697
Censible: A Robust and Practical Global Localization Framework for Planetary Surface Missions
Jeremy Nash; Quintin Dwight; Lucas Saldyt; Haoda Wang; Steven Myint; Adnan Ansar; et al.
2024 IEEE International Conference on Robotics and Automation (ICRA)
Date Added to IEEE Xplore: 08 August 2024
DOI: https://doi.org/10.1109/ICRA57147.2024.10611697
Abstract
To achieve longer driving distances, planetary robotics missions require accurate localization to counteract position uncertainty. Freedom and precision in driving allow scientists to reach and study sites of interest. Typically, rover global localization has been performed manually by humans, which is accurate but time-consuming because data must be relayed between planets. This paper describes a global localization algorithm that runs onboard the Perseverance Mars rover. Our approach matches rover images to orbital maps using a modified census transform to achieve sub-meter-accurate, near-human localization performance on a real dataset of 264 Mars rover panoramas. The proposed solution has also been successfully executed on the Perseverance Mars rover, demonstrating the practicality of our approach.
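The abstract's key idea is matching rover imagery to orbital maps via a census transform. The paper uses a modified census transform whose details are in the linked PDF; as a rough illustration only, the classic census transform (each pixel becomes a bit string recording which neighbors are darker than it) combined with Hamming-distance template search can be sketched as below. The synthetic "orbital map", window sizes, and patch search are assumptions for the toy demo, not the flight algorithm.

```python
import numpy as np

def census_transform(img, window=3):
    """Classic census transform: encode, for each neighbor in the window,
    whether that neighbor is darker than the center pixel, as one bit."""
    r = window // 2
    out = np.zeros(img.shape, dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            # Align each pixel with its (dy, dx) neighbor (edges wrap; fine for a toy).
            shifted = np.roll(np.roll(img, -dy, axis=0), -dx, axis=1)
            out = (out << np.uint64(1)) | (shifted < img).astype(np.uint64)
    return out

def hamming_cost(a, b):
    """Matching cost: total Hamming distance between two census images."""
    diff = np.bitwise_xor(a, b)
    return int(sum(bin(int(v)).count("1") for v in diff.ravel()))

# Toy demo: recover where a small "rover view" patch sits inside a synthetic map.
rng = np.random.default_rng(0)
orbital = rng.random((40, 40))                          # stand-in orbital map tile
true_y, true_x = 12, 20
patch = orbital[true_y:true_y + 9, true_x:true_x + 9]   # simulated rover observation

c_map = census_transform(orbital)
c_patch = census_transform(patch)

best_cost, best_y, best_x = None, None, None
for y in range(orbital.shape[0] - 9 + 1):
    for x in range(orbital.shape[1] - 9 + 1):
        cost = hamming_cost(c_map[y:y + 9, x:x + 9], c_patch)
        if best_cost is None or cost < best_cost:
            best_cost, best_y, best_x = cost, y, x

print(best_y, best_x)
```

Because census bits depend only on intensity *orderings*, not absolute values, this kind of matching is robust to the illumination and sensor differences between rover and orbital imagery, which is one reason census-style costs are attractive for this problem.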


