J. Field Robotics | 2021

The MADMAX data set for visual-inertial rover navigation on Mars


Abstract


Funding information: MOdulares Robotisches EXplorationssystem (MOREX)

Planetary rovers increasingly rely on vision-based components for autonomous navigation and mapping. Developing and testing these components requires representative optical conditions, which can be achieved either by field testing at planetary analog sites on Earth or by using prerecorded data sets from such locations. However, representative data are scarce, and field testing at planetary analog sites requires substantial financial investment and logistical overhead while risking damage to complex robotic systems. To address these issues, we use our compact, human-portable DLR Sensor Unit for Planetary Exploration Rovers (SUPER) in the Moroccan desert to demonstrate resource-efficient field testing, and we make the resulting Morocco-Acquired data set of Mars-Analog eXploration (MADMAX) publicly accessible. The data set consists of 36 navigation experiments captured at eight Mars analog sites with widely varying environmental conditions. Its longest trajectory covers 1.5 km, and the combined trajectory length is 9.2 km. The data set contains time-stamped recordings from monochrome stereo cameras, a color camera, omnidirectional cameras in stereo configuration, and an inertial measurement unit. Additionally, we provide ground truth in position and orientation, together with the associated uncertainties, obtained by a real-time kinematic-based algorithm that fuses the global navigation satellite system data of two body antennas. Finally, we run two state-of-the-art navigation algorithms, ORB-SLAM2 and VINS-Mono, on our data to evaluate their accuracy and to provide a baseline that can serve as a reference for the accuracy and robustness of other navigation algorithms. The data set can be accessed at https://rmc.dlr.de/morocco2018.

Volume 38
Pages 833-853
DOI 10.1002/ROB.22016
Language English
Journal J. Field Robotics
