The Canadian Planetary Emulation Terrain Energy-Aware Rover Navigation Dataset

Citation Author(s):
Olivier Lamarre, University of Toronto
Oliver Limoyo, University of Toronto
Filip Marić, University of Toronto
Jonathan Kelly, University of Toronto
Submitted by:
Olivier Lamarre
Last updated:
Fri, 10/29/2021 - 10:27
DOI:
10.21227/fwqe-mc97

Abstract 

This is a unique energy-aware navigation dataset collected at the Canadian Space Agency’s Mars Emulation Terrain (MET) in Saint-Hubert, Quebec, Canada. It consists of raw and post-processed sensor measurements collected by our rover in addition to georeferenced aerial maps of the MET (colour mosaic, elevation model, slope and aspect maps). The data are available for download in human-readable format and rosbag (.bag) format. Python data fetching and plotting scripts and ROS-based visualization tools are also provided.

Instructions: 

The entire dataset is separated into six runs, each covering a different section of the MET at a different time. The data were collected on September 4, 2018, between 17:00 and 19:00 (Eastern Daylight Time), and are available in both human-readable and rosbag (.bag) formats.

To avoid extremely large files, the rosbag data of every run is split into two parts: “runX_clouds_only.bag” and “runX_base.bag”. The former contains only the point clouds generated from the omnidirectional camera's raw images after data collection; the latter contains all the raw data and the remainder of the post-processed data. Both rosbags share consistent timestamps and can be merged with a tool such as bagedit. A similar breakdown was followed for the human-readable data.
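Because both parts carry consistent timestamps, merging them amounts to interleaving the two message streams in global time order. The sketch below illustrates that principle only, in plain Python with invented stand-in messages; it is not the rosbag or bagedit API.

```python
import heapq

# Hypothetical stand-ins for the two bag parts: each is a list of
# (timestamp, topic, message) tuples, already sorted by timestamp,
# as messages within a rosbag are.
base_msgs = [(1.0, "/imu", "imu_0"), (2.0, "/gps", "gps_0"), (4.0, "/imu", "imu_1")]
cloud_msgs = [(1.5, "/clouds", "cloud_0"), (3.0, "/clouds", "cloud_1")]

# Since timestamps are consistent across both parts, a single
# merge pass preserves global time order across all topics.
merged = list(heapq.merge(base_msgs, cloud_msgs, key=lambda m: m[0]))

for stamp, topic, msg in merged:
    print(f"{stamp:.1f}  {topic}  {msg}")
```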

Aside from point clouds, the post-processed data of every run includes a blended cylindrical panorama made from the omnidirectional sensor images, planar rover velocity estimates from wheel encoder data, and an estimated global trajectory obtained with VINS-Fusion by fusing GPS data, raw IMU data, and stereo imagery from cameras 0 and 1 of the omnidirectional sensor. Global sun vectors and relative ones (with respect to the rover's base frame) were also calculated using the Pysolar library. This library also provided a clear-sky direct irradiance estimate alongside every pyranometer measurement collected. Lastly, the set of georeferenced aerial maps, the transforms between the different rover and sensor frames, and the intrinsic parameters of each camera are also available.
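To illustrate how a global sun vector relates to a relative one, the sketch below converts a solar altitude/azimuth pair (the kind of angles Pysolar reports) into a unit vector in a local east-north-up frame, then expresses it in a rover frame. The frame conventions and the yaw-only rotation are simplifying assumptions for illustration, not the dataset's documented conventions.

```python
import math

def sun_unit_vector(altitude_deg, azimuth_deg):
    """Unit vector pointing at the sun in a local ENU frame
    (x east, y north, z up); azimuth measured clockwise from north."""
    alt = math.radians(altitude_deg)
    az = math.radians(azimuth_deg)
    return (math.cos(alt) * math.sin(az),
            math.cos(alt) * math.cos(az),
            math.sin(alt))

def rotate_to_base(vec, yaw_deg):
    """Express an ENU vector in a rover base frame that differs from
    ENU by a yaw (heading) rotation only -- a simplifying assumption."""
    yaw = math.radians(yaw_deg)
    x, y, z = vec
    return (math.cos(yaw) * x + math.sin(yaw) * y,
            -math.sin(yaw) * x + math.cos(yaw) * y,
            z)

# Example: sun 30 degrees above the horizon, due west (azimuth 270),
# rover heading 90 degrees east of north.
v_global = sun_unit_vector(30.0, 270.0)
v_base = rotate_to_base(v_global, 90.0)
```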

We strongly recommend that interested users visit the project's home page, which provides additional information about each run (such as its physical length and duration). All download links on the home page have been updated to pull from the IEEE DataPort servers. A more detailed description of the test environment and hardware configuration is provided in the project's official journal publication.

Once the data products of the desired run are downloaded, the project's GitHub repository provides a lightweight ROS package and Python utilities to fetch the desired data streams from the rosbags.
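The exact layout of the human-readable files is documented in the repository. Purely as a generic illustration (the column names and values below are invented, not taken from the dataset), timestamped CSV streams of this kind can be loaded with nothing beyond the Python standard library:

```python
import csv
import io

# Invented sample in the spirit of a timestamped human-readable export;
# the real file names and columns come from the project's repository.
sample = """stamp,irradiance_w_m2
1536080400.0,612.3
1536080401.0,611.8
"""

rows = []
for row in csv.DictReader(io.StringIO(sample)):
    rows.append((float(row["stamp"]), float(row["irradiance_w_m2"])))
```

For a real file, `io.StringIO(sample)` would be replaced by an open file handle.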