Agri-EBV-autumn

Citation Author(s):
  • Andrejs Zujevs – Riga Technical University, Latvia
  • Mihails Pudzs – Riga Technical University, Latvia
  • Vitalijs Osadcuks – Latvia University of Life Sciences and Technologies, Latvia
  • Arturs Ardavs – Riga Technical University, Latvia
  • Maris Galauskis – Riga Technical University, Latvia
  • Janis Grundspenkis – Riga Technical University, Latvia
Submitted by:
Andrejs Zujevs
Last updated:
Tue, 09/07/2021 - 11:05
DOI:
10.21227/524a-fv77

Abstract 

A new generation of computer vision, namely event-based or neuromorphic vision, provides a new paradigm for capturing visual data and for the way such data are processed. Event-based vision is a state-of-the-art robot vision technology that is particularly promising for visual navigation tasks in both mobile robots and drones. Because the visual sensors used in event-based vision are highly novel, only a few datasets aimed at visual navigation tasks are publicly available. Such datasets make it possible to evaluate visual odometry and visual SLAM methods by imitating data readout from real sensors. This dataset covers visual navigation tasks for mobile robots operating in different types of agricultural environments. It may open new opportunities for evaluating existing, and creating new, event-based visual navigation methods for agricultural scenes that contain abundant vegetation, animals, and patterned objects. The dataset was created using our custom-designed Sensor Bundle, mounted on a mobile robot platform. During data acquisition sessions, the platform was manually driven in environments such as forests, plantations, and a cattle farm. The Sensor Bundle consists of a dynamic vision sensor (DVS), a LIDAR, an RGB-D camera, and environmental sensors (temperature, humidity, and air pressure). The provided data sequences are accompanied by calibration data. The dynamic vision sensor, the LIDAR, and the environmental sensors were time-synchronized with a precision of 1 µs and time-aligned with an accuracy of ±1 ms. Ground truth was generated by Lidar-SLAM methods. In total, there are 21 data sequences in 12 different scenarios for the autumn season. Each data sequence is accompanied by a video demonstrating its content and by a detailed description, including known issues. The common issues reported include relatively small missing fragments of data and frame-number sequence issues in the RGB-D sensor's output.
The dataset is designed mostly for visual odometry tasks; however, it also includes loop closures so that event-based visual SLAM methods can be applied. A. Zujevs is supported by the European Regional Development Fund within the Activity 1.1.1.2 “Post-doctoral Research Aid” of the Specific Aid Objective 1.1.1 (No. 1.1.1.2/VIAA/2/18/334), while the other authors are supported by the Latvian Council of Science (lzp-2018/1-0482).

BibTex citation: @inproceedings{Zujevs2021, author = {Zujevs, Andrejs and Pudzs, Mihails and Osadcuks, Vitalijs and Ardavs, Arturs and Galauskis, Maris and Grundspenkis, Janis}, booktitle = {IEEE International Conference on Robotics and Automation}, title = {{A Neuromorphic Vision Dataset for Visual Navigation Tasks in Agriculture}}, month = {June}, year = {2021}}

Instructions: 

The dataset includes the following sequences:

  • 01_forest – Closed loop, Forest trail, No wind, Daytime
  • 02_forest – Closed loop, Forest trail, No wind, Daytime
  • 03_green_meadow – Closed loop, Meadow, grass up to 30 cm, No wind, Daytime
  • 04_green_meadow – Closed loop, Meadow, grass up to 30 cm, Mild wind, Daytime
  • 05_road_asphalt – Closed loop, Asphalt road, No wind, Nighttime
  • 06_plantation – Closed loop, Shrubland, Mild wind, Daytime
  • 07_plantation – Closed loop, Asphalt road, No wind, Nighttime
  • 08_plantation_water – Random movement, Sprinklers (water drops on camera lens), No wind, Nighttime
  • 09_cattle_farm – Closed loop, Cattle farm, Mild wind, Daytime
  • 10_cattle_farm – Closed loop, Cattle farm, Mild wind, Daytime
  • 11_cattle_farm_feed_table – Closed loop, Cattle farm feed table, Mild wind, Daytime
  • 12_cattle_farm_feed_table – Closed loop, Cattle farm feed table, Mild wind, Daytime
  • 13_ditch – Closed loop, Sandy surface, Edge of ditch or drainage channel, No wind, Daytime
  • 14_ditch – Closed loop, Sandy surface, Shore or bank, Strong wind, Daytime
  • 15_young_pines – Closed loop, Sandy surface, Pine coppice, No wind, Daytime
  • 16_winter_cereal_field – Closed loop, Winter wheat sowing, Mild wind, Daytime
  • 17_winter_cereal_field – Closed loop, Winter wheat sowing, Mild wind, Daytime
  • 18_winter_rapeseed_field – Closed loop, Winter rapeseed, Mild wind, Daytime
  • 19_winter_rapeseed_field – Closed loop, Winter rapeseed, Mild wind, Daytime
  • 20_field_with_a_cow – Closed loop, Cows tethered in pasture, Mild wind, Daytime
  • 21_field_with_a_cow – Closed loop, Cows tethered in pasture, Mild wind, Daytime

Each sequence contains the following separately downloadable files:

  • <..sequence_id..>_video.mp4 – an overview video of the sequence data (for the DVS and RGB-D sensors).
  • <..sequence_id..>_data.tar.gz – the entire data sequence in raw format (AEDAT2.0 for the DVS, images for the RGB-D camera, point clouds in PCD files for the LIDAR, and IMU CSV files with original sensor timestamps). Timestamp conversion formulas are provided.
  • <..sequence_id..>_rawcalib_data.tar.gz – recorded fragments that can be used to perform the calibration independently (intrinsic, extrinsic, and time alignment).
  • <..sequence_id..>_rosbags.tar.gz – the main sequence in ROS bag format. All sensor timestamps are aligned to the DVS with an accuracy of better than 1 ms.
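The raw DVS recordings use the AEDAT 2.0 format, which stores each event as two big-endian 32-bit words: an address word (packing x, y, and polarity) followed by a timestamp in microseconds. The sketch below decodes such a stream; the bit positions used for x, y, and polarity are illustrative assumptions (the actual layout depends on the sensor model and is documented in meta-data.md):

```python
import struct

def parse_aedat2_events(payload: bytes):
    """Decode raw AEDAT 2.0 event bytes (ASCII header lines already stripped).

    Each event is 8 bytes: a big-endian 32-bit address word followed by a
    big-endian 32-bit timestamp in microseconds.  The bit fields unpacked
    below are illustrative (DAVIS-style) and must be checked against the
    sensor's documentation.
    """
    events = []
    for addr, ts in struct.iter_unpack(">II", payload):
        polarity = (addr >> 11) & 0x1    # assumed polarity bit
        x = (addr >> 12) & 0x3FF         # assumed x bit field
        y = (addr >> 22) & 0x1FF         # assumed y bit field
        events.append((ts, x, y, polarity))
    return events

# Synthetic round-trip: pack one event and decode it back.
addr = (5 << 22) | (7 << 12) | (1 << 11)   # y=5, x=7, polarity=1
raw = struct.pack(">II", addr, 123456)
print(parse_aedat2_events(raw))            # [(123456, 7, 5, 1)]
```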

The contents of each archive are described below.

Raw format data

The archive <..sequence_id..>_data.tar.gz contains the following files and folders:

  • ./meta-data/ - all the useful information about the sequence
  • ./meta-data/meta-data.md - detailed information about the sequence, sensors, files, and data formats
  • ./meta-data/cad_model.pdf - sensor placement
  • ./meta-data/<...>_timeconvs.json - coefficients for timestamp conversion formulas
  • ./meta-data/ground-truth/ - movement ground-truth data, calculated using three different Lidar-SLAM algorithms (Cartographer, HDL-Graph, LeGo-LOAM)
  • ./meta-data/calib-params/ - intrinsic and extrinsic calibration parameters
  • ./recording/ - main sequence
  • ./recording/dvs/ - DVS events and IMU data
  • ./recording/lidar/ - Lidar point clouds and IMU data
  • ./recording/realsense/ - Realsense camera RGB, Depth frames, and IMU data
  • ./recording/sensorboard/ - environmental sensor data (temperature, humidity, air pressure)
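The coefficients in <...>_timeconvs.json drive the timestamp conversion formulas that map each sensor's original timestamps onto the common time base. A minimal sketch, assuming a linear model per sensor; the field names ("scale", "offset") and the JSON structure here are illustrative, and the real definitions are given in meta-data.md:

```python
import json

# Hypothetical structure of the timeconvs file: one linear model per sensor.
timeconvs = json.loads("""
{
  "lidar":     {"scale": 1.000001, "offset": -12345.0},
  "realsense": {"scale": 0.999998, "offset": 789.0}
}
""")

def to_common_time(sensor: str, t_raw: float) -> float:
    """Apply the assumed linear conversion t = scale * t_raw + offset."""
    c = timeconvs[sensor]
    return c["scale"] * t_raw + c["offset"]

print(to_common_time("realsense", 1000.0))  # about 1788.998
```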

Calibration data

The <..sequence_id..>_rawcalib_data.tar.gz archive contains the following files and folders:

  • ./imu_alignments/ - IMU recordings of the platform being lifted before and after the main sequence (can be used for custom timestamp alignment)
  • ./solenoids/ - IMU recordings of solenoid vibrations before and after the main sequence (can be used for custom timestamp alignment)
  • ./lidar_rs/ - recordings for Lidar–Realsense extrinsic calibration, made by showing a spherical target (ball) to both sensors
  • ./dvs_rs/ - DVS and Realsense camera intrinsic and extrinsic calibration frames (checkerboard pattern)
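The lifting and solenoid-vibration recordings support custom time alignment: the same mechanical excitation shows up in every IMU stream, so the lag that maximizes the cross-correlation between two streams estimates their clock offset. A brute-force sketch of that idea over plain sample lists (the real alignment works on timestamped IMU samples and sub-sample interpolation, which this sketch omits):

```python
def estimate_offset(ref, sig):
    """Return the integer-sample lag of `sig` relative to `ref` that
    maximizes their cross-correlation (both lists must have equal length)."""
    n = len(ref)
    best_lag, best_score = 0, float("-inf")
    for lag in range(-(n - 1), n):
        lo, hi = max(0, -lag), min(n, n - lag)
        score = sum(ref[i] * sig[i + lag] for i in range(lo, hi))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# A vibration spike at sample 20 in the reference stream appears at
# sample 23 in the other stream, i.e. the second stream lags by 3 samples.
ref = [0.0] * 64
sig = [0.0] * 64
ref[20] = 1.0
sig[23] = 1.0
print(estimate_offset(ref, sig))  # 3
```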

ROS Bag format data

There are six rosbag files for each scene; their contents are as follows:

  • <..sequence_id..>_dvs.bag (topics: /dvs/camera_info, /dvs/events, /dvs/imu; message types, respectively: sensor_msgs/CameraInfo, dvs_msgs/EventArray, sensor_msgs/Imu).
  • <..sequence_id..>_lidar.bag (topics: /lidar/imu/acc, /lidar/imu/gyro, /lidar/pointcloud; message types, respectively: sensor_msgs/Imu, sensor_msgs/Imu, sensor_msgs/PointCloud2).
  • <..sequence_id..>_realsense.bag (topics: /realsense/camera_info, /realsense/depth, /realsense/imu/acc, /realsense/imu/gyro, /realsense/rgb, /tf; message types, respectively: sensor_msgs/CameraInfo, sensor_msgs/Image, sensor_msgs/Imu, sensor_msgs/Imu, sensor_msgs/Image, tf2_msgs/TFMessage).
  • <..sequence_id..>_sensorboard.bag (topics: /sensorboard/air_pressure, /sensorboard/relative_humidity, /sensorboard/temperature; message types, respectively: sensor_msgs/FluidPressure, sensor_msgs/RelativeHumidity, sensor_msgs/Temperature).
  • <..sequence_id..>_trajectories.bag (topics: /cartographer, /hdl, /lego_loam; message type: geometry_msgs/PoseStamped for all three).
  • <..sequence_id..>_data_for_realsense_lidar_calibration.bag (topics: /lidar/pointcloud, /realsense/camera_info, /realsense/depth, /realsense/rgb, /tf; message types, respectively: sensor_msgs/PointCloud2, sensor_msgs/CameraInfo, sensor_msgs/Image, sensor_msgs/Image, tf2_msgs/TFMessage).
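The trajectories bag carries one geometry_msgs/PoseStamped stream per Lidar-SLAM method, and a quick sanity check when comparing the three ground-truth estimates is their total path length. A sketch over plain (x, y, z) tuples; in practice the positions would come from each message's pose.position after reading the bag (for example with the rosbag Python API):

```python
import math

def path_length(positions):
    """Sum of Euclidean distances between consecutive (x, y, z) positions."""
    return sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))

# Toy trajectory: 1 m east, 1 m north, then straight back to the start
# (a tiny closed loop, like the "Closed loop" sequences above).
traj = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 0.0, 0.0)]
print(path_length(traj))  # 2 + sqrt(2), about 3.414
```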

Version history

22.06.2021.

  • Realsense data now also contains 16-bit PNG depth images, located in the folder /recording/realsense/depth_native/
  • Added data in ROS bag format
