PC-Urban Outdoor dataset for 3D Point Cloud semantic segmentation

Citation Author(s):
Muhammad Ibrahim, University of Western Australia
Naveed Akhtar, University of Western Australia
Michael Wise, University of Western Australia
Ajmal Mian, University of Western Australia
Submitted by:
Muhammad Ibrahim
Last updated:
Wed, 03/15/2023 - 03:44
DOI:
10.21227/fvqd-k603

Abstract 

The proposed dataset, termed PC-Urban (Urban Point Cloud), is captured with a 64-channel Ouster LiDAR sensor. The sensor is installed on an SUV driven through downtown Perth, Western Australia. The dataset comprises over 4.3 billion points captured across 66K sensor frames. The labelled data is organized as registered and raw point cloud frames, where the former combines varying numbers of consecutive registered frames. The dataset provides 25 class labels covering 23 million points and 5K instances. Labelling was performed with PC-Annotate and can easily be extended by end-users with the same tool.

The data is organized into unlabelled and labelled 3D point clouds. The unlabelled data is provided in PCAP format, the direct output format of the Ouster LiDAR sensor. Raw frames are extracted from the recorded PCAP files as PLY and CSV files using the Ouster Studio software. The labelled 3D point cloud data consists of registered or raw point clouds. Each labelled point cloud is a combination of PLY, CSV, Label and Summary files. The PLY file contains the X, Y, Z values of each point along with colour information. The CSV file contains the X, Y, Z values, intensity, reflectivity, ring, noise, and range of each point; these attributes can be useful for semantic segmentation with deep learning algorithms. The Label and Label Summary files are described under Instructions below. Each gigabyte of raw data contains nearly 1,300 raw frames; 66,425 frames are provided in total, each comprising 65,536 points, so over 4.3 billion points captured with the Ouster LiDAR sensor are provided. Annotations for 25 general outdoor classes are provided: car, building, bridge, tree, road, letterbox, traffic signal, light-pole, rubbish bin, cycles, motorcycle, truck, bus, bushes, road sign board, advertising board, road divider, road lane, pedestrians, side-path, wall, bus stop, water, zebra-crossing, and background. In the released data, a total of 143 scenes are annotated, including both raw and registered frames.
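As a quick sanity check, the total point count quoted above follows directly from the frame count and the per-frame point count. A minimal sketch using the numbers from this description (the 64 channels × 1024 columns breakdown of the 65,536 points per frame is an assumption, not stated in the text):

```python
# Figures quoted in the dataset description above.
frames = 66_425            # total sensor frames in the dataset
points_per_frame = 65_536  # assumed 64 channels x 1024 columns per frame

total_points = frames * points_per_frame
print(f"{total_points:,}")  # 4,353,228,800 -> "over 4.3 billion points"
```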

Instructions: 

Labelled Data:
Four files are associated with each labelled frame. A file in PLY format contains only the 3D points (X, Y, Z), while a file in CSV format contains additional features such as intensity and reflectivity along with X, Y, Z. The third file, with a Label.txt extension, has two columns: each row corresponds to a point in the matching PLY and CSV files, with the first column giving the point's class label and the second column the instance ID within that class. The last file is the labelling summary of the frame.
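The two-column Label.txt layout described above can be read with a few lines of Python. This is a sketch under stated assumptions: the whitespace-separated format and integer-valued class labels and instance IDs are inferred from the description, and the sample rows are invented for illustration only.

```python
def read_labels(lines):
    """Parse Label.txt-style rows: one row per point,
    column 1 = class label, column 2 = instance ID (assumed format)."""
    labels, instances = [], []
    for line in lines:
        parts = line.split()
        if len(parts) < 2:  # skip blank or malformed rows
            continue
        labels.append(int(parts[0]))
        instances.append(int(parts[1]))
    return labels, instances

# Hypothetical three-point sample: two points of class 4 in the same
# instance, one point of class 7 in another instance.
sample = ["4 1", "4 1", "7 2"]
labels, instances = read_labels(sample)
print(labels, instances)  # [4, 4, 7] [1, 1, 2]
```

The same function works on an open file object, e.g. `read_labels(open("frame_0001_Label.txt"))` (filename hypothetical).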
 
Unlabelled Data:
Unlabelled 3D point cloud data is stored in PCAP files, each accompanied by a JSON metadata file. Ouster Studio can be used to read and extract the data:
1) Load a PCAP file and its corresponding JSON file. All frames can then be played back by pressing the play button at the bottom of the tool.
2) All frames can be extracted by pressing Ctrl+S.
Please do not forget to cite our paper: https://ieeexplore.ieee.org/document/9363898

Comments

The Zip file contains the following folders:
1) RawLabeledFrames
2) RegisteredLabeledFrames
3) UnlabeledData
4) OusterStudioForDataExtraction
5) PC_AnnotateForLabeling
6) RawPointCloudVideo

Submitted by Muhammad Ibrahim on Wed, 05/05/2021 - 10:39

MATLAB and Python can be used to read the point clouds and labels.
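For instance, assuming the PLY files are ASCII-encoded (an assumption; binary PLY would need a library such as plyfile or Open3D), the X, Y, Z columns can be read in Python with no third-party dependencies:

```python
def read_ascii_ply_xyz(lines):
    """Return (x, y, z) tuples from an ASCII PLY file, assuming x, y, z
    are the first three vertex properties (binary PLY is not handled)."""
    it = iter(lines)
    n_vertices = 0
    for line in it:                       # scan the header
        line = line.strip()
        if line.startswith("element vertex"):
            n_vertices = int(line.split()[-1])
        if line == "end_header":
            break
    points = []
    for _ in range(n_vertices):           # read the vertex rows
        vals = next(it).split()
        points.append(tuple(float(v) for v in vals[:3]))
    return points

# Tiny two-point example (hypothetical data, not from the dataset).
sample = """ply
format ascii 1.0
element vertex 2
property float x
property float y
property float z
end_header
0.0 1.0 2.0
3.5 4.5 5.5
""".splitlines()
print(read_ascii_ply_xyz(sample))  # [(0.0, 1.0, 2.0), (3.5, 4.5, 5.5)]
```

To read a file from disk, pass an open file object, e.g. `read_ascii_ply_xyz(open("frame_0001.ply"))` (filename hypothetical).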

Submitted by Muhammad Ibrahim on Wed, 05/05/2021 - 10:40