ExoNet Database: Wearable Camera Images of Human Locomotion Environments

Citation Author(s):
Brokoslaw Laschowski (University of Waterloo)
William McNally (University of Waterloo)
Alexander Wong (University of Waterloo)
John McPhee (University of Waterloo)
Submitted by:
Brokoslaw Laschowski
Last updated:
Wed, 03/13/2024 - 06:21
DOI:
10.21227/rz46-2n31

Abstract 

Advances in computer vision and deep learning are enabling automated environment recognition systems for robotic leg prostheses and exoskeletons. However, small-scale and private training datasets have impeded the widespread development and dissemination of image classification algorithms, such as convolutional neural networks, for recognizing human walking environments. To address these limitations, we developed "ExoNet", the first open-source, large-scale hierarchical database of wearable camera images (i.e., egocentric perception) of real-world walking environments. Unparalleled in both scale and diversity, ExoNet contains over 5.6 million RGB images of indoor and outdoor walking environments, collected using a lightweight wearable camera throughout the summer, fall, and winter. Approximately 923,000 images in ExoNet were human-annotated using a new 12-class hierarchical labelling architecture. Available publicly through IEEE DataPort, ExoNet serves as a communal platform to train, develop, and compare next-generation image classification algorithms for visual perception of human walking environments. Beyond robotic leg prostheses and exoskeletons, applications of ExoNet extend to humanoids, autonomous legged robots, powered wheelchairs, and other mobility assistive technologies.
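For researchers using the annotated subset to benchmark classifiers, a common first step is a deterministic train/validation/test split of the (image, label) pairs so that comparisons across algorithms are reproducible. The sketch below is a minimal, hedged example; the file names and label names are hypothetical placeholders, not the actual ExoNet annotations, whose 12-class hierarchy is described in the references.

```python
import random

def split_annotations(pairs, train=0.7, val=0.15, seed=42):
    """Deterministically split (image, label) pairs into train/val/test subsets."""
    rng = random.Random(seed)  # fixed seed for reproducible comparisons
    shuffled = pairs[:]
    rng.shuffle(shuffled)
    n_train = int(len(shuffled) * train)
    n_val = int(len(shuffled) * val)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])

# Hypothetical annotation pairs for illustration only; real ExoNet labels
# follow the 12-class hierarchical architecture described in the references.
pairs = [(f"img_{i:06d}.jpg", f"class_{i % 12}") for i in range(1000)]
train_set, val_set, test_set = split_annotations(pairs)
print(len(train_set), len(val_set), len(test_set))  # 700 150 150
```

Keeping the seed fixed ensures every algorithm under comparison sees the same split, which matters when reporting results on a communal benchmark like ExoNet.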

References:

1) Laschowski B, McNally W, Wong A, and McPhee J. (2020). ExoNet Database: Wearable Camera Images of Human Locomotion Environments. Frontiers in Robotics and AI. DOI: 10.3389/frobt.2020.562061.

2) Laschowski B, McNally W, Wong A, and McPhee J. (2021). Computer Vision and Deep Learning for Environment-Adaptive Control of Robotic Lower-Limb Exoskeletons. Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC). DOI: 10.1109/EMBC46164.2021.9630064. 

3) Laschowski B (2021). Energy Regeneration and Environment Sensing for Robotic Leg Prostheses and Exoskeletons. PhD Dissertation. University of Waterloo. http://hdl.handle.net/10012/17816. 

4) Laschowski B, McNally W, Wong A, and McPhee J. (2022). Environment Classification for Robotic Leg Prostheses and Exoskeletons using Deep Convolutional Neural Networks. Frontiers in Neurorobotics. DOI: 10.3389/fnbot.2021.730965.

Instructions: 

Details on the ExoNet database are provided in the references above. Please email Dr. Brokoslaw Laschowski (blaschow@uwaterloo.ca) with any additional questions or requests for technical assistance.


Dataset Files

Open Access: the dataset files are accessible to all logged-in users with a free IEEE account. IEEE membership is not required.