Virtual Reality Gesture Recognition Dataset

Citation Author(s):
Dag Eklund, Ilias Siniosoglou, Anna Triantafyllou, Athanasios Liatifis, Dimitrios Pliatsios, Thomas Lagkas, Vasileios Argyriou, Panagiotis Sarigiannidis
Submitted by:
Panagiotis Sari...
Last updated:
Tue, 05/30/2023 - 13:28
DOI:
10.21227/kyzx-m451

Abstract 

This dataset provides valuable insights into hand gestures and their associated measurements. Hand gestures play a significant role in human communication, and understanding their patterns and characteristics can enable various applications, such as gesture recognition systems, sign language interpretation, and human-computer interaction. This dataset was carefully collected by a specialist who captured snapshots of individuals making different hand gestures and measured specific distances between the fingers and the palm. The dataset offers a comprehensive view of these measurements, allowing for further analysis and exploration of the relationships between different gestures and their corresponding hand measurements.

The dataset's potential applications are wide-ranging. For instance, it can be used to develop gesture recognition systems that can identify and interpret hand movements accurately. By training machine learning models on this dataset, it is possible to create algorithms capable of recognizing specific hand gestures based on the measured distances. This can enable intuitive human-machine interaction and interfacing, particularly in domains such as virtual reality, augmented reality, and smart devices. Moreover, researchers interested in the biomechanics of hand movements or exploring the cultural significance of specific gestures can leverage this dataset to gain insights into the physical aspects of hand gestures and their variations across different individuals.

Instructions: 

1. Introduction

This dataset provides valuable insights into hand gestures and their associated measurements. Hand gestures play a significant role in human communication, and understanding their patterns and characteristics can enable various applications, such as gesture recognition systems, sign language interpretation, and human-computer interaction. This dataset was carefully collected by a specialist who captured snapshots of individuals making different hand gestures and measured specific distances between the fingers and the palm [1]. The dataset offers a comprehensive view of these measurements, allowing for further analysis and exploration of the relationships between different gestures and their corresponding hand measurements.

The dataset's potential applications are wide-ranging. For instance, it can be used to develop gesture recognition systems that can identify and interpret hand movements accurately. By training machine learning models on this dataset, it is possible to create algorithms capable of recognizing specific hand gestures based on the measured distances. This can enable intuitive human-machine interaction and interfacing, particularly in domains such as virtual reality, augmented reality, and smart devices. Moreover, researchers interested in the biomechanics of hand movements or exploring the cultural significance of specific gestures can leverage this dataset to gain insights into the physical aspects of hand gestures and their variations across different individuals.

2. Citation

Please cite the following papers when using this dataset:

[1] I. Siniosoglou, V. Argyriou, P. Sarigiannidis, T. Lagkas, A. Sarigiannidis, S. K. Goudos, and S. Wan, “Post-processing fairness evaluation of federated models: An unsupervised approach in healthcare,” IEEE/ACM Transactions on Computational Biology and Bioinformatics, pp. 1–12, 2023.

https://ieeexplore.ieee.org/document/10107702

3. Dataset Modalities 

The dataset provided contains a collection of measurements related to hand gestures. These measurements are crucial for understanding and analysing the movements and positions of the hand during various gestures. Each row represents a specific hand gesture, while the columns represent different measurements associated with the hand's anatomy. The measurements include distances between different fingers and the palm, as well as the distance between specific finger pairs. These measurements provide valuable insights into the intricate motions and positions of the hand, enabling researchers and practitioners to develop models and systems for hand gesture recognition and analysis. The dataset consists of five rows, each representing a different hand gesture, and provides a comprehensive set of measurements for each gesture, allowing for detailed analysis and interpretation.
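The row/column layout described above can be sketched as follows. Note that the column names used here (e.g. `thumb_palm`, `index_middle`) are illustrative assumptions, not the dataset's actual headers, and the sample values are synthetic:

```python
import io
import csv

# Hypothetical excerpt of the CSV layout (actual column names may differ);
# each row is one gesture, each non-label column a distance measurement.
sample_csv = """gesture,thumb_palm,index_palm,middle_palm,ring_palm,pinky_palm,thumb_index,index_middle,middle_ring,ring_pinky
fist,4.1,3.2,3.0,3.1,3.3,2.5,1.1,1.0,1.2
open_palm,9.8,11.5,12.1,11.4,9.9,7.2,2.4,2.1,2.3
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))
for row in rows:
    # Every column except the gesture label is a numeric distance.
    distances = {k: float(v) for k, v in row.items() if k != "gesture"}
    print(row["gesture"], "-> largest distance:", max(distances, key=distances.get))
```

Replacing `sample_csv` with the contents of one of the provided CSV files would give the same per-gesture view of the real measurements.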

3.1 Data Collection

The dataset provided consists of measurements related to hand gestures and was collected through a meticulous process by a specialist. The data collection process involved capturing snapshots of individuals performing various hand gestures and measuring the specific distances described in the dataset. 

The specialist would take precise measurements of the distances between different fingers and the palm of the hand. These measurements were likely obtained using digital measuring tools to ensure accuracy. 

The process of collecting this dataset involved meticulous attention to detail and precision. The specialist would record the distances between the thumb, index finger, middle finger, ring finger, and pinky finger to the palm of the hand. Additionally, distances between adjacent fingers, such as the thumb and index finger, index finger and middle finger, middle finger and ring finger, and ring finger and pinky finger, were also measured. These measurements provide a detailed understanding of the spatial relationships and proportions within hand gestures. It is important to note that the dataset contains measurements from various individuals, representing a diverse range of hand shapes and sizes to ensure its generalizability.
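The finger-to-palm and adjacent-finger distances described above are plain Euclidean distances between points on the hand. As a minimal sketch, assuming hypothetical 3D landmark coordinates (the dataset ships only the resulting distances, not raw landmarks):

```python
import math

# Hypothetical 3D landmark coordinates (in cm) for one snapshot.
landmarks = {
    "palm":   (0.0, 0.0, 0.0),
    "thumb":  (3.0, 4.0, 0.0),
    "index":  (1.0, 8.0, 0.0),
    "middle": (0.0, 9.0, 0.0),
    "ring":   (-1.0, 8.5, 0.0),
    "pinky":  (-2.0, 7.0, 0.0),
}

def dist(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

fingers = ["thumb", "index", "middle", "ring", "pinky"]
# Finger-to-palm distances, as described in the text.
to_palm = {f: dist(landmarks[f], landmarks["palm"]) for f in fingers}
# Adjacent-finger distances: thumb-index, index-middle, middle-ring, ring-pinky.
adjacent = {f"{a}_{b}": dist(landmarks[a], landmarks[b])
            for a, b in zip(fingers, fingers[1:])}
print(to_palm)
print(adjacent)
```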

The purpose of this data collection process is to gather information for training a machine learning model or conducting research on hand gestures to aid the interactivity of AR/VR applications. It is important to note that the dataset provided does not contain information about the exact process or the number of individuals involved in the data collection.

4. Dataset Federation Process

To partition the dataset, a different probability was assigned to each class for each of the three sub-datasets. The original dataset was then iterated over once, and each sample was moved to a split dataset according to the probability assigned to its class, so that no sample is repeated across splits.
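The single-pass, probability-per-class split described above can be sketched as follows. The class labels and per-class probabilities here are illustrative placeholders, not the values actually used to produce the three files:

```python
import random

# Illustrative per-class probabilities: P(split 0), P(split 1), P(split 2).
split_probs = {
    "fist":      [0.7, 0.2, 0.1],
    "open_palm": [0.1, 0.7, 0.2],
    "point":     [0.2, 0.1, 0.7],
}

def federate(samples, probs, seed=0):
    """Assign each (features, label) sample to exactly one of three splits
    in a single pass, weighted by its class's probabilities (no repetition)."""
    rng = random.Random(seed)
    splits = [[], [], []]
    for sample in samples:
        label = sample[-1]
        idx = rng.choices([0, 1, 2], weights=probs[label])[0]
        splits[idx].append(sample)
    return splits

# Synthetic stand-in samples: 300 per class.
samples = [((i,), lbl) for i in range(300) for lbl in ("fist", "open_palm", "point")]
splits = federate(samples, split_probs)
print([len(s) for s in splits])
```

Because each sample is assigned to exactly one split, the three splits partition the original dataset, and the skewed per-class weights produce the non-i.i.d. distributions described in Section 5.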

More information can be found in the ReadMe file.

5. Use of the Dataset

The dataset is separated into three sub-datasets, each containing a different distribution of the provided classes. Each sub-dataset is assigned to one node. In particular:

Node     Dataset

Node 0   gestures_s_split0.csv

Node 1   gestures_s_split1.csv

Node 2   gestures_s_split2.csv

Based on this, each node will hold its own local dataset with a non-i.i.d. distribution with respect to the other nodes, on which the classification performance of Federated Learning can be measured.
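One quick way to confirm the non-i.i.d. property is to compare the per-node label distribution after loading each split. The file names follow the table above, but the label counts below are synthetic stand-ins, since the CSV contents are not reproduced here:

```python
from collections import Counter

# Synthetic stand-in labels per split file; in practice these would be read
# from the "gesture" column of each CSV on the corresponding node.
node_labels = {
    "gestures_s_split0.csv": ["fist"] * 70 + ["open_palm"] * 20 + ["point"] * 10,
    "gestures_s_split1.csv": ["fist"] * 10 + ["open_palm"] * 70 + ["point"] * 20,
    "gestures_s_split2.csv": ["fist"] * 20 + ["open_palm"] * 10 + ["point"] * 70,
}

for node, labels in node_labels.items():
    # Each node's distribution is dominated by a different class (non-i.i.d.).
    print(node, dict(Counter(labels)))
```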

6. Acknowledgement

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 957406 (TERMINET).

Funding Agency: 
H2020
Grant Number: 
957406