60 GHz FMCW Radar Gesture Dataset

Citation Author(s):
Sarah Seifi (Infineon Technologies AG; Technical University Munich)
Tobias Sukianto (Infineon Technologies AG; Johannes Kepler University)
Cecilia Carbonelli (Infineon Technologies AG)
Submitted by:
Sarah Seifi
Last updated:
Sun, 03/03/2024 - 14:10
DOI:
10.21227/s12w-cc46

Abstract 

As the field of human-computer interaction continues to evolve, there is a growing need for new methods of gesture recognition that can be used in a variety of applications, from gaming and entertainment to healthcare and robotics. While traditional methods of gesture recognition rely on cameras or other optical sensors, these systems can be limited by factors such as lighting conditions and occlusions.

To address these challenges, we have developed a new gesture dataset based on radar sensing technology. The dataset includes 21,000 carefully selected gestures, recorded using the BGT60TR13C XENSIV™ 60 GHz frequency-modulated continuous-wave (FMCW) radar sensor.

We believe that this dataset will be an invaluable resource for researchers and developers working in the field of gesture recognition. By providing a large and diverse set of radar-based gesture data, we hope to enable the development of new and innovative applications that can enhance human-computer interaction in a wide range of domains.

Instructions: 

The radar system was configured with an operational frequency range spanning from 58.5 GHz to 62.5 GHz, providing a range resolution of 37.5 mm and the ability to resolve targets at a maximum range of 1.2 meters. For signal transmission, the radar employed a burst configuration comprising 32 chirps per burst with a frame rate of 33 Hz and a pulse repetition time of 300 µs.
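For readers who want to relate these parameters, the quoted range resolution and maximum range are consistent with the standard FMCW range relations. The short sketch below reproduces them, assuming a 4 GHz sweep bandwidth and 64 real-valued samples per chirp (the sample count is taken from the array shape described further down); it is a sanity check, not part of the dataset tooling.

```python
# Minimal sanity check of the quoted radar parameters using standard FMCW relations.
C = 3e8                 # speed of light, m/s
B = 62.5e9 - 58.5e9     # sweep bandwidth: 4 GHz

range_resolution = C / (2 * B)                  # c / (2B) = 0.0375 m = 37.5 mm
num_range_bins = 64 // 2                        # assuming real-valued samples, 64 samples -> 32 usable range bins
max_range = num_range_bins * range_resolution   # 32 * 37.5 mm = 1.2 m

print(f"Range resolution: {range_resolution * 1000:.1f} mm")
print(f"Maximum range:    {max_range:.2f} m")
```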

The gestures were performed by eight individuals in six different locations, within a field of view of ±45° and at a distance of ≤1 meter from the radar. The dataset covers five gesture types: Swipe Left, Swipe Right, Swipe Up, Swipe Down, and Push, with an average duration of 0.5 seconds, or ten frames, per gesture recording. The total size of the dataset is 48.1 GB.

The indoor locations were a gym, a library, a kitchen, a bedroom, a shared office room, and a closed meeting room.

Each gesture sample is saved as a 4-dimensional numpy array of shape 100x3x32x64, where the first dimension is the frame length of the recording, the second is the number of virtual antennas, the third is the number of chirps per frame, and the fourth is the number of samples per chirp. Each numpy file is named GestureName_EnvironmentLabel_UserLabel_SampleLabel.npy, encoding the gesture name, the environment label, the user label, and the sample number.
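As a minimal illustration of this layout, a sample can be loaded with NumPy as follows. The file name used here is hypothetical; only the array shape and naming scheme follow the description above.

```python
import numpy as np

# Hypothetical example file following the scheme
# GestureName_EnvironmentLabel_UserLabel_SampleLabel.npy
# (the exact sample-label format is an assumption).
path = "SwipeLeft_e1_p1_s1.npy"

data = np.load(path)                       # expected shape: (100, 3, 32, 64)
frames, antennas, chirps, samples = data.shape
print(frames, antennas, chirps, samples)   # 100 3 32 64

# For example, the raw chirps seen by the first virtual antenna in frame 0:
first_frame_rx0 = data[0, 0]               # shape: (32, 64)
```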


The environmental labels are defined as follows:

- e1: closed-space meeting room
- e2: open-space office room
- e3: library
- e4: kitchen
- e5: exercise room
- e6: bedroom


The user labels are defined as follows:

- p1: Male
- p2: Female
- p3: Female
- p4: Male
- p5: Male
- p6: Male
- p7: Male
- p8: Male
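For convenience, a small helper along these lines can map a file name back to its labels. This is only a sketch: it assumes gesture names are written without underscores and that the sample label looks like "s1", neither of which is specified above.

```python
import os

# Label tables copied from the dataset description above.
ENVIRONMENTS = {
    "e1": "closed-space meeting room",
    "e2": "open-space office room",
    "e3": "library",
    "e4": "kitchen",
    "e5": "exercise room",
    "e6": "bedroom",
}

USERS = {
    "p1": "Male", "p2": "Female", "p3": "Female", "p4": "Male",
    "p5": "Male", "p6": "Male", "p7": "Male", "p8": "Male",
}

def parse_filename(path):
    """Split GestureName_EnvironmentLabel_UserLabel_SampleLabel.npy into its parts.

    Assumes the gesture name itself contains no underscores.
    """
    stem = os.path.splitext(os.path.basename(path))[0]
    gesture, env, user, sample = stem.split("_")
    return {
        "gesture": gesture,
        "environment": ENVIRONMENTS.get(env, env),
        "user": USERS.get(user, user),
        "sample": sample,
    }

print(parse_filename("SwipeLeft_e1_p1_s1.npy"))  # hypothetical file name
```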