Independent bilateral-eye stimulation for gaze pattern recognition based on steady-state pupil light reflex

Citation Author(s):
Ariki Sato, Tottori University
Shintaro Nakatani, Tottori University
Submitted by:
Shintaro Nakatani
Last updated:
Mon, 03/21/2022 - 23:03
DOI:
10.21227/m3xq-nr87

Abstract 

Objective: Recently, pupil oscillation synchronized with a steady visual stimulus has been employed as an input for an interface. The system is inspired by steady-state visual evoked potential (SSVEP) BCIs, but it eliminates the need for contact with the participant because it does not require electrodes to record an electroencephalogram. However, the stimulation frequency is restricted to below 2.5 Hz by the mechanics of pupillary vibration, and the information transfer rate (ITR) is therefore lower than that of SSVEP BCIs. Here, we propose a signal mixture method for stimulation with different frequency patterns to increase the ITR of an interface based on pupil oscillation. Methods: Pupil responses were compared between two conditions: (1) the left and right eyes were stimulated with different frequencies and the signals were mixed within the participant's nervous system, and (2) signals with different frequencies were mixed in advance and presented to both eyes as a common stimulus. Conclusion: With stimulation (2), the beat frequency, which is the difference between the input frequencies, contaminated the pupil response signal. With (1), independent stimulation, the power at the beat frequency was lower and thus did not affect the response signal as much. From these results, we developed a target recognition system with 15 targets composed of 3 frequency components using the independent stimulation method. The mean classification accuracy was 78.3% and the ITR was 14.0 bits/min. Significance: This method is functionally compatible with SSVEP-based BCIs and requires neither physical contact nor a calibration paradigm. Therefore, this study contributes a useful computer interface for patients with motor impairments.

Instructions: 

Independent bilateral-eye stimulation for gaze pattern recognition based on steady-state pupil light reflex

Data

CSV files of experiment 1 are in ./ex1/data

CSV files of experiment 2 are in ./ex2/data

CSV files of experiment 3 are in ./ex3/data

The data files are divided into directories, one for each participant. Each CSV file records the pupil diameter [mm] of the left eye. Where data are lost, e.g. due to a blink, the value is -1.
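As a minimal sketch of how the -1 markers can be handled before analysis (the file path and single-column layout below are assumptions for illustration; adjust them to the actual files):

import numpy as np
import pandas as pd

# Hypothetical example path; replace with an actual file from ./ex1/data
csv_path = "ex1/data/P1/example.csv"

# Assume one column of left-eye pupil diameter [mm]
pupil = pd.read_csv(csv_path, header=None).iloc[:, 0].to_numpy(dtype=float)

# Samples lost to blinks etc. are recorded as -1; mask and linearly interpolate them
pupil[pupil == -1] = np.nan
valid = ~np.isnan(pupil)
pupil = np.interp(np.arange(len(pupil)), np.flatnonzero(valid), pupil[valid])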

Format

*.csv, *.py, *.pdf

Analysis

All python scripts for the analysis of experiment 1 are in ./ex1/analysis1.

All python scripts for the analysis of experiment 2 are in ./ex2/analysis2.

All python scripts for the analysis of experiment 3 are in ./ex3/analysis3.

analysis1

Python script "psd_fft.py" analyzes power spectrum density (PSD) for each participant. Please input a participant ID (e.g. P1).

Python script "psd_fft.py" generates temporary csv file as "{data_name}.csv" under 'BSR' directory.

Python script "helper_csv.py" puts the csv files generated by "psd_fft.py" together and save it under '/ex1/PSD/'.

Python script "all_psd_graph.py" plots PSD for each stimulus condition.

Python script "dB_BSR_getter.py" calculates signal-to-beat ratio and generate the csv file under '/ex1/BSR/'.

Python script "dB_BSR_box.py" plots the signal-to-beat ratio for each stimulus condition.

analysis2

Python script "psd_fft.py" analyzes PSD for each subject. Please input a participant ID (e.g. P1). This script calculates PSD of all data files and saves them under separate directory for each stimulus condition.

Python script "helper_take_peak_psd.py" reads the PSDs generated by "psd_fft.py".

Python script "PSD_bar_graph.py" plots the PSD bar graphs.

Python script "spectrogram_allcsv.py" plots the spectrogram by using Burg method.

Python script "classification_clossval.py" calculates accuracies by using cross-validation.

Python script "acc_graph.py" plots the different accuracies for the different classification times.

Python script "ITR_graph.py" plots the different information transfer rates (ITR) for the different classification times.

analysis3

Python script "re-classification.py" outputs the accuracy and information transfer rates (ITR), and plots the estimation confusion matrix.

Note

Please keep the directory structure. All generated figures are saved under the '/ex1/figure/' or '/ex2/figure/'.

Funding Agency: 
The Mazda Foundation