A General and Scalable Vision Framework for Functional Near-Infrared Spectroscopy Classification
Main Authors: | , , , , |
---|---|
Format: | Article |
Published: | IEEE, 2022 |
Subjects: | |
Online Access: | Connect to this object online. |
Summary: | Functional near-infrared spectroscopy (fNIRS), a non-invasive optical technique, is widely used to monitor brain activity for disease diagnosis and brain-computer interfaces (BCIs). Deep learning-based fNIRS classification faces three major barriers: limited datasets, confusing evaluation criteria, and domain barriers. We apply more appropriate evaluation methods to three open-access datasets to address the first two barriers. To address the domain barrier, we propose a general and scalable vision fNIRS framework that converts multi-channel fNIRS signals into multi-channel virtual images using the Gramian angular difference field (GADF). With this framework, state-of-the-art visual models from computer vision (CV) can be trained within a few minutes, and their classification performance is competitive with the latest fNIRS models. In cross-validation experiments, the visual models achieve the highest average classification results of 78.68% and 73.92% on the mental arithmetic and word generation tasks, respectively. Although the visual models score slightly lower than the fNIRS models on the unilateral finger- and foot-tapping tasks, the F1-score and kappa coefficient indicate that these differences are insignificant in subject-independent experiments. Furthermore, we study fNIRS signal representations and the classification performance of sequence-to-image methods. We hope to introduce the rich achievements of the CV domain to improve fNIRS classification research. |
Item Description: | ISSN: 1558-0210; DOI: 10.1109/TNSRE.2022.3190431 |
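
The summary above describes converting each fNIRS channel into a virtual image with the Gramian angular difference field (GADF). As a rough, minimal sketch of that general idea (not the authors' implementation), the following NumPy snippet rescales a single channel to [-1, 1], maps samples to polar angles, and builds the pairwise sin(phi_i - phi_j) matrix; the channel count, sample length, and per-channel stacking layout are illustrative assumptions.

```python
import numpy as np

def gadf(signal):
    """Gramian angular difference field of a 1-D time series (minimal sketch)."""
    x = np.asarray(signal, dtype=float)
    # Min-max rescale to [-1, 1]; a constant channel maps to all zeros.
    rng = x.max() - x.min()
    x = np.zeros_like(x) if rng == 0 else 2.0 * (x - x.min()) / rng - 1.0
    # Polar encoding: cos(phi) = x, hence sin(phi) = sqrt(1 - x^2).
    sin_phi = np.sqrt(np.clip(1.0 - x ** 2, 0.0, 1.0))
    # GADF[i, j] = sin(phi_i - phi_j) = sin(phi_i)*cos(phi_j) - cos(phi_i)*sin(phi_j)
    return np.outer(sin_phi, x) - np.outer(x, sin_phi)

# Hypothetical fNIRS trial: 16 channels x 128 time samples of toy data.
trial = np.random.randn(16, 128)
# Stack one GADF image per channel into a multi-channel "virtual image",
# shape (16, 128, 128), that a standard CV model could take as input.
virtual_image = np.stack([gadf(ch) for ch in trial])
print(virtual_image.shape)
```

Stacking one GADF image per channel yields an image-like tensor that off-the-shelf CV backbones can consume, which is the kind of sequence-to-image representation the summary refers to.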