Self-Supervised Learning for Label-Efficient Sleep Stage Classification: A Comprehensive Evaluation

The past few years have witnessed remarkable advances in deep learning for EEG-based sleep stage classification (SSC). However, the success of these models hinges on the availability of a massive amount of labeled training data, limiting their applicability in real-world scenarios. In such scenarios, sleep labs can generate a massive amount of data, but labeling it is expensive and time-consuming. Recently, the self-supervised learning (SSL) paradigm has emerged as one of the most successful techniques for overcoming label scarcity. In this paper, we evaluate the efficacy of SSL in boosting the performance of existing SSC models in the few-labels regime. We conduct a thorough study on three SSC datasets and find that fine-tuning the pretrained SSC models with only 5% of the labeled data achieves performance competitive with fully supervised training on the full labels. Moreover, self-supervised pretraining makes SSC models more robust to data imbalance and domain shift.
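The few-labels regime described above amounts to pretraining without labels and then fine-tuning on a small labeled fraction. As a minimal sketch of how a 5% labeled subset might be drawn for such an experiment (the helper name and the stratified sampling are illustrative assumptions, not the authors' actual data pipeline):

```python
import numpy as np

def stratified_label_subset(labels, fraction=0.05, seed=0):
    """Pick a stratified `fraction` of indices, preserving per-class balance.

    Hypothetical helper illustrating the few-labels regime; stratification
    is one reasonable way to draw the 5% split, not necessarily the paper's.
    """
    rng = np.random.default_rng(seed)
    chosen = []
    for cls in np.unique(labels):
        idx = np.flatnonzero(labels == cls)          # indices of this class
        n = max(1, int(round(fraction * idx.size)))  # at least one per class
        chosen.append(rng.choice(idx, size=n, replace=False))
    return np.sort(np.concatenate(chosen))

# Example: 1000 EEG epochs across 5 sleep stages (W, N1, N2, N3, REM)
labels = np.repeat(np.arange(5), 200)
subset = stratified_label_subset(labels, fraction=0.05)
print(subset.size)  # 50 -> 5% of 1000, 10 per stage
```

Only the indices in `subset` would carry labels during fine-tuning; the full unlabeled recording set remains available for self-supervised pretraining.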


Bibliographic Details
Main Authors: Emadeldeen Eldele (Author), Mohamed Ragab (Author), Zhenghua Chen (Author), Min Wu (Author), Chee-Keong Kwoh (Author), Xiaoli Li (Author)
Format: Book
Published: IEEE, 2023.
Subjects:
Online Access: Connect to this object online.

MARC

LEADER 00000 am a22000003u 4500
001 doaj_dfb8d05250534f0e9b7da3d9dfbcc7a0
042 |a dc 
100 1 0 |a Emadeldeen Eldele  |e author 
700 1 0 |a Mohamed Ragab  |e author 
700 1 0 |a Zhenghua Chen  |e author 
700 1 0 |a Min Wu  |e author 
700 1 0 |a Chee-Keong Kwoh  |e author 
700 1 0 |a Xiaoli Li  |e author 
245 0 0 |a Self-Supervised Learning for Label-Efficient Sleep Stage Classification: A Comprehensive Evaluation 
260 |b IEEE,   |c 2023-01-01T00:00:00Z. 
500 |a 1558-0210 
500 |a 10.1109/TNSRE.2023.3245285 
520 |a The past few years have witnessed remarkable advances in deep learning for EEG-based sleep stage classification (SSC). However, the success of these models hinges on the availability of a massive amount of labeled training data, limiting their applicability in real-world scenarios. In such scenarios, sleep labs can generate a massive amount of data, but labeling it is expensive and time-consuming. Recently, the self-supervised learning (SSL) paradigm has emerged as one of the most successful techniques for overcoming label scarcity. In this paper, we evaluate the efficacy of SSL in boosting the performance of existing SSC models in the few-labels regime. We conduct a thorough study on three SSC datasets and find that fine-tuning the pretrained SSC models with only 5% of the labeled data achieves performance competitive with fully supervised training on the full labels. Moreover, self-supervised pretraining makes SSC models more robust to data imbalance and domain shift. 
546 |a EN 
690 |a Sleep stage classification 
690 |a EEG 
690 |a self-supervised learning 
690 |a label-efficient learning 
690 |a Medical technology 
690 |a R855-855.5 
690 |a Therapeutics. Pharmacology 
690 |a RM1-950 
655 7 |a article  |2 local 
786 0 |n IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol 31, Pp 1333-1342 (2023) 
787 0 |n https://ieeexplore.ieee.org/document/10044720/ 
787 0 |n https://doaj.org/toc/1558-0210 
856 4 1 |u https://doaj.org/article/dfb8d05250534f0e9b7da3d9dfbcc7a0  |z Connect to this object online.