Multimodal Emotion Recognition Based on EEG and EOG Signals Evoked by the Video-Odor Stimuli

Affective data are the basis of emotion recognition and are mainly acquired through extrinsic elicitation. To investigate the enhancing effects of multi-sensory stimuli on emotion elicitation and emotion recognition, we designed an experimental paradigm involving the visual, auditory, and olfactory senses. We created a multimodal emotional dataset (OVPD-II) that employed video-only or video-odor patterns as stimulus materials and recorded electroencephalogram (EEG) and electrooculogram (EOG) signals. The feedback reported by subjects after each trial demonstrated that the video-odor pattern outperformed the video-only pattern in evoking individuals' emotions. To further validate the efficiency of the video-odor pattern, a transformer was employed to perform the emotion recognition task, where the highest accuracy reached 86.65% (66.12%) for the EEG (EOG) modality with the video-odor pattern, an improvement of 1.42% (3.43%) over the video-only pattern. Moreover, a hybrid fusion (HF) method combining the transformer with joint training was developed to improve emotion recognition performance, achieving classification accuracies of 89.50% and 88.47% for the video-odor and video-only patterns, respectively.

Bibliographic Details
Main Authors: Minchao Wu (Author), Wei Teng (Author), Cunhang Fan (Author), Shengbing Pei (Author), Ping Li (Author), Guanxiong Pei (Author), Taihao Li (Author), Wen Liang (Author), Zhao Lv (Author)
Format: Article
Published: IEEE, 2024.
Subjects:
Online Access: Connect to this object online.

MARC

LEADER 00000 am a22000003u 4500
001 doaj_ccf8b1eba83c40a1b81e420b42f23ef9
042 |a dc 
100 1 0 |a Minchao Wu  |e author 
700 1 0 |a Wei Teng  |e author 
700 1 0 |a Cunhang Fan  |e author 
700 1 0 |a Shengbing Pei  |e author 
700 1 0 |a Ping Li  |e author 
700 1 0 |a Guanxiong Pei  |e author 
700 1 0 |a Taihao Li  |e author 
700 1 0 |a Wen Liang  |e author 
700 1 0 |a Zhao Lv  |e author 
245 0 0 |a Multimodal Emotion Recognition Based on EEG and EOG Signals Evoked by the Video-Odor Stimuli 
260 |b IEEE,   |c 2024-01-01T00:00:00Z. 
500 |a 1534-4320 
500 |a 1558-0210 
500 |a 10.1109/TNSRE.2024.3457580 
520 |a Affective data are the basis of emotion recognition and are mainly acquired through extrinsic elicitation. To investigate the enhancing effects of multi-sensory stimuli on emotion elicitation and emotion recognition, we designed an experimental paradigm involving the visual, auditory, and olfactory senses. We created a multimodal emotional dataset (OVPD-II) that employed video-only or video-odor patterns as stimulus materials and recorded electroencephalogram (EEG) and electrooculogram (EOG) signals. The feedback reported by subjects after each trial demonstrated that the video-odor pattern outperformed the video-only pattern in evoking individuals' emotions. To further validate the efficiency of the video-odor pattern, a transformer was employed to perform the emotion recognition task, where the highest accuracy reached 86.65% (66.12%) for the EEG (EOG) modality with the video-odor pattern, an improvement of 1.42% (3.43%) over the video-only pattern. Moreover, a hybrid fusion (HF) method combining the transformer with joint training was developed to improve emotion recognition performance, achieving classification accuracies of 89.50% and 88.47% for the video-odor and video-only patterns, respectively. 
546 |a EN 
690 |a Electroencephalogram (EEG) 
690 |a electrooculogram (EOG) 
690 |a emotion recognition 
690 |a video-odor stimuli 
690 |a multi-modal fusion 
690 |a Medical technology 
690 |a R855-855.5 
690 |a Therapeutics. Pharmacology 
690 |a RM1-950 
655 7 |a article  |2 local 
786 0 |n IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol 32, Pp 3496-3505 (2024) 
787 0 |n https://ieeexplore.ieee.org/document/10672559/ 
787 0 |n https://doaj.org/toc/1534-4320 
787 0 |n https://doaj.org/toc/1558-0210 
856 4 1 |u https://doaj.org/article/ccf8b1eba83c40a1b81e420b42f23ef9  |z Connect to this object online.