Digit. Signal Process. | 2021

Emotion recognition based on fusion of long short-term memory networks and SVMs


Abstract


This paper proposes a multimodal emotion recognition method based on Dempster-Shafer evidence theory, fusing electroencephalogram (EEG) and electrocardiogram (ECG) signals. For EEG, an SVM classifier is used to classify the extracted features; for ECG, a Bi-directional Long Short-Term Memory (Bi-LSTM) network is built for emotion recognition, and its output is fused with the EEG classification results through evidence theory. We selected 25 video clips covering five emotions (happy, relaxed, angry, sad, and disgusted), and a total of 20 subjects participated in the emotion experiment. The experimental results show that the proposed multimodal fusion model outperforms the single-modal emotion recognition models: in the Arousal and Valence dimensions, the average accuracy improves by 2.64% and 2.75% over the EEG-based model, and by 7.37% and 8.73% over the ECG-based model.
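The fusion step described above rests on Dempster's rule of combination, which merges two sources' belief assignments over the same hypothesis set. A minimal sketch for singleton hypotheses (one mass per emotion label) is given below; the label set, probability values, and function name are illustrative, not the paper's actual implementation:

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over singleton hypotheses via Dempster's rule."""
    labels = m1.keys()
    # Conflict K: total mass the two sources assign to incompatible hypotheses.
    k = sum(m1[a] * m2[b] for a in labels for b in labels if a != b)
    if k >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    # Agreeing mass on each hypothesis, renormalized by (1 - K).
    return {a: (m1[a] * m2[a]) / (1.0 - k) for a in labels}

# Hypothetical modality-level posteriors for the five emotions:
# EEG branch (SVM) and ECG branch (Bi-LSTM).
eeg = {"happy": 0.5, "relaxed": 0.2, "angry": 0.1, "sad": 0.1, "disgusted": 0.1}
ecg = {"happy": 0.4, "relaxed": 0.3, "angry": 0.1, "sad": 0.1, "disgusted": 0.1}

fused = dempster_combine(eeg, ecg)
decision = max(fused, key=fused.get)  # → "happy"
```

Because agreement between the two modalities is rewarded and conflicting mass is discarded, the fused distribution is sharper than either input when the sources concur, which is what allows the combined model to beat each single modality.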

Volume 117
Pages 103153
DOI 10.1016/j.dsp.2021.103153
Language English
Journal Digit. Signal Process.
