Archive | 2019
Augmenting EEG with Inertial Sensing for Improved 4-Class Subject-Independent Emotion Classification in Virtual Reality
Abstract
This investigation reports promising results from the novel use of inertial sensing data to augment electroencephalography (EEG)-based subject-independent classification, into four classes, of emotions elicited by virtual reality stimuli. Thirty-one users were shown virtual reality scenes designed to elicit responses in the four quadrants of the emotional space defined by Russell’s Circumplex Model of Affect. Prior studies in emotion classification can be broadly grouped along two methodological delineations: (1) whether the classification is binary (i.e., two-class) or otherwise (e.g., three-class, four-class, or more), and (2) whether training and testing occur within the same participant (subject-dependent classification) or across different participants (subject-independent classification). Because ternary/quaternary, subject-independent classification is significantly more difficult, the large majority of emotion modeling studies that report high accuracy rates adopt the binary, subject-dependent approach. This study, however, attempts the more challenging four-class, subject-independent classification. The EEG signals, accelerometer data, and gyroscope data were acquired through a wearable brain-computer interface device called Muse. Raw signals as well as power spectral density features were extracted from the EEG and, together with the inertial sensing data (the first known use of such data for emotion classification), were used as input to a deep neural network. Classification results show that without inertial sensing data, inter-subject classification was indeed highly challenging even for a deep learning system, with four-class accuracy of 26–27%, only slightly better than the 25% chance level. Augmenting with inertial sensing data, however, improved the classification accuracy to 40–47%.
As such, this work demonstrates the potential of inertial sensing as an additional modality for emotion modeling.

Keywords: inertial sensing, emotion classification, virtual reality, 4-quadrant emotion recognition, electroencephalography.
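The feature pipeline summarized in the abstract (per-band power spectral density features from the EEG channels, concatenated with inertial sensing features) can be sketched as follows. This is a minimal illustrative sketch only: the channel count, sampling rates, frequency-band definitions, inertial statistics, and all function names are assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical sketch: EEG band-power (PSD) features plus simple
# accelerometer/gyroscope statistics, concatenated into one feature
# vector for a downstream classifier. All constants are assumed.

FS = 256  # assumed EEG sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 44)}

def band_powers(eeg_window, fs=FS):
    """Mean PSD per frequency band for one EEG channel (periodogram estimate)."""
    n = len(eeg_window)
    psd = np.abs(np.fft.rfft(eeg_window)) ** 2 / (fs * n)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in BANDS.values()])

def inertial_stats(imu_window):
    """Per-axis mean and standard deviation for the accel/gyro window."""
    return np.concatenate([imu_window.mean(axis=0), imu_window.std(axis=0)])

def make_feature_vector(eeg_channels, imu_window):
    """Concatenate EEG band powers (all channels) with inertial statistics."""
    eeg_feats = np.concatenate([band_powers(ch) for ch in eeg_channels])
    return np.concatenate([eeg_feats, inertial_stats(imu_window)])

# Example: 4 EEG channels, 1 s windows; 6-axis IMU window at an assumed 52 Hz
rng = np.random.default_rng(0)
eeg = rng.standard_normal((4, FS))
imu = rng.standard_normal((52, 6))
features = make_feature_vector(eeg, imu)
print(features.shape)  # 4 channels x 4 bands + 6 means + 6 stds = (28,)
```

The resulting vector would feed the deep neural network described in the abstract; windowing, artifact rejection, and normalization steps are omitted here.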