IEEE Transactions on Instrumentation and Measurement | 2021

Recognizing Emotional States With Wearables While Playing a Serious Game


Abstract


In this study, we propose the use of electroencephalography (EEG), electrooculography (EOG), and kinematic motion data captured through wearable sensors to classify emotional states while individuals are playing a serious computer game (Whack-a-Mole). Twenty-one participants wore an OpenBCI headset and JINS MEME eyewear while playing the Whack-a-Mole game at three levels of difficulty. We used a variety of classifiers [i.e., a support vector machine (SVM), logistic regression (LR), random forest (RF), and an ensemble classifier (EC)] to classify the participants’ emotional states based on their EEG, EOG, and kinematic motion data. The classifiers were trained using the International Affective Picture System (IAPS). The EC and RF achieved the best overall performance. Using tenfold cross-validation across all subjects, the accuracies obtained were 73% for Arousal and 80% for Valence. Our results suggest that EEG and EOG biosignals, as well as kinematic motion data acquired using off-the-shelf wearable sensors, in combination with machine-learning techniques such as EC, can be used to classify emotional states while individuals play the Whack-a-Mole game.
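As an illustration only, and not the authors' implementation, the sketch below shows how an ensemble of SVM, LR, and RF classifiers could be evaluated with tenfold cross-validation, assuming scikit-learn is available. The feature matrix X and binary arousal labels y_arousal are hypothetical placeholders standing in for the EEG/EOG/kinematic features and IAPS-derived labels described in the abstract.

    # Minimal sketch: soft-voting ensemble of SVM, LR, and RF with 10-fold CV.
    # Assumes scikit-learn; data below are random placeholders, not real signals.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hypothetical data: each row is a window of EEG/EOG/kinematic features,
    # y_arousal holds binary high/low arousal labels derived from IAPS ratings.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(420, 64))            # placeholder feature matrix
    y_arousal = rng.integers(0, 2, size=420)  # placeholder labels

    ensemble = VotingClassifier(
        estimators=[
            ("svm", make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))),
            ("lr", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
            ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ],
        voting="soft",
    )

    # 10-fold cross-validated accuracy, analogous to the evaluation protocol
    # reported in the abstract (the actual feature extraction is not shown here).
    scores = cross_val_score(ensemble, X, y_arousal, cv=10, scoring="accuracy")
    print(f"10-fold accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")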

Volume 70
Pages 1-12
DOI 10.1109/TIM.2021.3059467
Language English
Journal IEEE Transactions on Instrumentation and Measurement

Full Text