IEEE Computational Intelligence Magazine | 2019

Sensing Affective Response to Visual Narratives


Abstract


This paper introduces a multimodal approach for detecting individuals' affective states while they are exposed to visual narratives. We use four modalities, namely visual facial behaviors, heart rate measurements, thermal imaging, and verbal descriptions, and show that we can predict changes in the affect that people experience when exposed to positive or negative audio-visual stimuli. We conduct experiments that aim to predict the presence of an affective response during exposure to visual narratives, as well as to distinguish between positive and negative affect valence. Extensive feature analyses and experiments on predicting the presence of affect demonstrate how the four modalities we explore effectively complement each other.
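
As a rough illustration of this kind of multimodal setup, the sketch below shows early (feature-level) fusion: per-modality feature vectors are concatenated and fed to a single binary classifier for the presence of an affective response. All feature names, dimensions, and labels here are hypothetical placeholders, not the features or models used in the paper; the valence task could be handled the same way by swapping in positive/negative labels.

# Minimal sketch of early (feature-level) fusion for affect detection,
# assuming one fixed-length feature vector per modality per session.
# Feature names and dimensions are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 60  # hypothetical number of participant sessions

# Placeholder per-modality features (replace with real extracted features)
facial  = rng.normal(size=(n, 20))  # e.g., facial behavior statistics
heart   = rng.normal(size=(n, 5))   # e.g., heart rate mean/variability
thermal = rng.normal(size=(n, 8))   # e.g., facial temperature changes
verbal  = rng.normal(size=(n, 50))  # e.g., lexical features of descriptions

# Binary label: 1 = affective response present, 0 = absent (illustrative)
y = rng.integers(0, 2, size=n)

# Early fusion: concatenate modality features into a single vector
X = np.hstack([facial, heart, thermal, verbal])

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")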

Volume 14
Pages 54-66
DOI 10.1109/MCI.2019.2901086
Language English
Journal IEEE Computational Intelligence Magazine
