
Publication


Featured research published by Stefanie Rukavina.


Acta Psychologica | 2010

Expression intensity, gender and facial emotion recognition: Women recognize only subtle facial emotions better than men.

Holger Hoffmann; Henrik Kessler; Tobias Eppel; Stefanie Rukavina; Harald C. Traue

Two experiments were conducted in order to investigate the effect of expression intensity on gender differences in the recognition of facial emotions. The first experiment compared recognition accuracy between female and male participants when emotional faces were shown with full-blown (100% emotional content) or subtle expressiveness (50%). In a second experiment, more fine-grained analyses were applied in order to measure recognition accuracy as a function of expression intensity (40%-100%). The results show that although women were more accurate than men in recognizing subtle facial displays of emotion, there was no difference between male and female participants when recognizing highly expressive stimuli.


PLOS ONE | 2016

Affective Computing and the Impact of Gender and Age

Stefanie Rukavina; Sascha Gruss; Holger Hoffmann; Jun-Wen Tan; Steffen Walter; Harald C. Traue

Affective computing aims at the detection of users' mental states, in particular emotions and dispositions, during human-computer interactions. Detection can be achieved by measuring multimodal signals, namely speech, facial expressions and/or psychobiology. Over the past years, one major approach has been to identify the best features for each signal using different classification methods. Although this is of high priority, other subject-specific variables should not be neglected. In our study, we analyzed the effect of gender, age, personality and gender roles on the extracted psychobiological features (derived from skin conductance level, facial electromyography and heart rate variability) as well as their influence on the classification results. In an experimental human-computer interaction, five different affective states were induced with picture material from the International Affective Picture System and Ulm pictures. A total of 127 subjects participated in the study. Among all potentially influencing variables (gender has previously been reported to be influential), age was the only variable that correlated significantly with psychobiological responses. In summary, the conducted classification processes resulted in classification accuracy differences of up to 20% according to age and gender, especially when comparing the neutral condition with the four other affective states. We suggest taking age and gender specifically into account in future studies in affective computing, as this may lead to an improvement of emotion recognition accuracy.
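The study above derives psychobiological features from, among other signals, heart rate variability. A minimal sketch of such features is shown below; the feature names (RMSSD, SDNN) are standard HRV measures, but the study's exact feature set is not specified here, so this is illustrative only.

```python
# Sketch: simple heart-rate-variability features of the kind derived from a
# heart-rate signal, computed from RR intervals in milliseconds.
# Illustrative assumption: the example RR series below is invented.
import numpy as np

def hrv_features(rr_ms):
    """Return common time-domain HRV features from RR intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)
    return {
        "mean_rr": rr.mean(),                   # mean RR interval (ms)
        "sdnn": rr.std(ddof=1),                 # overall variability
        "rmssd": np.sqrt(np.mean(diffs ** 2)),  # short-term variability
        "mean_hr": 60000.0 / rr.mean(),         # beats per minute
    }

rr = [812, 798, 830, 845, 790, 804, 821]        # hypothetical RR series (ms)
feats = hrv_features(rr)
print({k: round(v, 1) for k, v in feats.items()})
```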


PLOS ONE | 2016

Recognition of Intensive Valence and Arousal Affective States via Facial Electromyographic Activity in Young and Senior Adults.

Jun-Wen Tan; Adriano O. Andrade; Hang Li; Steffen Walter; David Hrabal; Stefanie Rukavina; Kerstin Limbrecht-Ecklundt; Holger Hoffmann; Harald C. Traue

Background: Research suggests that interaction between humans and digital environments constitutes a form of companionship in addition to technical convenience. To this end, attempts have been made to design computer systems able to demonstrably empathize with the human affective experience. Facial electromyography (EMG) is one such technique enabling machines to access human affective states. Numerous studies have investigated the effects of valence emotions on facial EMG activity captured over the corrugator supercilii (frowning muscle) and zygomaticus major (smiling muscle). Arousal, however, has received far less research attention. In the present study, we sought to identify intensive valence and arousal affective states via facial EMG activity.

Methods: Ten blocks of affective pictures were separated into five categories: neutral valence/low arousal (0VLA), positive valence/high arousal (PVHA), negative valence/high arousal (NVHA), positive valence/low arousal (PVLA), and negative valence/low arousal (NVLA), and the ability of each to elicit the corresponding valence and arousal affective states was investigated at length. One hundred and thirteen participants were exposed to these stimuli while facial EMG was recorded. A set of 16 features based on the amplitude, frequency, predictability, and variability of the signals was defined and classified using a support vector machine (SVM).

Results: We observed highly accurate classification rates based on the combined corrugator and zygomaticus EMG, ranging from 75.69% to 100.00% for the baseline and five affective states (0VLA, PVHA, PVLA, NVHA, and NVLA) in all individuals. There were significant differences in classification accuracy between senior and young adults, but no significant difference between female and male participants.

Conclusion: Our research provides robust evidence for the recognition of intensive valence and arousal affective states in young and senior adults. These findings support the future application of facial EMG for identifying user affective states in human-machine interaction (HMI) or companion robotic systems (CRS).
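The pipeline described above, amplitude and variability features extracted from two facial-EMG channels and fed to an SVM, can be sketched as follows. The feature definitions and the simulated data are assumptions for illustration, not the authors' actual 16-feature set.

```python
# Sketch: classifying two affective states from facial-EMG feature vectors
# with an SVM. Features, data, and parameters are illustrative only.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def emg_features(signal):
    """Toy amplitude/variability features from one EMG epoch."""
    return np.array([
        np.mean(np.abs(signal)),           # mean absolute amplitude
        np.sqrt(np.mean(signal ** 2)),     # RMS
        np.std(signal),                    # variability
        np.mean(np.abs(np.diff(signal))),  # first-difference activity
    ])

# Simulated corrugator + zygomaticus epochs for two classes: the classes
# differ only in signal amplitude, which the features pick up.
X = np.array([np.concatenate([emg_features(rng.normal(0, 1 + y, 1000)),
                              emg_features(rng.normal(0, 2 - y, 1000))])
              for y in (0, 1) for _ in range(30)])
y = np.repeat([0, 1], 30)  # e.g. 0 = baseline, 1 = one affective state

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

With clearly separated simulated amplitudes the cross-validated accuracy is high; real EMG epochs would of course be far noisier.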


International Conference on Human-Computer Interaction | 2015

Emotion Elicitation Using Film Clips: Effect of Age Groups on Movie Choice and Emotion Rating

Dilana Hazer; Xueyao Ma; Stefanie Rukavina; Sascha Gruss; Steffen Walter; Harald C. Traue

In affective computing, an accurate emotion recognition process requires a reliable emotion elicitation method. One question that arises when inducing emotions for computer-based emotional applications is that of age-group differences. In the present study, we investigate the effect of emotion elicitation across various age groups. Emotion elicitation was conducted using standardized movie clips representing five basic emotions: amusement, sadness, anger, disgust and fear. Each emotion was elicited by three different clips. The clips were individually rated, and the subjective choice of the most relevant clip was analyzed. The results show the influence of age on film-clip choice, the correlation between age and valence/arousal ratings for the chosen clips, and the differences in valence and arousal ratings between the age groups.


International Conference on Pattern Recognition Applications and Methods | 2015

Paradigms for the Construction and Annotation of Emotional Corpora for Real-world Human-Computer-Interaction

Markus Kächele; Stefanie Rukavina; Günther Palm; Friedhelm Schwenker; Martin Schels

A major building block for the construction of reliable statistical classifiers in the context of affective human-computer interaction is the collection of training samples that appropriately reflect the complex nature of the desired patterns. In this application especially, this is a non-trivial issue: even though it is easy to agree that emotional patterns should be incorporated into future computer operation, it is by no means clear how this should be realized. Open questions remain, such as which types of emotional patterns to consider, together with their degree of helpfulness for computer interaction, and the more fundamental question of which emotions actually occur in this context. In this paper we start by reviewing existing corpora and the respective techniques for the generation of emotional content, and further try to motivate and establish approaches for gathering, identifying and categorizing patterns of human-computer interaction.


Journal of Psychology Research | 2012

The Influence of Naturalness, Attractiveness and Intensity on Facial Emotion Recognition

Kerstin Limbrecht; Stefanie Rukavina; Andreas Scheck; Steffen Walter; Holger Hoffmann; Harald C. Traue

Understanding the determinants of facial emotion recognition is still one of the main topics in emotion research. Mimic expressions are not only a representation of feelings caused by emotions, but also an important communication channel in social interaction. Research over the last 20 years has shown that emotion recognition abilities differ within and between individuals. Moreover, every basic emotion seems to be processed differently in the human brain; the underlying processes are still not clear. In this study, the common practice of presenting pictures of faces showing basic emotions to participants was used in a more methodological sense: possible determining factors in facial emotion recognition, such as the naturalness and intensity of the expressed emotion and the attractiveness of the photographed person, were analyzed.


International Conference on Human-Computer Interaction | 2013

The Impact of Gender and Sexual Hormones on Automated Psychobiological Emotion Classification

Stefanie Rukavina; Sascha Gruss; Jun-Wen Tan; David Hrabal; Steffen Walter; Harald C. Traue; Lucia Jerg-Bretzke

It is a challenge to make cognitive technical systems more empathetic to user emotions and dispositions. Among channels like facial behavior and nonverbal cues, psychobiological patterns of emotional or dispositional behavior contain rich information, which is continuously available and hardly willingly controlled. However, within this area of research, gender differences or even hormonal-cycle effects as potential factors influencing the classification of psychophysiological patterns of emotions have rarely been analyzed so far. In our study, emotions were induced with a blocked presentation of pictures from the International Affective Picture System (IAPS) and Ulm pictures. For the automated emotion classification, five features were first calculated from the heart rate signal and then combined with two features of the facial EMG. The study focused mainly on gender differences in automated emotion classification, and to a lesser degree on classification accuracy with the Support Vector Machine (SVM) per se. Classification results were diminished for a gender-mixed population, and likewise when mixing young females across their hormonal cycle phases. Thus, we could show an improvement of the accuracy rates when subdividing the population according to gender, which is discussed as a possibility for incrementing automated classification results.
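The subdivision idea described above, training one classifier on the mixed population versus separate classifiers per gender, can be sketched as below. The data are simulated, with a group-dependent shift standing in for gender-dependent psychobiology; nothing here reproduces the study's actual features or numbers.

```python
# Sketch: compare a classifier trained on a gender-mixed population with
# per-gender classifiers. All data are simulated for illustration.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 60
gender = np.repeat([0, 1], n)                    # 0 = female, 1 = male
emotion = np.tile(np.repeat([0, 1], n // 2), 2)  # two affective states
# Emotion shifts the features; gender adds a nuisance shift on top.
X = rng.normal(0, 1, (2 * n, 5)) + emotion[:, None] + 2.0 * gender[:, None]

mixed = cross_val_score(SVC(), X, emotion, cv=5).mean()
per_group = np.mean([
    cross_val_score(SVC(), X[gender == g], emotion[gender == g], cv=5).mean()
    for g in (0, 1)
])
print(f"mixed: {mixed:.2f}  per-group: {per_group:.2f}")
```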


Biomedical Engineering and Informatics | 2012

On the use of instantaneous mean frequency estimated from the Hilbert spectrum of facial electromyography for differentiating core affects

Jun-Wen Tan; Adriano O. Andrade; Steffen Walter; Hang Li; David Hrabal; Stefanie Rukavina; Kerstin Limbrecht-Ecklundt; Harald C. Traue

Facial expression, comprising single or combined facial action units (AUs), is one of the most important communication channels for transferring affective states in social life. Electromyography (EMG) activity captured over specific facial muscles can be used to interpret one's affective experience. The present study investigated changes in the mean instantaneous mean frequency (IMNF) of facial EMG captured over the corrugator supercilii and zygomaticus major for differentiating core affects. First, an adaptive filter was employed to deal with power-line interference in the preprocessing of the EMG signal. Second, a novel method, IMNF estimation via the Hilbert spectrum, was applied, which enables representing the characteristics of facial EMG in the joint time-frequency domain. Third, mean IMNF changes of facial EMG activity elicited by affective picture blocks were computed for differentiating core affects, which describe affect in terms of valence and arousal. The results indicated that the low valence/high arousal affective state can be discriminated by corrugator supercilii EMG from the neutral valence/low arousal, high valence/high arousal, high valence/low arousal and low valence/low arousal conditions, with corrugator supercilii EMG responses to low valence/high arousal stimuli being greater than to the others. In contrast, zygomaticus major EMG indicated the high valence/high arousal affective state with the highest response compared to the other four affective states; it also revealed significant differences between low valence/low arousal and high valence/low arousal, and between low valence/low arousal and low valence/high arousal.
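The core quantity above, an instantaneous mean frequency derived from the Hilbert spectrum, can be illustrated in simplified form. The paper builds the full Hilbert spectrum (via empirical mode decomposition); the sketch below instead applies the Hilbert transform to the raw signal directly and takes an amplitude-weighted mean of the instantaneous frequency, which is an assumption made for brevity.

```python
# Sketch: amplitude-weighted mean instantaneous frequency from the analytic
# (Hilbert) representation of a signal. Simplified relative to the paper's
# Hilbert-spectrum approach.
import numpy as np
from scipy.signal import hilbert

def mean_instantaneous_frequency(x, fs):
    """Amplitude-weighted mean of the instantaneous frequency, in Hz."""
    analytic = hilbert(x)
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * fs / (2 * np.pi)  # Hz at each sample
    amp = np.abs(analytic)[:-1]                    # envelope as weights
    return np.sum(inst_freq * amp) / np.sum(amp)

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
tone = np.sin(2 * np.pi * 50 * t)                  # pure 50 Hz tone
print(round(mean_instantaneous_frequency(tone, fs), 1))  # prints 50.0
```

For a pure tone the estimate recovers the tone's frequency; for broadband EMG it summarizes how the spectral content shifts over time.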


IFAC Proceedings Volumes | 2012

Emotion Identification and Modelling on the Basis of Paired Physiological Data Features for Companion Systems

David Hrabal; Stefanie Rukavina; Kerstin Limbrecht; Sascha Gruss; Steffen Walter; Vladimir Hrabal; Harald C. Traue

A technical companion system should be able to detect its user's emotion and model the user's emotional state in order to react to it accordingly. We have developed a novel method to determine a user's most significant emotional change in the two emotion dimensions of pleasure and arousal on the basis of paired features of physiological data when comparing two events. An experiment was set up in which participants first viewed blocked IAPS picture presentations and then took part in a mental-training Wizard-of-Oz scenario. Six meaningful features from four physiological channels of the IAPS picture presentation data, comprising two electromyography channels (corrugator supercilii and zygomaticus major), skin conductance and peripheral blood volume, were extracted. Three pairs of features were found to contain valuable information about emotional changes when comparing two situations with different emotional contents. The method was then tested on a new blocked IAPS dataset and on the Wizard-of-Oz interaction scenario dataset to verify its performance. In 75% of the subjects, the detected emotion matched one of the two induced emotions. This information could be used by future companion technologies to model the user's current emotional state in the two-dimensional emotion space of pleasure and arousal.
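The paired-feature comparison between two events can be sketched as a simple per-feature change detector. The feature names, values, and the relative-change threshold below are invented for illustration and do not reflect the study's actual decision rule.

```python
# Sketch: flag the direction of change of paired physiological features
# between two events. Threshold and feature names are hypothetical.
def compare_events(features_a, features_b, threshold=0.1):
    """Return per-feature change direction ('increase'/'decrease'/'no change')."""
    changes = {}
    for name in features_a:
        delta = features_b[name] - features_a[name]
        # Treat changes below 10% of the baseline magnitude as noise.
        if abs(delta) < threshold * max(abs(features_a[name]), 1e-9):
            changes[name] = "no change"
        else:
            changes[name] = "increase" if delta > 0 else "decrease"
    return changes

event_1 = {"corrugator_emg": 1.00, "zygomaticus_emg": 0.40, "scl": 2.1}
event_2 = {"corrugator_emg": 1.60, "zygomaticus_emg": 0.38, "scl": 2.9}
print(compare_events(event_1, event_2))
```

Here a rise in corrugator EMG together with rising skin conductance would point toward a shift to lower pleasure and higher arousal between the two events.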


Journal of Psychology Research | 2013

Sexual Hormones Influence Gray's Theory of Personality

Stefanie Rukavina; Kerstin Limbrecht-Ecklundt; David Hrabal; Steffen Walter; Harald C. Traue

Personality is believed to be stable throughout adulthood. However, many personality theories are based on the construct of measuring subjectively reported reactions of a person during specific, mostly emotional situations. Thus, the component "emotion" plays a crucial role in determining personality traits, especially within the BIS (behavioral inhibition system)/BAS (behavioral activation system) questionnaire. Sexual hormones, in terms of the menstrual cycle, have been found to influence emotional processing (e.g., emotion recognition, mood, and emotional evaluations). As a consequence, we hypothesized that personality measured with the BIS/BAS questionnaire shows menstrual cycle effects and thus gender differences dependent on the menstrual cycle phase. In Study I, we collected BIS/BAS data from female subjects (n = 48) in different menstrual cycle phases (23 follicular/25 luteal). Comparing both phases, we found significant differences within the BAS dimension for the follicular women, with medium effect sizes. In Study II, we additionally collected BIS/BAS data from men (n = 22). Comparing them to the female data, we found significantly higher BIS values for women in general, and especially for luteal women. These differences are discussed as being estrogen- and progesterone-mediated.
