Harald C. Traue
University of Ulm
Publications
Featured research published by Harald C. Traue.
Acta Psychologica | 2010
Holger Hoffmann; Henrik Kessler; Tobias Eppel; Stefanie Rukavina; Harald C. Traue
Two experiments were conducted to investigate the effect of expression intensity on gender differences in the recognition of facial emotions. The first experiment compared recognition accuracy between female and male participants when emotional faces were shown with full-blown (100% emotional content) or subtle (50%) expressiveness. In a second experiment, finer-grained analyses were applied to measure recognition accuracy as a function of expression intensity (40%-100%). The results show that although women were more accurate than men in recognizing subtle facial displays of emotion, there was no difference between male and female participants when recognizing highly expressive stimuli.
International Conference on Human-Computer Interaction | 2011
Steffen Walter; Stefan Scherer; Martin Schels; Michael Glodek; David Hrabal; Miriam Schmidt; Ronald Böck; Kerstin Limbrecht; Harald C. Traue; Friedhelm Schwenker
The design of intelligent personalized interactive systems, having knowledge about the user's state, desires, needs, and wishes, currently poses a great challenge to computer scientists. In this study we propose an information fusion approach that combines acoustic and biophysiological data from multiple sensors to classify emotional states. For this purpose a multimodal corpus has been created, in which subjects undergo a controlled emotion-eliciting experiment, passing through several octants of the valence-arousal-dominance space. The temporal and decision-level fusion of the multiple modalities outperforms the single-modality classifiers and shows promising results.
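The abstract does not specify how the fusion was implemented; purely as an illustration, the following Python sketch shows plain decision-level fusion with one classifier per modality, using synthetic stand-in features in place of the acoustic and biophysiological channels from the corpus.

# Minimal sketch of decision-level fusion, assuming one classifier per
# modality whose class posteriors are averaged. All data are synthetic
# stand-ins, not features from the paper's multimodal corpus.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
labels = rng.integers(0, 2, size=n)                        # binary emotional state
acoustic = rng.normal(labels[:, None], 1.0, size=(n, 20))  # stand-in acoustic features
biophys = rng.normal(labels[:, None], 1.5, size=(n, 8))    # stand-in biophysiological features

idx_train, idx_test = train_test_split(np.arange(n), test_size=0.3, random_state=0)

# One classifier per modality.
clf_a = RandomForestClassifier(random_state=0).fit(acoustic[idx_train], labels[idx_train])
clf_b = RandomForestClassifier(random_state=0).fit(biophys[idx_train], labels[idx_train])

# Decision-level fusion: average the posterior probabilities.
p_fused = (clf_a.predict_proba(acoustic[idx_test])
           + clf_b.predict_proba(biophys[idx_test])) / 2
fused_pred = p_fused.argmax(axis=1)
print("fused accuracy:", (fused_pred == labels[idx_test]).mean())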
British Machine Vision Conference | 2013
Philipp Werner; Ayoub Al-Hamadi; Robert Niese; Steffen Walter; Sascha Gruss; Harald C. Traue
Pain is what the patient says it is. But what about those who cannot speak? Automatic pain monitoring opens up prospects for better treatment, but accurate assessment of pain is challenging due to its subjective nature. To facilitate advances, we contribute a new dataset, the BioVid Heat Pain Database, which contains videos and physiological data of 90 persons subjected to well-defined pain stimuli of 4 intensities. We propose a fully automatic recognition system utilizing facial expression, head pose information and their dynamics. The approach is evaluated on the task of pain detection on the new dataset, also outlining open challenges for pain monitoring in general. Additionally, we analyze the relevance of head pose information for pain recognition and compare person-specific and general classification models.
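The abstract contrasts person-specific and general (person-independent) models without giving implementation details; the sketch below illustrates the two evaluation schemes in generic form, with synthetic stand-ins for the facial-expression and head-pose descriptors.

# Hedged sketch of the two evaluation schemes: a general model tested
# with leave-one-subject-out cross-validation versus person-specific
# models trained and tested within each subject. Data are synthetic.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_subjects, n_per = 10, 30
subjects = np.repeat(np.arange(n_subjects), n_per)
y = rng.integers(0, 2, size=subjects.size)            # pain vs. no pain
X = rng.normal(y[:, None], 1.0, size=(subjects.size, 12))

# General model: train on all subjects except the held-out one.
generic = cross_val_score(SVC(), X, y, groups=subjects, cv=LeaveOneGroupOut())
print("general (leave-one-subject-out):", generic.mean())

# Person-specific models: fit and evaluate within each subject.
per_subject = [cross_val_score(SVC(), X[subjects == s], y[subjects == s], cv=3).mean()
               for s in range(n_subjects)]
print("person-specific (within-subject CV):", np.mean(per_subject))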
The Physician and Sportsmedicine | 2011
Henrik Kessler; Cornelia Doyen-Waldecker; Christian Hofer; Holger Hoffmann; Harald C. Traue; Birgit Abler
AIM: This study investigated brain areas involved in the perception of dynamic facial expressions of emotion. METHODS: A group of 30 healthy subjects was measured with fMRI when passively viewing prototypical facial expressions of fear, disgust, sadness and happiness. Using morphing techniques, all faces were displayed as still images and also dynamically as a film clip with the expressions evolving from neutral to emotional. RESULTS: Irrespective of a specific emotion, dynamic stimuli selectively activated bilateral superior temporal sulcus, visual area V5, fusiform gyrus, thalamus and other frontal and parietal areas. Interaction effects of emotion and mode of presentation (static/dynamic) were only found for the expression of happiness, where static faces evoked greater activity in the medial prefrontal cortex. CONCLUSIONS: Our results confirm previous findings on neural correlates of the perception of dynamic facial expressions and are in line with studies showing the importance of the superior temporal sulcus and V5 in the perception of biological motion. Differential activation in the fusiform gyrus for dynamic stimuli stands in contrast to classical models of face perception but is coherent with new findings arguing for a more general role of the fusiform gyrus in the processing of socially relevant stimuli.
Journal of Psychosomatic Research | 1985
Harald C. Traue; Andreas Gottwald; Peter R. Henderson; Donald A. Bakal
This study explored the relationship between musculoskeletal responses and nonverbal expressiveness in response to psychosocial stress. Muscle-contraction headache subjects and normal controls were confronted with a psychological stressor while forehead and neck EMG activity was recorded. Indices of nonverbal expressiveness (head and hand movements, facial tension, facial activity, and facial expressiveness) were obtained concomitantly with the muscle data. The headache subjects showed greater muscle activation than the controls in response to stress, greater evidence of facial tension, and less evidence of facial and bodily expressiveness. Overall, these data provided support for the notion that under some conditions a negative relationship exists between expressiveness and somatic activation.
Systems, Man, and Cybernetics | 2013
Steffen Walter; Jonghwa Kim; David Hrabal; Stephen Crawcour; Henrik Kessler; Harald C. Traue
The goal of automatic biopsychological emotion recognition for companion technologies is to ensure reliable and valid classification rates. In this paper, emotional states were induced via a Wizard-of-Oz mental trainer scenario based on the valence-arousal-dominance model. In most experiments, classification algorithms are tested via leave-one-out cross-validation within a single situation. Such studies often show very high classification rates, comparable with those in our experiment (92.6%). However, in order to guarantee robust emotion recognition based on biopsychological data, measurements have to be taken across several situations with the goal of selecting stable features for individual emotional states. For this purpose, our mental trainer experiment was conducted twice for each subject, with a 10-min break between the two rounds. It is shown that there are robust psychobiological features that can be used for classification (70.1%) in both rounds. However, these are not the same as those found via feature selection performed on the first round only (classification: 53.0%).
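The feature-selection procedure is not spelled out in the abstract; purely as an illustration of the cross-session stability argument, the following sketch ranks synthetic features per round and compares round-1-only selection against features that rank highly in both rounds.

# Sketch of the cross-session stability idea, assuming synthetic
# biosignal features: features selected only on round 1 may not
# transfer to round 2, whereas features ranked highly in both rounds
# are more robust. Names and data are illustrative, not the paper's.
import numpy as np
from sklearn.feature_selection import f_classif
from sklearn.svm import SVC

rng = np.random.default_rng(2)
n, d = 200, 40
y1 = rng.integers(0, 2, n); y2 = rng.integers(0, 2, n)
X1 = rng.normal(0, 1, (n, d)); X2 = rng.normal(0, 1, (n, d))
stable = np.arange(5)                     # informative in both rounds
X1[:, stable] += y1[:, None]; X2[:, stable] += y2[:, None]
round1_only = np.arange(5, 10)            # informative only in round 1
X1[:, round1_only] += y1[:, None]

def top_features(X, y, k=8):
    """Indices of the k features with the highest ANOVA F-score."""
    return np.argsort(f_classif(X, y)[0])[-k:]

sel1 = set(top_features(X1, y1))                 # selection on round 1 alone
sel_both = sel1 & set(top_features(X2, y2))      # stable across both rounds

for name, idx in [("round-1 features", sorted(sel1)),
                  ("stable features", sorted(sel_both))]:
    clf = SVC().fit(X1[:, idx], y1)       # train on round 1
    print(name, "round-2 accuracy: %.2f" % clf.score(X2[:, idx], y2))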
Behavioural Brain Research | 2014
Sebastian Jongen; Nikolai Axmacher; Nico A.W. Kremers; Holger Hoffmann; Kerstin Limbrecht-Ecklundt; Harald C. Traue; Henrik Kessler
Alexithymia is a personality trait that involves difficulties identifying emotions and describing feelings. It is hypothesized that this extends to facial emotion recognition, but limited knowledge exists about possible neural correlates of this assumed deficit. We therefore tested thirty-seven healthy subjects with either a relatively high or low degree of alexithymia (HDA versus LDA), who performed a reliable and standardized test of facial emotion recognition (FEEL, Facially Expressed Emotion Labeling) during functional MRI. LDA subjects had significantly better emotion recognition scores and showed relatively more activity in several brain areas associated with alexithymia and emotional awareness (anterior cingulate cortex), and in the extended system of facial perception concerned with aspects of social communication and emotion (amygdala, insula, striatum). Additionally, LDA subjects had more activity in the visual area of social perception (posterior part of the superior temporal sulcus) and the inferior frontal cortex. HDA subjects, on the other hand, exhibited greater activity in the superior parietal lobule. With differences in behaviour and brain responses between two groups of otherwise healthy subjects, our results indirectly support recent conceptualizations and epidemiological data suggesting that alexithymia is a dimensional personality trait apparent in clinically healthy subjects rather than a categorical diagnosis applicable only to clinical populations.
Cognition & Emotion | 2010
Holger Hoffmann; Harald C. Traue; Franziska Bachmayr; Henrik Kessler
The presentation of facial displays of emotions is an important method in emotion-recognition studies in various basic and applied settings. This study intends to make a methodological contribution and investigates the perceived realism of dynamic facial expressions for six emotions (fear, sadness, anger, happiness, disgust, and surprise). We presented dynamic displays of faces evolving from a neutral to an emotional expression (onsets) and faces evolving from an emotional expression to a neutral one (offsets). Participants rated the perceived realism of stimuli of different durations (240–3040 ms) and adjusted the duration of each sequence until they perceived it as maximally realistic. Durations perceived as most realistic are reported for each emotion, providing an important basis for the construction of dynamic facial stimuli for future research.
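The morphing software behind these stimuli is not described here; as a sketch only, the snippet below shows how an onset sequence of a chosen duration could be assembled at a fixed frame rate, with a plain cross-fade standing in for the landmark-based morphing typically used for such stimuli.

# Illustrative sketch of building an onset sequence of a chosen
# duration at an assumed frame rate. A linear cross-fade stands in
# for real geometric morphing; images are dummy arrays here.
import numpy as np

def onset_sequence(neutral, emotional, duration_ms, fps=25):
    """Blend linearly from the neutral to the emotional image over duration_ms."""
    n_frames = max(2, round(duration_ms / 1000 * fps))
    alphas = np.linspace(0.0, 1.0, n_frames)
    return [(1 - a) * neutral + a * emotional for a in alphas]

neutral = np.zeros((64, 64))            # stand-in images
emotional = np.ones((64, 64))
frames = onset_sequence(neutral, emotional, duration_ms=1040)
print(len(frames), "frames for a 1040 ms onset at 25 fps")
# An offset sequence is simply the reverse: emotional -> neutral.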
2013 IEEE International Conference on Cybernetics (CYBCO) | 2013
Steffen Walter; Sascha Gruss; Hagen Ehleiter; Jun-Wen Tan; Harald C. Traue; Philipp Werner; Ayoub Al-Hamadi; Stephen Crawcour; Adriano O. Andrade; Gustavo Moreira da Silva
The objective measurement of subjective, multi-dimensionally experienced pain remains a problem that has yet to be adequately solved. Though verbal methods (i.e., pain scales, questionnaires) and visual analogue scales are commonly used for measuring clinical pain, they tend to lack reliability or validity when applied to mentally impaired individuals. The expression of pain and/or its biopotential parameters could offer a solution. While such coding systems already exist, they are either very costly and time-consuming, or have been insufficiently evaluated with regard to the theory of mental tests. Building on the experience gained to date, we collected a database of visual and biopotential signals to advance an automated pain recognition system, to determine its theoretical testing quality, and to optimize its performance. For this purpose, participants were subjected to painful heat stimuli under controlled conditions.
International Conference on Pattern Recognition | 2014
Philipp Werner; Ayoub Al-Hamadi; Robert Niese; Steffen Walter; Sascha Gruss; Harald C. Traue
How much does it hurt? Accurate assessment of pain is very important for selecting the right treatment; however, current methods are not sufficiently valid and reliable in many cases. Automatic pain monitoring may help by providing an objective and continuous assessment. In this paper we propose an automatic pain recognition system combining information from video and biomedical signals, namely facial expression, head movement, galvanic skin response, electromyography, and electrocardiogram. Using the BioVid Heat Pain Database, the system is evaluated on the task of pain detection, showing significant improvement over the current state of the art. Further, we discuss the relevance of the modalities and compare person-specific and generic classification models.
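How the video and biomedical features are combined is not detailed in the abstract; the sketch below illustrates one plausible scheme, early (feature-level) fusion, using a few illustrative window statistics per biosignal and synthetic stand-ins for the video features.

# Hedged sketch of early fusion for pain detection: simple descriptive
# statistics per biosignal window are concatenated with stand-in video
# features before classification. The feature choices are illustrative
# assumptions, not the paper's exact descriptor set.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)

def biosignal_features(gsr, emg, ecg):
    """A few window-level statistics per channel."""
    return np.array([
        gsr.mean(), gsr.std(),                 # tonic level and variability
        np.sqrt((emg ** 2).mean()),            # EMG RMS amplitude
        np.diff(ecg).std(),                    # crude ECG variability proxy
    ])

n = 300
y = rng.integers(0, 2, n)                      # pain vs. baseline windows
X = []
for label in y:
    gsr = rng.normal(label, 1.0, 512)          # synthetic 512-sample windows
    emg = rng.normal(0, 1 + label, 512)
    ecg = rng.normal(0, 1, 512)
    video = rng.normal(label, 1.0, 10)         # stand-in facial features
    X.append(np.concatenate([biosignal_features(gsr, emg, ecg), video]))
X = np.array(X)

# Early fusion: one classifier over the concatenated feature vector.
clf = RandomForestClassifier(random_state=0).fit(X[:200], y[:200])
print("held-out accuracy:", clf.score(X[200:], y[200:]))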