Publications

Featured research published by Egon L. van den Broek.


Biomedical Engineering Systems and Technologies | 2010

Affective Man-Machine Interface: Unveiling Human Emotions through Biosignals

Egon L. van den Broek; Viliam Lisy; Joris H. Janssen; Joyce H. D. M. Westerink; Marleen H. Schut; Kees Tuinenbreijer

As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various psychological and physiological processes, which can be measured through biosignals; e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals to emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented. Moreover, guidelines are presented for affective MMI. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques was tested, which resulted in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.


Musicae Scientiae | 2011

Emotional and psychophysiological responses to tempo, mode, and percussiveness

Marjolein D. van der Zwaag; Joyce H. D. M. Westerink; Egon L. van den Broek

People often listen to music to influence their emotional state. However, the specific musical characteristics that cause this process are not yet fully understood. We have investigated the influence of the musical characteristics tempo, mode, and percussiveness on our emotions. In a quest for ecologically valid results, 32 participants listened to 16 pop and 16 rock songs while conducting an office task. They rated experienced arousal, valence, and tension, while skin conductance and cardiovascular responses were recorded. An increase in tempo was found to lead to an increase in reported arousal and tension and a decrease in heart rate variability. More arousal was reported during minor-mode than major-mode songs. The level and frequency of skin conductance responses increased with an increase in percussiveness. Physiological responses revealed patterns that might not have been revealed by self-report. Interaction effects further suggest that musical characteristics interplay in modulating emotions. So, tempo, mode, and percussiveness indeed modulate our emotions and, consequently, can be used to direct emotions. Music presentation revealed subtly different results in a laboratory setting, where music was altered with breaks, than in a more ecologically valid setting where continuous music was presented. All in all, this enhances our understanding of the influence of music on emotions and creates opportunities to tap seamlessly into listeners' emotional state through their physiological responses.


Applied Ergonomics | 2009

Considerations for emotion-aware consumer products

Egon L. van den Broek; Joyce H. D. M. Westerink

Emotion-aware consumer products require reliable, short-term emotion assessment (i.e., unobtrusive, robust, and lacking calibration). To explore the feasibility of this, an experiment was conducted in which the galvanic skin response (GSR) and three electromyography (EMG) signals (frontalis, corrugator supercilii, and zygomaticus major) were recorded from 24 participants who watched eight 2-min emotion-inducing film fragments. The unfiltered psychophysiological signals were processed and six statistical parameters (i.e., mean, absolute deviation, standard deviation, variance, skewness, and kurtosis) were derived for each 10-s interval of the film fragment. For each physiological signal, skewness and kurtosis discriminated among affective states, accompanied by other parameters, depending on the signal. The skewness parameter was also shown to indicate mixed emotions. Moreover, a mapping of events in the fragments onto the signals showed the importance of short-term emotion assessment. Hence, this research identified generic features, denoted important considerations, and illustrated the feasibility of emotion-aware consumer products.
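The windowed feature extraction described in the abstract can be sketched as follows. This is a minimal illustration, not the study's exact pipeline: the function names and the non-overlapping window segmentation are assumptions.

```python
# Sketch: the six statistical parameters (mean, absolute deviation,
# standard deviation, variance, skewness, kurtosis) per 10-s window.
import math

def window_features(samples):
    """Six population statistics for one window of a physiological signal."""
    n = len(samples)
    mean = sum(samples) / n
    abs_dev = sum(abs(x - mean) for x in samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    std = math.sqrt(var)
    # Guard against a flat signal (zero variance).
    skew = sum((x - mean) ** 3 for x in samples) / (n * std ** 3) if std else 0.0
    kurt = sum((x - mean) ** 4 for x in samples) / (n * var ** 2) if var else 0.0
    return {"mean": mean, "abs_dev": abs_dev, "std": std,
            "var": var, "skew": skew, "kurt": kurt}

def sliding_windows(signal, rate_hz, window_s=10):
    """Split a sampled signal into consecutive windows and featurize each."""
    step = rate_hz * window_s
    return [window_features(signal[i:i + step])
            for i in range(0, len(signal) - step + 1, step)]
```

For a symmetric window such as `[1, 2, 3, 4]`, the skewness is zero while the mean is 2.5 and the kurtosis 1.64, which illustrates how the higher-order moments capture shape rather than level.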


Ubiquitous Computing | 2013

Ubiquitous emotion-aware computing

Egon L. van den Broek

Emotions are a crucial element for personal and ubiquitous computing. What to sense and how to sense it, however, remain a challenge. This study explores the rare combination of speech, the electrocardiogram, and a revised Self-Assessment Mannequin to assess people's emotions. Forty people watched 30 International Affective Picture System pictures in either an office or a living-room environment. Additionally, their personality traits neuroticism and extroversion and demographic information (i.e., gender, nationality, and level of education) were recorded. The resulting data were analyzed using both basic emotion categories and the valence–arousal model, which enabled a comparison between the two representations. The combination of heart rate variability and three speech measures (i.e., the variability of the fundamental frequency of pitch (F0), intensity, and energy) explained 90% (p < .001) of the participants' experienced valence–arousal, with 88% for valence and 99% for arousal (ps < .001). The six basic emotions could also be discriminated (p < .001), although the explained variance was much lower: 18–20%. Environment (or context), the personality trait neuroticism, and gender proved to be useful when a nuanced assessment of people's emotions was needed. Taken together, this study provides a significant leap toward robust, generic, and ubiquitous emotion-aware computing.
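As an illustration of one measure mentioned above, the sketch below computes a common heart rate variability index (SDNN, the standard deviation of inter-beat intervals). The study does not specify which HRV index it used, so treating HRV as SDNN here is an assumption.

```python
# Sketch: SDNN, a standard heart rate variability measure computed
# from RR (inter-beat) intervals in milliseconds.
import math

def sdnn(rr_intervals_ms):
    """Population standard deviation of RR intervals, in ms.
    Larger values indicate higher heart rate variability."""
    n = len(rr_intervals_ms)
    mean = sum(rr_intervals_ms) / n
    return math.sqrt(sum((rr - mean) ** 2 for rr in rr_intervals_ms) / n)
```

A perfectly regular heartbeat (e.g., all intervals 800 ms) yields an SDNN of zero; variation around the mean raises it.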


Cyberpsychology, Behavior, and Social Networking | 2009

Navigating through virtual environments: visual realism improves spatial cognition.

Frank Meijer; Branko L. Geudeke; Egon L. van den Broek

Recent advances in computer technology have significantly facilitated the use of virtual environments (VEs) for small and medium enterprises (SMEs). However, achieving visual realism in such VEs requires high investments of time and effort, while its usefulness has not yet become apparent from research. Other qualities of VEs, such as the use of large displays, have proved effective in enhancing the individual user's spatial cognition. The current study assessed whether the same benefits apply to visual realism in VEs. Thirty-two participants were divided into two groups, who explored either a photorealistic or a nonrealistic supermarket presented on a large screen. The participants were asked to navigate through the supermarket on a predetermined route. Subsequently, spatial learning was tested in four pen-and-paper tests that assessed how accurately they had memorized the route and the environment's spatial layout. The study revealed increased spatial learning from the photorealistic compared to the nonrealistic supermarket. Specifically, participants performed better on tests that involved egocentric spatial knowledge. The results suggest that visual realism is useful because it increases the user's spatial knowledge of the VE. Therefore, the current study provides clear evidence that it is worthwhile for SMEs to invest in achieving visual realism in VEs.


User Modeling and User-adapted Interaction | 2012

Tune in to your emotions: a robust personalized affective music player

Joris H. Janssen; Egon L. van den Broek; Joyce H. D. M. Westerink

The emotional power of music is exploited in a personalized affective music player (AMP) that selects music for mood enhancement. A biosignal approach is used to measure listeners' personal emotional reactions to their own music as input for affective user models. Regression and kernel density estimation are applied to model the physiological changes the music elicits. Using these models, personalized music selections based on an affective goal state can be made. The AMP was validated in real-world trials over the course of several weeks. Results show that our models can cope with noisy situations and handle large inter-individual differences in the music domain. The AMP augments music listening, and its techniques enable automated affect guidance more generally. Our approach provides valuable insights for affective computing and user modeling, for which the AMP is a suitable carrier application.
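The modelling idea (kernel density estimation over a listener's observed physiological responses to each song, then selecting the song whose model best matches the affective goal state) can be sketched as below. The function names, bandwidth, and selection rule are illustrative assumptions, not the authors' implementation.

```python
# Sketch: a Gaussian kernel density estimate per song, then song
# selection by highest estimated density at the affective goal state.
import math

def gaussian_kde(samples, bandwidth):
    """Build a density estimate from observed responses to one song."""
    norm = 1.0 / (len(samples) * bandwidth * math.sqrt(2 * math.pi))
    def density(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                          for s in samples)
    return density

def pick_song(song_models, goal):
    """Choose the song whose response density is highest at the goal
    state (e.g., a target normalized skin-conductance change)."""
    return max(song_models, key=lambda song: song_models[song](goal))
```

For instance, a song whose past responses cluster near the goal value will be chosen over one whose responses lie far from it.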


Human-Computer Interaction | 2013

Machines Outperform Laypersons in Recognizing Emotions Elicited by Autobiographical Recollection

Joris H. Janssen; Paul Tacken; J. J. G. Gert-Jan de Vries; Egon L. van den Broek; Joyce H. D. M. Westerink; Pim Haselager; Wa Wijnand IJsselsteijn

Over the last decade, an increasing number of studies have focused on automated recognition of human emotions by machines. However, performances of machine emotion recognition studies are difficult to interpret because benchmarks have not been established. To provide such a benchmark, we compared machine with human emotion recognition. We gathered facial expressions, speech, and physiological signals from 17 individuals expressing 5 different emotional states. Support vector machines achieved an 82% recognition accuracy based on physiological and facial features. In experiments with 75 humans on the same data, a maximum recognition accuracy of 62.8% was obtained. As machines outperformed humans, automated emotion recognition might be ready to be tested in more practical applications.
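The paper's classifiers were support vector machines; as a dependency-free stand-in, the sketch below shows the same train/classify workflow with a simple nearest-centroid classifier over combined physiological and facial feature vectors. All names here are hypothetical.

```python
# Sketch: nearest-centroid classification of emotion labels from
# feature vectors (a stand-in for the paper's support vector machines).
def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(samples):
    """samples: mapping of emotion label -> list of feature vectors."""
    return {label: centroid(vecs) for label, vecs in samples.items()}

def classify(model, vec):
    """Return the label whose centroid is closest to the feature vector."""
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(vec, c))
    return min(model, key=lambda label: sq_dist(model[label]))
```

Recognition accuracy, as reported in the study, would then be the fraction of held-out samples whose predicted label matches the expressed emotion.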


Ambient Intelligence | 2009

Unobtrusive Sensing of Emotions (USE)

Egon L. van den Broek; Marleen H. Schut; Joyce H. D. M. Westerink; Kees Tuinenbreijer

Emotions are acknowledged as a crucial element for artificial intelligence; this is, as is illustrated, no different for Ambient Intelligence (AmI). Unobtrusive Sensing of Emotions (USE) is introduced to enrich AmI with empathic abilities. USE coins the combination of speech and the electrocardiogram (ECG) as a powerful and unique pairing to unravel people's emotions. In a controlled study, 40 people watched film scenes in either an office or a home-like setting. It is shown that, when people's gender is taken into account, both heart rate variability (derived from the ECG) and the standard deviation of the fundamental frequency of speech indicate people's experienced valence and arousal, in parallel. As such, both measures validate each other. Thus, through USE, reliable cues can be derived that indicate people's emotional state, in particular when their environment is also taken into account. Since all this is crucial for both AI and true AmI, this study provides a first significant leap toward making AmI a success.


International Conference on Computer Vision | 2006

Computing emotion awareness through facial electromyography

Egon L. van den Broek; Marleen H. Schut; Joyce H. D. M. Westerink; Jan van Herk; Kees Tuinenbreijer

To improve human-computer interaction (HCI), computers need to recognize and respond properly to their users' emotional state. This is a fundamental application of affective computing, which relates to, arises from, or deliberately influences emotion. As a first step toward a system that recognizes the emotions of individual users, this research focuses on how emotional experiences are expressed in six parameters (i.e., mean, absolute deviation, standard deviation, variance, skewness, and kurtosis) of physiological measurements of three electromyography signals: frontalis (EMG1), corrugator supercilii (EMG2), and zygomaticus major (EMG3). The 24 participants were asked to watch film scenes of 120 seconds, which they rated afterward. These ratings enabled us to distinguish four categories of emotions: negative, positive, mixed, and neutral. The skewness of EMG2 and four parameters of EMG3 discriminate between the four emotion categories, despite the coarse time windows used. Moreover, rapid processing of the signals proved to be possible. This enables tailored HCI facilitated by the emotional awareness of systems.


Affective Computing and Intelligent Interaction | 2009

Personalized affective music player

Joris H. Janssen; Egon L. van den Broek; Joyce H. D. M. Westerink

We introduce and test an affective music player (AMP) that selects music for mood enhancement. Through a concise overview of content, construct, and ecological validity, we elaborate five considerations that form the foundation of the AMP. Based on these considerations, computational models are developed, using regression and kernel density estimation. We show how these models can be used for music selection and how they can be extended to fit into other systems. Subsequently, the success of the models is illustrated with a user test. The AMP augments music listening, where its techniques, in general, enable automated affect guidance. Finally, we argue that our AMP is readily applicable to real-world situations as it can 1) cope with noisy situations, 2) handle the large inter-individual differences apparent in the musical domain, and 3) integrate context or other information, all in real time.

Collaboration


Dive into Egon L. van den Broek's collaborations.

Top Co-Authors

Theo E. Schouten

Radboud University Nijmegen
Ton Dijkstra

Radboud University Nijmegen
Thijs Kok

Radboud University Nijmegen