Publication


Featured research published by Nikolaus Bee.


Perception and Interactive Technologies | 2008

EmoVoice -- A Framework for Online Recognition of Emotions from Voice

Thurid Vogt; Elisabeth André; Nikolaus Bee

We present EmoVoice, a framework for the creation of emotional speech corpora and classifiers, and for offline as well as real-time online speech emotion recognition. The framework is intended to be used by non-experts and therefore comes with an interface for creating one's own personal or application-specific emotion recogniser. Furthermore, we describe some applications and prototypes that already use our framework to track online emotional user states from voice information.
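
The abstract describes a pipeline of frame-level acoustic features feeding a classifier trained on a labeled corpus, applied to an audio stream in real time. The following is a minimal, generic sketch of that idea, not EmoVoice's actual API: the function names, the two toy features, and the synthetic training data are all assumptions for illustration.

```python
import numpy as np
from sklearn.svm import SVC

def frame_features(frame: np.ndarray) -> np.ndarray:
    """Toy per-frame features: log-energy and zero-crossing rate."""
    energy = np.log(np.sum(frame ** 2) + 1e-10)
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0
    return np.array([energy, zcr])

# Train on synthetic frames (a stand-in for a recorded emotion corpus):
# louder frames are labeled 1 ("aroused"), quieter ones 0 ("neutral").
rng = np.random.default_rng(0)
frames = [rng.normal(scale=s, size=8000) for s in rng.uniform(0.2, 2.0, 200)]
X = np.array([frame_features(f) for f in frames])
y = (X[:, 0] > np.median(X[:, 0])).astype(int)
clf = SVC().fit(X, y)

def classify_stream(audio: np.ndarray, sr: int = 16000, frame_ms: int = 500):
    """Slide over the signal and emit one label per frame, online-style."""
    step = sr * frame_ms // 1000
    for start in range(0, len(audio) - step + 1, step):
        yield clf.predict([frame_features(audio[start:start + step])])[0]

print(list(classify_stream(rng.normal(scale=1.5, size=16000 * 3))))
```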


Robot and Human Interactive Communication | 2011

Creation and Evaluation of Emotion Expression with Body Movement, Sound and Eye Color for Humanoid Robots

Markus Häring; Nikolaus Bee; Elisabeth André

The ability to display emotions is a key feature of human communication, and also for robots that are expected to interact with humans in social environments. For expressions based on body movement and on signals other than facial expressions, such as sound, no common ground has been established so far. Based on psychological research on the human expression of emotions and the perception of emotional stimuli, we created eight different expressional designs for the emotions Anger, Sadness, Fear and Joy, each consisting of a Body Movement, a Sound and an Eye Color. In a large pre-test we evaluated the recognition ratios of the different expressional designs. In our main experiment we separated the expressional designs into their single cues (Body Movement, Sound, Eye Color) and evaluated the expressivity of each. This detailed view of how our expressional cues are perceived allowed us to evaluate the appropriateness of the stimuli, check our implementations for flaws, and build a basis for systematic revision. Our analysis revealed that almost all Body Movements were appropriate for their target emotion and that some of our Sounds need revision. Eye Color emerged as an unreliable component for emotional expression.
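
As an illustration of the cue decomposition the abstract describes, where each multimodal design is split into single cues for the main experiment, here is a minimal sketch; the concrete movements, sounds and colors are invented placeholders, not the paper's designs.

```python
from dataclasses import dataclass, asdict

@dataclass
class ExpressionDesign:
    """One multimodal design bundling the three expressional cues."""
    emotion: str
    body_movement: str
    sound: str
    eye_color: str

# Placeholder cue values; the study used two designs per emotion (eight total).
DESIGNS = [
    ExpressionDesign('Anger',   'fast forward lean', 'low growl',            'red'),
    ExpressionDesign('Sadness', 'slow slump',        'soft descending tone', 'blue'),
    ExpressionDesign('Fear',    'quick retreat',     'tremolo',              'white'),
    ExpressionDesign('Joy',     'bouncing posture',  'rising chirp',         'yellow'),
]

def single_cue_conditions(designs):
    """Split each multimodal design into its single-cue test conditions."""
    for d in designs:
        for cue in ('body_movement', 'sound', 'eye_color'):
            yield d.emotion, cue, asdict(d)[cue]

for condition in single_cue_conditions(DESIGNS):
    print(condition)
```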


Perception and Interactive Technologies | 2008

Writing with Your Eye: A Dwell Time Free Writing System Adapted to the Nature of Human Eye Gaze

Nikolaus Bee; Elisabeth André

We investigate the usability of an eye-controlled writing interface that matches the nature of human eye gaze, which is always in motion and cannot immediately trigger the selection of a button. Such an interface allows the eye to move continuously, with no need to dwell on a specific position to trigger a command. We classify writing into three categories (typing, gesturing, and continuous writing) and explain why continuous writing comes closest to the nature of human eye gaze. We propose Quikwriting, originally designed for handhelds, as the text input method that best meets the requirements of gaze-controlled input, and adapt its design for use with eye gaze. Based on the results of a first study, we formulate guidelines for the design of future Quikwriting-based, gaze-controlled applications.
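
A minimal sketch of the dwell-free principle behind Quikwriting-style gaze input: the gaze never needs to rest, and a character is committed only when the gaze returns to a neutral centre region after visiting outer zones. The zone geometry and the character mapping below are hypothetical, not the paper's layout.

```python
import math

ZONES = 8  # outer sectors arranged around a neutral centre region

def zone_of(x, y, cx=0.5, cy=0.5, r_center=0.15):
    """Return None while the gaze rests in the centre, else the sector index."""
    dx, dy = x - cx, y - cy
    if math.hypot(dx, dy) < r_center:
        return None
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return int(angle / (2 * math.pi / ZONES))

# Hypothetical layout: a gesture from start zone to end zone selects a character.
LAYOUT = {(0, 0): 'e', (0, 1): 't', (1, 1): 'a', (1, 0): 'o'}

def decode(gaze_points):
    """Emit one character each time the gaze returns to the centre."""
    text, first, last = [], None, None
    for x, y in gaze_points:
        z = zone_of(x, y)
        if z is None:                      # back in the centre: commit gesture
            if first is not None:
                text.append(LAYOUT.get((first, last), '?'))
            first = last = None
        else:
            first = z if first is None else first
            last = z
    return ''.join(text)

path = [(0.5, 0.5), (0.9, 0.5), (0.5, 0.5),   # centre -> zone 0 -> centre
        (0.9, 0.55), (0.8, 0.8), (0.5, 0.5)]  # zone 0 -> zone 1 -> centre
print(decode(path))                           # -> 'et' with the layout above
```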


AI & Society | 2009

From observation to simulation: generating culture-specific behavior for interactive systems

Matthias Rehm; Yukiko I. Nakano; Elisabeth André; Toyoaki Nishida; Nikolaus Bee; Birgit Endrass; Michael Wissner; Afia Akhter Lipi; Hung-Hsuan Huang

In this article we present a parameterized model for generating multimodal behavior based on cultural heuristics. To this end, a multimodal corpus analysis of human interactions in two cultures serves as the empirical basis for the modeling endeavor. By integrating the results of this empirical study with a well-established theory of cultural dimensions, it becomes feasible to generate culture-specific multimodal behavior in embodied agents by supplying evidence of the agent's cultural background. Two sample applications that make use of the model are presented; both are designed for use in coaching intercultural communication.
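
A toy sketch of what such a parameterized, dimension-driven model could look like, assuming Hofstede-style dimension scores as input; the parameter names and the linear mappings below are invented for illustration and are not the paper's model.

```python
def behavior_params(dimensions: dict) -> dict:
    """Map 0..100 cultural dimension scores onto nonverbal parameters."""
    return {
        # higher power distance -> larger interpersonal distance (metres)
        'interpersonal_distance': 0.8 + 0.006 * dimensions['power_distance'],
        # higher individualism -> more expansive individual gesturing
        'gesture_expressivity': 0.3 + 0.007 * dimensions['individualism'],
        # higher uncertainty avoidance -> fewer spontaneous posture shifts
        'posture_shift_rate': 1.0 - 0.008 * dimensions['uncertainty_avoidance'],
    }

# Example scores (illustrative, not measured values for any real culture).
print(behavior_params({'power_distance': 35,
                       'individualism': 67,
                       'uncertainty_avoidance': 65}))
```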


Proceedings of the International Workshop on Human-Centered Multimedia | 2007

Too close for comfort? Adapting to the user's cultural background

Matthias Rehm; Nikolaus Bee; Birgit Endrass; Michael Wissner; Elisabeth André

The cultural context of the user is a largely neglected aspect of human-centered computing. This is because culture is a very fuzzy concept, and even with a computational model of culture it remains difficult to derive the information needed to recognize the user's cultural background. Such information is only indirectly available and has to be derived from the user's observable multimodal behavior. We propose the use of a dimensional model of culture that allows computational methods to be applied to derive a user's cultural background and to adjust the system's behavior accordingly. To this end, a Bayesian network is applied to allow for the necessary inferences despite the fact that the given knowledge about the user's behavior is incomplete and unreliable.
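
The paper uses a Bayesian network over incomplete, unreliable behavioral evidence; as a minimal stand-in, the sketch below applies Bayes' rule to a single observed cue (interpersonal distance). The two cultures, the cue, and all probabilities are made-up illustrations, not the paper's network.

```python
# Prior belief over the user's cultural background (made-up values).
priors = {'cultureA': 0.5, 'cultureB': 0.5}

# Likelihood of observing "close interpersonal distance" given each culture.
likelihood_close = {'cultureA': 0.7, 'cultureB': 0.3}

def posterior(observed_close: bool) -> dict:
    """Bayes' rule for one noisy behavioral cue."""
    post = {}
    for culture, prior in priors.items():
        like = likelihood_close[culture] if observed_close \
            else 1.0 - likelihood_close[culture]
        post[culture] = prior * like
    z = sum(post.values())                  # normalize to a distribution
    return {c: v / z for c, v in post.items()}

print(posterior(observed_close=True))       # belief shifts toward cultureA
```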


Intelligent Virtual Agents | 2009

Breaking the Ice in Human-Agent Communication: Eye-Gaze Based Initiation of Contact with an Embodied Conversational Agent

Nikolaus Bee; Elisabeth André; Susanne Tober

In human-human conversation, the first impression decides whether two people feel attracted to each other and whether contact between them will continue. Starting from psychological work on flirting, we implemented an eye-gaze based model of interaction to investigate whether flirting tactics help improve first encounters between a human and an agent. Unlike earlier work, we concentrate on a very early phase of human-agent conversation (the initiation of contact) and investigate which non-verbal signals an agent should convey in order to create a favourable atmosphere for subsequent interactions and increase the user's willingness to engage in an interaction with the agent. To validate our approach, we created a scenario with a realistic 3D agent called Alfred that seeks contact with a human user. Depending on whether the user signals interest in the agent by means of his or her gaze, the agent will or will not engage in a conversation.
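
A hedged sketch of what a gaze-contingent initiation policy could look like: the agent escalates toward conversation only while user gaze is sustained, and backs off when it is withdrawn. The states and the threshold are illustrative, not Alfred's actual model.

```python
def contact_policy(gaze_samples, hold=5):
    """gaze_samples: booleans, True while the user looks at the agent."""
    state, streak = 'idle', 0
    for looking in gaze_samples:
        streak = streak + 1 if looking else 0
        if state == 'idle' and looking:
            state = 'show_interest'         # e.g. return gaze, smile
        elif state == 'show_interest' and streak >= hold:
            state = 'start_conversation'    # mutual gaze held long enough
        elif state == 'show_interest' and not looking:
            state = 'idle'                  # user looked away: back off
        yield state

# Sustained gaze escalates; once conversation starts, brief glances away
# no longer reset the interaction in this toy policy.
print(list(contact_policy([0, 1, 1, 1, 1, 1, 0, 1])))
```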


Intelligent User Interfaces | 2009

Simplified facial animation control utilizing novel input devices: a comparative study

Nikolaus Bee; Bernhard Falk; Elisabeth André

Editing the facial expressions of virtual characters is quite a complex task. The face is made up of many muscles, some of which are activated concurrently. Virtual faces with human expressiveness are usually designed with a limited number of facial regulators, derived from the facial muscle groups that are activated together. Common tools for editing such facial expressions use slider-based interfaces where only a single input at a time is possible. Novel input devices, such as gamepads or data gloves, which allow parallel editing, could not only speed up editing but also simplify the composition of new facial expressions. We created a virtual face with 23 facial controls and connected it to a slider-based GUI, a gamepad, and a data glove. We first conducted a survey with professional graphics designers to find out how the two new input devices would be received in a commercial context. A second comparative study with 17 subjects analyzed the performance and quality of these two input devices using subjective and objective measurements.
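
A small sketch of why parallel input matters here: one gamepad frame can update several of the 23 facial controls at once, whereas a slider GUI changes one value per event. The control and axis names below are invented for illustration.

```python
# 23 facial regulators, all initially neutral (0.0 .. 1.0 activation).
FACE = {f'control_{i}': 0.0 for i in range(23)}

# Hypothetical mapping from gamepad channels to facial controls.
AXIS_MAP = {
    'left_stick_x':  'control_0',   # e.g. brow raise
    'left_stick_y':  'control_1',   # e.g. lid opening
    'right_stick_x': 'control_2',   # e.g. mouth corner pull
    'right_stick_y': 'control_3',   # e.g. jaw drop
}

def apply_gamepad(face: dict, axes: dict) -> dict:
    """One input frame sets all mapped controls concurrently."""
    for axis, value in axes.items():
        face[AXIS_MAP[axis]] = max(0.0, min(1.0, value))  # clamp to range
    return face

apply_gamepad(FACE, {'left_stick_x': 0.8, 'left_stick_y': 0.4,
                     'right_stick_x': 1.0, 'right_stick_y': 0.2})
print({name: v for name, v in FACE.items() if v})  # four controls moved at once
```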


Motion in Games | 2011

Individualized agent interactions

Ionut Damian; Birgit Endrass; Peter Huber; Nikolaus Bee; Elisabeth André

Individualized virtual agents can enhance the user's perception of a virtual scenario. However, most systems only provide customization of the characters' visual features. In this paper, we describe an approach to individualizing the non-verbal behavior of virtual agents. To this end, we present a software framework that can visualize individualized non-verbal behavior. For demonstration purposes, we designed four behavioral profiles that simulate prototypical behaviors for differences in personality and gender. These were then tested in an evaluation study.
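
As a minimal illustration, a behavioral profile can be represented as a set of parameters that bias behavior selection; the four profiles and all values below are invented stand-ins for the study's designs.

```python
import random

# Hypothetical profiles biasing nonverbal behavior (rates in 0.0 .. 1.0).
PROFILES = {
    'extravert_male':   {'gesture_rate': 0.9, 'spatial_extent': 0.8},
    'extravert_female': {'gesture_rate': 0.9, 'spatial_extent': 0.6},
    'introvert_male':   {'gesture_rate': 0.3, 'spatial_extent': 0.4},
    'introvert_female': {'gesture_rate': 0.3, 'spatial_extent': 0.3},
}

def select_gestures(profile_name, candidates, seed=42):
    """Keep each candidate gesture with probability gesture_rate (toy rule)."""
    rate = PROFILES[profile_name]['gesture_rate']
    rnd = random.Random(seed)               # deterministic demo
    return [g for g in candidates if rnd.random() < rate]

print(select_gestures('introvert_female', ['beat', 'point', 'wave', 'shrug']))
print(select_gestures('extravert_male',   ['beat', 'point', 'wave', 'shrug']))
```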


Affective Computing and Intelligent Interaction | 2009

Relations between facial display, eye gaze and head tilt: Dominance perception variations of virtual agents

Nikolaus Bee; Stefan Franke; Elisabeth André

In this paper, we focus on facial displays, eye gaze and head tilts as means of expressing social dominance. In particular, we are interested in the interaction of different non-verbal cues. We present a study that systematically varies eye gaze and head tilt for five basic emotions and a neutral state using our own graphics and animation engine. The resulting images are presented via a Web-based interface to a large number of subjects, who are asked to attribute dominance values to the character shown. First, we analyze how dominance ratings are influenced by the conveyed emotional facial expression. We then investigate how gaze direction and head pose influence dominance perception depending on the displayed emotional state.
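
The study design implies a factorial stimulus space; the short sketch below enumerates such a condition grid. The abstract names five basic emotions plus a neutral state without listing them, so the factor levels here are assumptions, not the paper's exact conditions.

```python
from itertools import product

# Assumed factor levels; the paper's actual levels are not reproduced here.
emotions  = ['neutral', 'joy', 'anger', 'fear', 'sadness', 'disgust']
gaze      = ['direct', 'averted_left', 'averted_right']
head_tilt = ['up', 'level', 'down']

# Full factorial crossing: one rendered image per condition.
stimuli = list(product(emotions, gaze, head_tilt))
print(len(stimuli), 'conditions, e.g.', stimuli[0])
```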


Multimodal Corpora | 2009

Creating standardized video recordings of multimodal interactions across cultures

Matthias Rehm; Elisabeth André; Nikolaus Bee; Birgit Endrass; Michael Wissner; Yukiko I. Nakano; Afia Akhter Lipi; Toyoaki Nishida; Hung-Hsuan Huang

Adapting the behavior of an interactive system to the cultural background of the user requires information on how the relevant behaviors differ as a function of that background. When one tries to gain such insights into the interrelation of culture and behavior patterns, the information found in the literature often proves too anecdotal to serve as the basis for modeling a system's behavior, making it necessary to collect multimodal corpora in a standardized fashion across cultures. In this chapter, the challenges of such an endeavor are introduced and solutions are presented through examples from a German-Japanese project that aims at modeling culture-specific behaviors for Embodied Conversational Agents.

Collaboration


Dive into Nikolaus Bee's collaborations.

Top Co-Authors

Thurid Vogt

University of Augsburg


Helmut Prendinger

National Institute of Informatics
