Publications


Featured research published by Felix Kistler.


ACM Multimedia | 2013

The social signal interpretation (SSI) framework: multimodal signal processing and recognition in real-time

Johannes Wagner; Florian Lingenfelser; Tobias Baur; Ionut Damian; Felix Kistler; Elisabeth André

Automatic detection and interpretation of social signals carried by voice, gestures, facial expressions, etc. will play a key role for next-generation interfaces, as it paves the way towards more intuitive and natural human-computer interaction. This paper introduces Social Signal Interpretation (SSI), a framework for real-time recognition of social signals. SSI supports a large range of sensor devices, filter and feature algorithms, as well as machine learning and pattern recognition tools. It encourages developers to add new components using SSI's C++ API, but also addresses front-end users by offering an XML interface for building pipelines with a text editor. SSI is freely available under the GPL at http://openssi.net.
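To make the component architecture sketched in this abstract more concrete, here is a minimal, purely illustrative pipeline in Python (sensor, filter, feature extraction, recognition, driven by a real-time loop). All class and function names are invented for this example and do not correspond to SSI's actual C++ API or XML pipeline syntax.

```python
# Conceptual sketch of a component-based, real-time signal pipeline in the
# spirit described above (sensor -> filter -> feature extractor -> recognizer).
# Names are illustrative only and are NOT taken from the SSI framework.
import random
from typing import Callable, List


class Sensor:
    """Simulated sensor that yields one frame of raw samples per tick."""
    def read(self) -> List[float]:
        return [random.uniform(-1.0, 1.0) for _ in range(16)]


def lowpass(frame: List[float]) -> List[float]:
    """Trivial smoothing filter: average each sample with its predecessor."""
    return [(a + b) / 2.0 for a, b in zip(frame, [frame[0]] + frame[:-1])]


def energy(frame: List[float]) -> float:
    """Feature: mean squared amplitude of the frame."""
    return sum(x * x for x in frame) / len(frame)


def threshold_recognizer(feature: float) -> str:
    """Stand-in for a trained classifier: maps the feature to a label."""
    return "active" if feature > 0.3 else "idle"


def run_pipeline(sensor: Sensor,
                 filters: List[Callable],
                 extract: Callable,
                 recognize: Callable,
                 ticks: int = 5) -> None:
    """Pull frames from the sensor and push them through filters, feature
    extraction and the recognizer once per tick (the real-time loop)."""
    for t in range(ticks):
        frame = sensor.read()
        for f in filters:
            frame = f(frame)
        label = recognize(extract(frame))
        print(f"tick {t}: {label}")


if __name__ == "__main__":
    run_pipeline(Sensor(), [lowpass], energy, threshold_recognizer)
```

In SSI itself, the abstract indicates that an equivalent pipeline would be declared via the XML interface and executed by the framework's C++ runtime rather than assembled in code like this.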


Journal on Multimodal User Interfaces | 2012

Natural interaction with culturally adaptive virtual characters

Felix Kistler; Birgit Endrass; Ionut Damian; Chi Tai Dang; Elisabeth André

Recently, the verbal and non-verbal behavior of virtual characters has become more and more sophisticated due to advances in behavior planning and rendering. Nevertheless, the appearance and behavior of these characters are in most cases based on the cultural background of their designers. Especially in combination with new natural interaction interfaces, there is a risk that characters developed for one culture might not find acceptance when presented to another culture. A few attempts have been made to create characters that reflect a particular cultural background. However, interaction with these characters still remains an awkward experience, in particular when it comes to non-verbal interaction. In many cases, human users either have to choose the actions their avatar should execute from a menu, or they have to struggle with obtrusive interaction devices. In contrast, our paper combines an approach to the generation of culture-specific behaviors with full body avatar control based on the Kinect sensor. A first study revealed that users are able to easily control an avatar through their body movements and immediately adapt its behavior to the cultural background of the agents they interact with.


International Conference on Social Robotics | 2012

User-defined body gestures for navigational control of a humanoid robot

Mohammad Obaid; Markus Häring; Felix Kistler; René Bühling; Elisabeth André

This paper presents a study that allows users to define intuitive gestures to navigate a humanoid robot. For eleven navigational commands, 385 gestures, performed by 35 participants, were analyzed. The results of the study reveal user-defined gesture sets for both novice and expert users. In addition, we present a taxonomy of the user-defined gesture sets, agreement scores for the gesture sets, time performances of the gesture motions, and implications for the design of robot control, with a focus on recognition and user interfaces.
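The agreement scores mentioned above are, in gesture elicitation studies, typically computed with the agreement metric of Wobbrock et al.: for each referent, the proposed gestures are grouped into sets of identical gestures, and the squared relative group sizes are summed. Whether this paper uses exactly that variant is an assumption; the following Python sketch with made-up data only illustrates the general idea.

```python
# Minimal sketch of the agreement score commonly used in gesture elicitation
# studies: for a referent r, A(r) = sum over identical-gesture groups P_i of
# (|P_i| / |P_r|)^2. The example data below are invented for illustration.
from collections import Counter
from typing import Dict, List


def agreement(proposals: List[str]) -> float:
    """Agreement score for one referent given the gestures users proposed."""
    total = len(proposals)
    groups = Counter(proposals)  # identical proposals form one group
    return sum((size / total) ** 2 for size in groups.values())


if __name__ == "__main__":
    # Hypothetical proposals for two referents (not data from the paper).
    observed: Dict[str, List[str]] = {
        "move forward": ["push both hands", "push both hands", "lean forward",
                         "push both hands", "step forward"],
        "stop": ["raise palm", "raise palm", "raise palm", "cross arms",
                 "raise palm"],
    }
    for referent, gestures in observed.items():
        print(f"{referent}: A = {agreement(gestures):.2f}")
```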


International Conference on Interactive Digital Storytelling | 2011

Full body gestures enhancing a game book for interactive story telling

Felix Kistler; Dominik Sollfrank; Nikolaus Bee; Elisabeth André

Game books can offer a well-written but non-linear story, as readers always have to decide how to continue after reading a text passage. It seems very natural to adapt such a book to investigate interaction paradigms for an interactive storytelling scenario. Nevertheless, it is not easy to keep the player motivated during long narrated story passages until the next point of intervention is reached. In this paper, we tested different methods of implementing the decision process in such a scenario using speech input, evaluated with 26 participants in a two-player setting. This revealed that omitting the on-screen prompt made the application less easy to use but caused considerably more user interaction. We further added interactivity with so-called Quick Time Events (QTEs): the player has a limited amount of time to perform a specific action after a corresponding prompt appears on screen. Different versions of QTEs were implemented using full body tracking with Microsoft Kinect and were tested with another 18 participants in a two-player setting. We found that full body gestures were easier to perform and, in general, preferred over controlling a cursor with one hand and hitting buttons with it.
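As a rough illustration of the Quick Time Event mechanism described above (a prompt appears, and the player must perform the requested action within a fixed time window), here is a hypothetical Python sketch. The gesture check is stubbed out; this is not the authors' implementation, which used full body tracking with Microsoft Kinect.

```python
# Illustrative Quick Time Event loop: show a prompt, then poll a (stubbed)
# gesture recognizer until the time window expires.
import random
import time


def gesture_detected(expected: str) -> bool:
    """Stub for a full-body gesture recognizer; randomly 'detects' the gesture."""
    return random.random() < 0.2


def run_qte(expected_gesture: str, window_seconds: float = 3.0) -> bool:
    """Show a prompt and poll the recognizer until the deadline passes."""
    print(f"PROMPT: perform '{expected_gesture}' within {window_seconds:.0f}s!")
    deadline = time.monotonic() + window_seconds
    while time.monotonic() < deadline:
        if gesture_detected(expected_gesture):
            print("Success: gesture recognized in time.")
            return True
        time.sleep(0.1)  # poll at ~10 Hz
    print("Failed: time window expired.")
    return False


if __name__ == "__main__":
    run_qte("raise both arms")
```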


Proceedings of the 20th International Academic Mindtrek Conference | 2016

How would you gesture navigate a drone?: a user-centered approach to control a drone

Mohammad Obaid; Felix Kistler; Gabrielė Kasparavičiūtė; Asim Evren Yantaç; Morten Fjeld

Gestural interaction with flying drones is on the rise; however, little work has been done to elicit gestural preferences directly from users. In this paper, we present an elicitation study to help realize user-defined gestures for drone navigation. We apply a user-centered approach in which we collected data from 25 participants performing gestural interactions for twelve drone actions, of which ten are navigational actions. The analysis of the 300 gestures collected from our participants reveals a user-defined set of suitable gestures to control a drone. We report results that can be used by software developers, engineers, or designers, including a taxonomy for the set of user-defined gestures, gestural agreement scores, time performances, and subjective ratings for each action. Finally, we discuss the gestural set with implementation insights and conclude with future directions.


Intelligent Virtual Agents | 2012

Cultural behaviors of virtual agents in an augmented reality environment

Mohammad Obaid; Ionut Damian; Felix Kistler; Birgit Endrass; Johannes Wagner; Elisabeth André

This paper presents a pilot evaluation study that investigates the physiological response of users when interacting with virtual agents that exhibit culture-specific behaviors in an Augmented Reality environment. In particular, we analyze users from Arab and German cultural backgrounds. The initial results of our analysis are promising and show that users tend to have higher physiological arousal towards virtual agents that do not exhibit behaviors of their own cultural background.


International Conference on Human-Computer Interaction | 2013

User-Defined Body Gestures for an Interactive Storytelling Scenario

Felix Kistler; Elisabeth André

To improve full body interaction in an interactive storytelling scenario, we conducted a study to obtain a user-defined gesture set. 22 users performed 251 gestures while running through the story script with real interaction disabled, but with hints about which set of actions the application currently requested. We describe our interaction design process, starting with the study itself, continuing with the analysis of the recorded data, including the creation of a gesture taxonomy and the selection of gesture candidates, and ending with the integration of the gestures into our application.


International Conference on Human-Computer Interaction | 2013

Traveller: An Interactive Cultural Training System Controlled by User-Defined Body Gestures

Felix Kistler; Elisabeth André; Samuel Mascarenhas; André Silva; Ana Paiva; Nick Degens; Gert Jan Hofstede; Eva Krumhuber; Arvid Kappas; Ruth Aylett

In this paper, we describe a cultural training system based on an interactive storytelling approach and a culturally adaptive agent architecture, for which a user-defined gesture set was created. 251 full body gestures by 22 users were analyzed to find intuitive gestures for the in-game actions in our system. After the analysis, we integrated the gestures into our application using our framework for full body gesture recognition. We further integrated a second interaction type that uses a graphical interface controlled with freehand swiping gestures.


Augmented Human International Conference | 2013

Augmented reality using a 3D motion capturing suit

Ionut Damian; Mohammad Obaid; Felix Kistler; Elisabeth André

In this paper, we propose an approach that immerses the human user in an Augmented Reality (AR) environment using an inertial motion capturing suit and a head-mounted display system. The proposed approach allows for full body interaction with the AR environment in real time and does not require any markers or cameras.


International Conference on Human-Computer Interaction | 2014

Effects of Language Variety on Personality Perception in Embodied Conversational Agents

Brigitte Krenn; Birgit Endrass; Felix Kistler; Elisabeth André

In this paper, we investigate the effects of language variety in combination with bodily behaviour on the perceived personality of a virtual agent. In particular, we explore changes along the extroversion-introversion dimension of personality. An online perception study was conducted featuring a virtual character with different levels of expressive body behaviour and different synthetic voices representing German and Austrian language varieties. Clear evidence was found that synthesized language variety and gestural expressivity influence the human perception of an agent's extroversion, with the Viennese and Austrian standard varieties being perceived as more extroverted than the German standard.

Collaboration


Dive into Felix Kistler's collaborations.

Top Co-Authors

André Silva (Instituto Superior Técnico)
Samuel Mascarenhas (Instituto Superior Técnico)
Ruth Aylett (Heriot-Watt University)
Gert Jan Hofstede (Wageningen University and Research Centre)
Arvid Kappas (Jacobs University Bremen)
Nick Degens (Wageningen University and Research Centre)