Patrick Holthaus
Bielefeld University
Publications
Featured research published by Patrick Holthaus.
International Journal of Social Robotics | 2011
Patrick Holthaus; Karola Pitsch; Sven Wachsmuth
Social interaction between humans takes place in the spatial environment on a daily basis. We occupy space for ourselves and respect the dynamics of spaces that are occupied by others. In human-robot interaction, spatial models are commonly used for structuring relatively far-away interactions or passing-by scenarios. This work instead focuses on the transition between distant and close communication for an interaction opening. We applied a spatial model to a humanoid robot and implemented an attention system that is connected to it. The resulting behaviors have been verified in an online video study. The questionnaire revealed that these behaviors are applicable and result in a robot that is perceived as more interested in the human and that shows its attention and intentions earlier and to a higher degree than with other strategies.
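The abstract gives no implementation details, so the following is only a minimal Python sketch of the general idea: classify interpersonal distance into Hall's proxemic zones and gate a coarse attention behavior on the transition from distant to close interaction. All names, thresholds, and behavior labels are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch (not the paper's implementation): map the distance to a
# person onto Hall's proxemic zones and choose an attention behavior so that
# interest is signaled early, before close-up interaction begins.
from enum import Enum

class Zone(Enum):
    INTIMATE = "intimate"   # < 0.45 m
    PERSONAL = "personal"   # 0.45 - 1.2 m
    SOCIAL = "social"       # 1.2 - 3.6 m
    PUBLIC = "public"       # > 3.6 m

def classify_zone(distance_m: float) -> Zone:
    if distance_m < 0.45:
        return Zone.INTIMATE
    if distance_m < 1.2:
        return Zone.PERSONAL
    if distance_m < 3.6:
        return Zone.SOCIAL
    return Zone.PUBLIC

def attention_behavior(distance_m: float) -> str:
    """Pick a coarse attention strategy for the detected zone (toy labels)."""
    zone = classify_zone(distance_m)
    if zone == Zone.PUBLIC:
        return "glance"           # brief gaze to signal awareness early
    if zone == Zone.SOCIAL:
        return "track_and_greet"  # sustained gaze plus greeting gesture
    return "engage"               # face the person and open the dialogue

if __name__ == "__main__":
    for d in (5.0, 2.0, 0.8):
        print(f"{d} m -> {attention_behavior(d)}")
```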
Robot and Human Interactive Communication | 2011
Frank Hegel; Sebastian Gieselmann; Annika Peters; Patrick Holthaus; Britta Wrede
In this paper, we present a first step towards a typology of relevant signals and cues in human-robot interaction (HRI). In human as well as in animal communication systems, signals and cues play an important role for senders and receivers of such signs. In our typology, we systematically distinguish between a robot's signals and cues, which are either designed to be human-like or artificial, to create meaningful information. Subsequently, developers and designers should be aware of which signs affect a user's judgements of social robots. For this reason, we first review several signals and cues that have already been successfully used in HRI with regard to our typology. Second, we discuss crucial human-like and artificial cues which have so far not been considered in the design of social robots, although they are highly likely to affect a user's judgement of social robots.
International Conference on Social Robotics | 2010
Patrick Holthaus; Ingo Lütkebohle; Marc Hanheide; Sven Wachsmuth
Social interaction between humans takes place in the spatial dimension on a daily basis. We occupy space for ourselves and respect the dynamics of spaces that are occupied by others. In human-robot interaction, the focus has been on other topics so far. Therefore, this work applies a spatial model to a humanoid robot and implements an attention system that is connected to it. The resulting behaviors have been verified in an on-line video study. The questionnaire revealed that these behaviors are applicable and result in a robot that has been perceived as more interested in the human and shows its attention and intentions to a higher degree.
Künstliche Intelligenz | 2017
Sebastian Wrede; Christian Leichsenring; Patrick Holthaus; Thomas Hermann; Sven Wachsmuth
The emergence of cognitive interaction technology offering intuitive and personalized support for humans in daily routines is essential for the success of future smart environments. Social robotics and ambient assisted living are well-established, active research fields, but in the real world the number of smart environments that support humans efficiently on a daily basis is still rather low. We argue that research on ambient intelligence and human–robot interaction needs to be conducted in a strongly interdisciplinary process to facilitate seamless integration of assistance technologies into the users' daily lives. With the cognitive service robotics apartment (CSRA), we are developing a novel kind of laboratory following this interdisciplinary approach. It combines a smart home with ambient intelligence functionalities and a cognitive social robot with advanced manipulation capabilities to explore the all-day use of cognitive interaction technology for human assistance. This lab, in conjunction with our development approach, opens up new lines of inquiry and allows us to address new research questions in human–machine, human–agent and human–robot interaction.
International Conference on Social Robotics | 2016
Jasmin Bernotat; Birte Schiffhauer; Friederike Anne Eyssel; Patrick Holthaus; Christian Leichsenring; Viktor Richter; Marian Pohling; Birte Carlmeyer; Norman Köster; Sebastian Meyer zu Borgsen; René Zorn; Kai Frederic Engelmann; Florian Lier; Simon Schulz; Rebecca Bröhl; Elena Seibel; Paul Hellwig; Philipp Cimiano; Franz Kummert; David Schlangen; Petra Wagner; Thomas Hermann; Sven Wachsmuth; Britta Wrede; Sebastian Wrede
The purpose of this Wizard-of-Oz study was to explore the intuitive verbal and non-verbal goal-directed behavior of naive participants in an intelligent robotics apartment. Participants had to complete seven mundane tasks, for instance, they were asked to turn on the light. Participants were explicitly instructed to consider nonstandard ways of completing the respective tasks. A multi-method approach revealed that most participants favored speech and interfaces like switches and screens to communicate with the intelligent robotics apartment. However, they required instructions to use the interfaces in order to perceive them as competent targets for human-machine interaction. Hence, first important steps were taken to investigate how to design an intelligent robotics apartment in a user-centered and user-friendly manner.
IEEE-RAS International Conference on Humanoid Robots | 2012
Patrick Holthaus; Sven Wachsmuth
In face-to-face interaction, humans coordinate actions in their surroundings with the help of a well-structured spatial representation. At a dinner table, for example, everybody knows exactly which objects belong to them and where they are allowed to grasp. To have robots, e.g. receptionists, act accordingly, we conducted an on-line survey about the expectations humans have while interacting with such a robot. Results indicate that humans attribute handedness to the robot as well as an awareness of distance and territoriality in its own peripersonal space. In order to align a robot's behavior with these expectations, we have developed a first spatial representation of the robot's peripersonal space.
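As a rough illustration of the kind of peripersonal-space representation the abstract describes, the following toy Python sketch partitions a shared tabletop into robot, human, and shared territories and mirrors handedness. The geometry, thresholds, and function names are assumptions for illustration only, not the paper's model.

```python
# Hypothetical sketch (not the paper's representation): a toy partition of a
# shared tabletop into the robot's own, the human's own, and a shared
# territory, assuming the robot sits at y = 0 and the human faces it.

def territory(x: float, y: float, table_depth: float = 1.0) -> str:
    """Return who 'owns' point (x, y) on the table; robot side is y = 0."""
    if y < table_depth / 3:
        return "robot"
    if y > 2 * table_depth / 3:
        return "human"
    return "shared"

def preferred_arm(x: float, table_width: float = 1.0) -> str:
    """Pick the robot arm on the same side as the target object."""
    return "right" if x > table_width / 2 else "left"

def may_grasp(x: float, y: float) -> bool:
    """Respect territoriality: grasp only in the robot's own or shared space."""
    return territory(x, y) in ("robot", "shared")

if __name__ == "__main__":
    print(territory(0.3, 0.2), preferred_arm(0.7), may_grasp(0.3, 0.9))
```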
Human-Robot Interaction | 2014
Patrick Holthaus; Sven Wachsmuth
In this demonstration, a humanoid robot interacts with an interlocutor through speech and gestures in order to give directions on a map. The interaction is specifically designed to provide an enhanced user experience by being aware of non-verbal social signals. Therefore, we take spatial communicative cues into account and react to them accordingly.
Robot and Human Interactive Communication | 2013
Patrick Holthaus; Sven Wachsmuth
This work-in-progress paper presents an on-line system for robotic heads capable of mimicking humans. The marker-less method solely depends on the interactant's face as an input; it does not rely on a set of basic emotions and is thus capable of displaying a large variety of facial expressions. A preliminary evaluation indicates solid performance with potential for improvement.
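The paper's pipeline is not detailed in the abstract; the minimal Python sketch below only illustrates the general idea of mapping continuous, marker-less facial features directly to head actuators without classifying into discrete basic emotions. The feature names, actuator ranges, and smoothing are assumptions, and the face tracker itself is assumed to be external.

```python
# Hypothetical sketch (not the paper's system): map normalized facial features
# from a marker-less face tracker directly onto robot head actuators, with a
# simple low-pass filter to avoid jittery mimicry. All ranges are made up.

def features_to_actuators(features: dict) -> dict:
    """Scale normalized facial features (0..1) to actuator targets (degrees)."""
    return {
        "brow_left":  features.get("brow_raise_left", 0.5) * 30.0 - 15.0,
        "brow_right": features.get("brow_raise_right", 0.5) * 30.0 - 15.0,
        "jaw":        features.get("mouth_open", 0.0) * 25.0,
        "lip_corner": features.get("smile", 0.5) * 20.0 - 10.0,
    }

def smooth(previous: dict, target: dict, alpha: float = 0.3) -> dict:
    """Blend the new targets with the previous ones (exponential smoothing)."""
    return {k: (1 - alpha) * previous.get(k, v) + alpha * v
            for k, v in target.items()}

if __name__ == "__main__":
    tracked = {"brow_raise_left": 0.8, "brow_raise_right": 0.7,
               "mouth_open": 0.2, "smile": 0.9}
    pose = smooth({}, features_to_actuators(tracked))
    print(pose)
```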
Human-Agent Interaction | 2017
Timo Michalski; Marian Pohling; Patrick Holthaus
Technologies that aim to achieve intelligent automation in smart homes typically involve either trigger-action pairs or machine learning. These, however, are often complex to configure or hard for the user to comprehend. To maximize automation efficiency while keeping the configuration simple and the effects comprehensible, we thus explore an alternative agent-based approach. With the help of a survey, we put together a set of intelligent agents that act autonomously in the environment. Conflicts between behaviors, identified in a secondary study, are resolved with a competitive combination of agents. We finally present the draft of a user interface that allows for individual configuration of all agents.
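As a rough illustration of a competitive combination of agents, the Python sketch below lets independent agents propose device actions with a utility score and resolves conflicts per device by keeping the highest-scoring proposal. The agents, scores, and arbitration rule are illustrative assumptions rather than the paper's actual design.

```python
# Hypothetical sketch (not the paper's architecture): autonomous agents each
# propose device actions with a utility score; conflicting proposals for the
# same device are resolved competitively by picking the highest utility.
from dataclasses import dataclass

@dataclass
class Proposal:
    agent: str
    device: str
    action: str
    utility: float  # higher wins the arbitration

def presence_agent(state: dict) -> list:
    """Turn the light on or off depending on whether someone is in the room."""
    action = "on" if state.get("person_in_room") else "off"
    return [Proposal("presence", "light", action, 0.6)]

def movie_agent(state: dict) -> list:
    """Prefer dimmed light while a movie is playing."""
    if state.get("movie_playing"):
        return [Proposal("movie", "light", "dim", 0.8)]
    return []

def arbitrate(proposals: list) -> dict:
    """Keep only the highest-utility proposal per device."""
    winners = {}
    for p in proposals:
        if p.device not in winners or p.utility > winners[p.device].utility:
            winners[p.device] = p
    return {device: p.action for device, p in winners.items()}

if __name__ == "__main__":
    state = {"person_in_room": True, "movie_playing": True}
    proposals = presence_agent(state) + movie_agent(state)
    print(arbitrate(proposals))  # {'light': 'dim'} - the movie agent wins
```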
International Conference on Multimodal Interfaces | 2016
Patrick Holthaus; Thomas Hermann; Sebastian Wrede; Sven Wachsmuth; Britta Wrede
The first workshop on embodied interaction with smart environments aims to bring together the very active community of multimodal interaction research and the rapidly evolving field of smart home technologies. Besides addressing the software architecture of such complex systems, it puts an emphasis on questions regarding intuitive interaction with the environment. In particular, the role of agency raises interesting challenges in the light of user interaction. We therefore encourage a lively discussion on the design and concepts of social robots and virtual avatars as well as innovative ambient devices and their integration into smart environments.