Publication


Featured research published by Michael L. Walters.


Intelligent Robots and Systems | 2005

What is a robot companion - friend, assistant or butler?

Kerstin Dautenhahn; Sarah Woods; Christina Kaouri; Michael L. Walters; Kheng Lee Koay; Iain Werry

The study presented in this paper explored people's perceptions of and attitudes towards the idea of a future robot companion for the home. A human-centred approach was adopted, using questionnaires and human-robot interaction trials to derive data from 28 adults. Results indicated that a large proportion of participants were in favour of a robot companion and saw its potential role as being an assistant, machine or servant. Few wanted a robot companion to be a friend. Household tasks were preferred to child/animal care tasks. Humanlike communication was desirable for a robot companion, whereas humanlike behaviour and appearance were less essential. Results are discussed in relation to future research directions for the development of robot companions.


Autonomous Robots | 2008

Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion

Michael L. Walters; Dag Sverre Syrdal; Kerstin Dautenhahn; I. René J. A. te Boekhorst; Kheng Lee Koay

This article presents the results of video-based Human-Robot Interaction (HRI) trials which investigated people's perceptions of different robot appearances and of the associated attention-seeking features and behaviors displayed by those robots. The HRI trials studied the participants' preferences for various features of robot appearance and behavior, as well as their personality attributions towards the robots compared to their own personalities. Overall, participants tended to prefer robots with more human-like appearance and attributes. However, systematic individual differences in the dynamic appearance ratings are not consistent with a universal effect: introverts and participants with lower emotional stability tended to prefer the mechanical-looking appearance to a greater degree than other participants. It is also shown that it is possible to rate individual elements of a particular robot's behavior and then assess the contribution, or otherwise, of that element to people's overall perception of the robot. Relating participants' dynamic appearance ratings of individual robots to independent static appearance ratings provided evidence that could be taken to support a portion of the left-hand side of Mori's theoretically proposed 'uncanny valley' diagram. Suggestions for future work are outlined.


Applied Bionics and Biomechanics | 2009

KASPAR: a minimally expressive humanoid robot for human-robot interaction research

Kerstin Dautenhahn; Chrystopher L. Nehaniv; Michael L. Walters; Ben Robins; Hatice Kose-Bagci; Mike Blow

This paper provides a comprehensive introduction to the design of the minimally expressive robot KASPAR, which is particularly suitable for human-robot interaction studies. A low-cost design with off-the-shelf components has been used in a novel design inspired by a multi-disciplinary viewpoint, including comics design and Japanese Noh theatre. The design rationale of the robot and its technical features are described in detail. Three research studies that have used KASPAR extensively are then presented. Firstly, we present its application in robot-assisted play and therapy for children with autism. Secondly, we illustrate its use in human-robot interaction studies investigating the role of interaction kinesics and gestures. Lastly, we describe a study in the field of developmental robotics into computational architectures based on interaction histories for robot ontogeny. The three areas differ in how the robot is operated and in its role in social interaction scenarios. Each will be introduced briefly and examples of the results will be presented. Reflections on the specific design features of KASPAR that were important in these studies, and lessons learnt from them concerning the design of humanoid robots for social interaction, will also be discussed. An assessment of the robot in terms of the utility of its design for human-robot interaction experiments concludes the paper.


Robot and Human Interactive Communication | 2006

Methodological Issues in HRI: A Comparison of Live and Video-Based Methods in Robot to Human Approach Direction Trials

Sarah Woods; Michael L. Walters; Kheng Lee Koay; Kerstin Dautenhahn

The main aim of this study was to confirm the findings from previous pilot studies that results obtained from the same human-robot interaction (HRI) scenarios were comparable across live and video-based trial methodologies. We investigated how a robot should approach human subjects in various scenarios relevant to the robot fetching an object for the subject: a human subject sitting in an open space, sitting at a table, standing in an open space, and standing against a wall. The subjects experienced the robot approaching from various directions for each of these contexts in both live and video-based HRI trials. There was a high degree of agreement between the results obtained from the live and the video-based trials using the same scenarios. The main findings from both trial methodologies were: humans strongly disliked a direct frontal approach by a robot, especially while sitting (even at a table) or while standing with their back to a wall; an approach from the front left or front right was preferred. When the subject was standing in an open space, a frontal approach was more acceptable, and although a rear approach was not usually most preferred, it was generally acceptable if physically more convenient.


Human-Robot Interaction | 2007

Robotic etiquette: results from user studies involving a fetch and carry task

Michael L. Walters; Kerstin Dautenhahn; Sarah Woods; Kheng Lee Koay

This paper presents results, outcomes and conclusions from a series of Human Robot Interaction (HRI) trials which investigated how a robot should approach a human in a fetch and carry task. Two pilot trials were carried out, aiding the development of a main HRI trial with four different approach contexts under controlled experimental conditions. The findings from the pilot trials were confirmed and expanded upon. Most subjects disliked a frontal approach when seated. In general, seated humans do not like to be approached by a robot directly from the front even when seated behind a table. A frontal approach is more acceptable when a human is standing in an open area. Most subjects preferred to be approached from either the left or right side, with a small overall preference for a right approach by the robot. However, this is not a strong preference and it may be disregarded if it is more physically convenient to approach from a left front direction. Handedness and occupation were not related to these preferences. Subjects do not usually like the robot to move or approach from directly behind them, preferring the robot to be in view even if this means the robot taking a physically non-optimum path. The subjects for the main HRI trials had no previous experience of interacting with robots. Future research aims are outlined and include the necessity of carrying out longitudinal trials to see if these findings hold over a longer period of exposure to robots.


Robot and Human Interactive Communication | 2011

A long-term Human-Robot Proxemic study

Michael L. Walters; Mohammadreza Asghari Oskoei; Dag Sverre Syrdal; Kerstin Dautenhahn

A long-term Human-Robot Proxemic (HRP) study was performed using a newly developed Autonomous Proxemic System (APS), which allowed a robot to measure and control its approach distances to the human participants. The main findings were that most HRP adaptation occurred in the first two interaction sessions; for the remaining four weeks, approach distance preferences remained relatively steady, apart from some short periods of increased distances for some participants. There were indications that these periods were associated with episodes in which the robot malfunctioned, raising the possibility that users' trust in the robot affects HRP distance. The study also found that approach distances for humans approaching the robot and for the robot approaching humans were comparable, though there were indications that humans preferred to approach the robot more closely than they allowed the robot to approach them in a physically restricted area. Two participants left the study prematurely, stating that they were bored with the repetitive experimental procedures. This highlights issues related to the often incompatible demands of maintaining controlled experimental conditions vs. providing realistic, engaging and varied HRI trial scenarios.
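The abstract does not describe how the APS regulates approach distance. Purely as an illustration of the general idea of controlling a robot's approach to a person's preferred proxemic distance, a proportional stopping rule might look like the following sketch; the function name and all numeric values are hypothetical, not taken from the paper:

```python
# Illustrative sketch only (not the paper's APS): a proportional rule that
# advances a robot toward a person but stops at a preferred proxemic distance.

def approach_step(current_dist, preferred_dist=0.8, gain=0.5, max_step=0.3):
    """Return how far (in metres) the robot should advance this control step.

    preferred_dist stands in for a participant's measured comfort distance;
    gain and max_step are made-up tuning values for illustration.
    """
    error = current_dist - preferred_dist
    if error <= 0:          # already at or inside the preferred distance
        return 0.0
    return min(gain * error, max_step)

# Closing from 3 m: steps shrink as the robot nears the 0.8 m boundary.
d = 3.0
while (step := approach_step(d)) > 0.001:
    d -= step
print(round(d, 2))  # prints 0.8
```

The proportional term makes the robot decelerate smoothly as it nears the person, which is one plausible way to avoid the abrupt close approaches that proxemics studies report as uncomfortable.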


Artificial Life | 2013

Hey! There is someone at your door. A hearing robot using visual communication signals of hearing dogs to communicate intent

Kheng Lee Koay; Gabriella Lakatos; Dag Sverre Syrdal; Márta Gácsi; Boróka Bereczky; Kerstin Dautenhahn; Ádám Miklósi; Michael L. Walters

This paper presents a study of the readability of dog-inspired visual communication signals in a human-robot interaction scenario. This study was motivated by specially trained hearing dogs, which provide assistance to their deaf owners by using visual communication signals to lead them to a sound source. For our human-robot interaction scenario, a robot was used in place of a hearing dog to lead participants to two different sound sources. The robot was preprogrammed with dog-inspired behaviors, controlled by a wizard who directly implemented the dog behavioral strategy on the robot during the trial. By using dog-inspired visual communication signals as a means of communication, the robot was able to lead participants to the sound sources (the microwave door, the front door). Findings indicate that untrained participants could correctly interpret the robot's intentions. Head movements and gaze directions were important for communicating the robot's intention using visual communication signals.


Robot and Human Interactive Communication | 2005

Hey, I'm over here - How can a robot attract people's attention?

Markus Finke; Kheng Lee Koay; Kerstin Dautenhahn; Chrystopher L. Nehaniv; Michael L. Walters; Joe Saunders

This paper describes how sonar sensors can be used to recognize human movements. The robot distinguishes objects from humans by assuming that only people move by themselves. Two classification methods, using either hand-crafted rules or hidden Markov models, are described. The robot classifies different movements to provide a basis for judging whether a person is interested in an interaction. A comparison of the results of two experiments is presented. The use of orienting cues by the robot, in response to detected human movement, for eliciting interaction is also studied.
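Neither the rule set nor the HMM formulation is given in this abstract. As an illustration only of the paper's core assumption that only people move by themselves, a minimal rule-based discriminator might threshold the change between successive sonar range readings; the function name, the fixed-interval reading format, and the threshold value below are all hypothetical:

```python
# Illustrative sketch only (not the paper's implementation): classify a
# sequence of sonar range readings as a static object or a moving person,
# using the assumption that only people move by themselves.

def classify_sonar_track(ranges, move_threshold=0.15):
    """Label a range-reading sequence (metres) as 'person' or 'object'.

    A track whose reading-to-reading change ever exceeds move_threshold is
    taken to be a self-moving person; otherwise a static object. The
    threshold is a made-up value for illustration.
    """
    deltas = [abs(b - a) for a, b in zip(ranges, ranges[1:])]
    return "person" if any(d > move_threshold for d in deltas) else "object"

print(classify_sonar_track([2.0, 2.01, 1.99, 2.0]))  # steady echo -> prints "object"
print(classify_sonar_track([2.0, 1.7, 1.4, 1.1]))    # closing range -> prints "person"
```

An HMM-based variant, as the paper's second method, would instead model sequences of such range deltas probabilistically, which tolerates sensor noise better than a hard threshold.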


Robot and Human Interactive Communication | 2010

Video prototyping of dog-inspired non-verbal affective communication for an appearance constrained robot

Dag Sverre Syrdal; Kheng Lee Koay; Márta Gácsi; Michael L. Walters; Kerstin Dautenhahn

This paper presents results from a video human-robot interaction (VHRI) study in which participants viewed a video of an appearance-constrained Pioneer robot using dog-inspired affective cues to communicate affinity and relationship with its owner and a guest, through proxemics, body movement and orientation, and camera orientation. The findings suggest that, even with the limited modalities for non-verbal expression offered by a Pioneer robot, which does not have a dog-like appearance, these cues were effective for non-verbal affective communication.


Robot and Human Interactive Communication | 2008

Evaluating extrovert and introvert behaviour of a domestic robot — a video study

Manja Lohse; Marc Hanheide; Britta Wrede; Michael L. Walters; Kheng Lee Koay; Dag Sverre Syrdal; Anders Green; Helge Hüttenrauch; Kerstin Dautenhahn; Gerhard Sagerer; Kerstin Severinson-Eklundh

This paper presents human-robot interaction (HRI) research into social robots that must be able to interact with inexperienced users. The design of such robots adopts many research findings from human-human interaction and human-computer interaction, but the direct applicability of these theories is limited because a robot is different from both humans and computers. New methods therefore have to be developed in HRI in order to build robots that are suitable for inexperienced users. In this paper we present a video study conducted with our robot BIRON (Bielefeld robot companion), which is designed for use in domestic environments. Subjects watched the system during interaction with a human and rated two different robot behaviours (extrovert and introvert), which differed in the robot's verbal output and person-following. Aiming to improve human-robot interaction, participants' ratings of the behaviours were evaluated and compared.

Collaboration


An overview of Michael L. Walters's closest collaborators.

Top Co-Authors

Kerstin Dautenhahn (University of Hertfordshire)
Kheng Lee Koay (University of Hertfordshire)
Dag Sverre Syrdal (University of Hertfordshire)
Sarah Woods (University of Hertfordshire)
Joe Saunders (University of Hertfordshire)
Alessandra Rossi (University of Hertfordshire)
Alex May (University of Hertfordshire)
Anna Dumitriu (University of Hertfordshire)