
Publication


Featured research published by Kheng Lee Koay.


intelligent robots and systems | 2005

What is a robot companion - friend, assistant or butler?

Kerstin Dautenhahn; Sarah Woods; Christina Kaouri; Michael L. Walters; Kheng Lee Koay; Iain Werry

The study presented in this paper explored people's perceptions and attitudes towards the idea of a future robot companion for the home. A human-centred approach was adopted using questionnaires and human-robot interaction trials to derive data from 28 adults. Results indicated that a large proportion of participants were in favour of a robot companion and saw the potential role as being an assistant, machine or servant. Few wanted a robot companion to be a friend. Household tasks were preferred to child/animal care tasks. Humanlike communication was desirable for a robot companion, whereas humanlike behaviour and appearance were less essential. Results are discussed in relation to future research directions for the development of robot companions.


human-robot interaction | 2006

How may I serve you?: a robot companion approaching a seated person in a helping context

Kerstin Dautenhahn; Mick L. Walters; Sarah Woods; Kheng Lee Koay; Chrystopher L. Nehaniv; A. Sisbot; Rachid Alami; Thierry Siméon

This paper presents the combined results of two studies that investigated how a robot should best approach and place itself relative to a seated human subject. Two live Human Robot Interaction (HRI) trials were performed involving a robot fetching an object that the human had requested, using different approach directions. Results of the trials indicated that most subjects disliked a frontal approach, except for a small minority of females, and most subjects preferred to be approached from either the left or right side, with a small overall preference for a right approach by the robot. Handedness and occupation were not related to these preferences. We discuss the results of the user studies in the context of developing a path planning system for a mobile robot.


Autonomous Robots | 2008

Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion

Michael L. Walters; Dag Sverre Syrdal; Kerstin Dautenhahn; I. René J. A. te Boekhorst; Kheng Lee Koay

This article presents the results of video-based Human Robot Interaction (HRI) trials which investigated people's perceptions of different robot appearances and the associated attention-seeking features and behaviors displayed by those robots. The HRI trials studied the participants' preferences for various features of robot appearance and behavior, as well as their personality attributions towards the robots compared to their own personalities. Overall, participants tended to prefer robots with more human-like appearance and attributes. However, systematic individual differences in the dynamic appearance ratings are not consistent with a universal effect. Introverts and participants with lower emotional stability tended to prefer the mechanical-looking appearance to a greater degree than other participants. It is also shown that it is possible to rate individual elements of a particular robot's behavior and then assess the contribution, or otherwise, of that element to the overall perception of the robot by people. Relating participants' dynamic appearance ratings of individual robots to independent static appearance ratings provided evidence that could be taken to support a portion of the left-hand side of Mori's theoretically proposed 'uncanny valley' diagram. Suggestions for future work are outlined.


robot and human interactive communication | 2006

Methodological Issues in HRI: A Comparison of Live and Video-Based Methods in Robot to Human Approach Direction Trials

Sarah Woods; Michael L. Walters; Kheng Lee Koay; Kerstin Dautenhahn

The main aim of this study was to confirm the findings from previous pilot studies that results obtained from the same human-robot interaction (HRI) scenarios in trials using both video-based and live methodologies were comparable. We investigated how a robot should approach human subjects in various scenarios relevant to the robot fetching an object for the subject. These scenarios included a human subject sitting in an open space, sitting at a table, standing in an open space and standing against a wall. The subjects experienced the robot approaching from various directions for each of these contexts in HRI trials that were both live and video-based. There was a high degree of agreement between the results obtained from the live and video-based trials using the same scenarios. The main findings from both types of trial methodology were: humans strongly disliked a direct frontal approach by a robot, especially while sitting (even at a table) or while standing with their back to a wall. An approach from the front left or front right was preferred. When standing in an open space a frontal approach was more acceptable and, although a rear approach was not usually most preferred, it was generally acceptable to subjects if physically more convenient.


human-robot interaction | 2007

Robotic etiquette: results from user studies involving a fetch and carry task

Michael L. Walters; Kerstin Dautenhahn; Sarah Woods; Kheng Lee Koay

This paper presents results, outcomes and conclusions from a series of Human Robot Interaction (HRI) trials which investigated how a robot should approach a human in a fetch and carry task. Two pilot trials were carried out, aiding the development of a main HRI trial with four different approach contexts under controlled experimental conditions. The findings from the pilot trials were confirmed and expanded upon. Most subjects disliked a frontal approach when seated. In general, seated humans do not like to be approached by a robot directly from the front even when seated behind a table. A frontal approach is more acceptable when a human is standing in an open area. Most subjects preferred to be approached from either the left or right side, with a small overall preference for a right approach by the robot. However, this is not a strong preference and it may be disregarded if it is more physically convenient to approach from a left front direction. Handedness and occupation were not related to these preferences. Subjects do not usually like the robot to move or approach from directly behind them, preferring the robot to be in view even if this means the robot taking a physically non-optimum path. The subjects for the main HRI trials had no previous experience of interacting with robots. Future research aims are outlined and include the necessity of carrying out longitudinal trials to see if these findings hold over a longer period of exposure to robots.


robot and human interactive communication | 2008

Human approach distances to a mechanical-looking robot with different robot voice styles

Mick L. Walters; Dag Sverre Syrdal; Kheng Lee Koay; Kerstin Dautenhahn; R. Te Boekhorst

Findings are presented from a Human Robot Interaction (HRI) Demonstration Trial where attendees approached a stationary mechanical-looking robot to a comfortable distance. Instructions were given to participants by the robot using either a high-quality male voice, a high-quality female voice, a neutral synthesized voice, or by the experimenter (no robot voice). Approaches to the robot with the synthesized voice were found to induce significantly greater approach distances. Those who had experienced a previous encounter with the robot tended to approach closer to it. Possible reasons for this are discussed.


ieee-ras international conference on humanoid robots | 2005

Is this robot like me? Links between human and robot personality traits

Sarah Woods; Kerstin Dautenhahn; Christina Kaouri; Rene te Boekhorst; Kheng Lee Koay

A relatively unexplored question for human-robot social interaction is whether a robot's personality should match that of the human user, or be different in the sense that humans do not want the robot to be like them. In this study, 28 adults interacted individually with a non-humanoid robot that demonstrated two robot behaviour styles (socially interactive, socially ignorant) in a simulated living room situation. Questionnaires assessed the extent to which adult ratings of their own personality traits were similar or different to the two robot behaviours. Results revealed that overall subjects did not view their own personality as similar to either of the two robot behaviour styles. Subjects viewed themselves as having stronger personality characteristics compared to the two robot behaviour styles. Important group differences were found: factors such as subject gender, age and technological experience influenced how subjects viewed their personality as being similar to the robot personality. Design implications for future studies are discussed.


Connection Science | 2006

Exploratory studies on social spaces between humans and a mechanical-looking robot

Mick L. Walters; Kerstin Dautenhahn; Sarah Woods; Kheng Lee Koay; R. Te Boekhorst; David Lee

The results from two empirical studies of human–robot interaction are presented. The first study involved the subject approaching the static robot and the robot approaching the standing subject. In these trials a small majority of subjects preferred a distance corresponding to the ‘personal zone’ typically used by humans when talking to friends. However, a large minority of subjects got significantly closer, suggesting that they treated the robot differently from a person, and possibly did not view the robot as a social being. The second study involved a scenario where the robot fetched an object that the seated subject had requested, arriving from different approach directions. The results of this second trial indicated that most subjects disliked a frontal approach. Most subjects preferred to be approached from either the left or right side, with a small overall preference for a right approach by the robot. Implications for future work are discussed.


Artificial Life | 2013

Hey! There is someone at your door. A hearing robot using visual communication signals of hearing dogs to communicate intent

Kheng Lee Koay; Gabriella Lakatos; Dag Sverre Syrdal; Márta Gácsi; Boróka Bereczky; Kerstin Dautenhahn; Ádám Miklósi; Michael L. Walters

This paper presents a study of the readability of dog-inspired visual communication signals in a human-robot interaction scenario. This study was motivated by specially trained hearing dogs, which provide assistance to their deaf owners by using visual communication signals to lead them to a sound source. For our human-robot interaction scenario, a robot was used in place of a hearing dog to lead participants to two different sound sources. The robot was preprogrammed with dog-inspired behaviors, controlled by a wizard who directly implemented the dog behavioral strategy on the robot during the trial. By using dog-inspired visual communication signals as a means of communication, the robot was able to lead participants to the sound sources (the microwave door, the front door). Findings indicate that untrained participants could correctly interpret the robot's intentions. Head movements and gaze directions were important for communicating the robot's intention using visual communication signals.


robot and human interactive communication | 2005

Hey, I'm over here - How can a robot attract people's attention?

Markus Finke; Kheng Lee Koay; Kerstin Dautenhahn; Chrystopher L. Nehaniv; Michael L. Walters; Joe Saunders

This paper describes how sonar sensors can be used to recognize human movements. The robot distinguishes objects from humans by assuming that only people move by themselves. Two methods using either rules or hidden Markov models are described. The robot classifies different movements to provide a basis for judging if a person is interested in an interaction. A comparison of two experiment results is presented. The use of orienting cues by the robot in response to detected human movement for eliciting interaction is also studied.
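The paper's code is not reproduced in the abstract, but the rule-based idea it describes can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes a tracked sonar target yields a sequence of range readings over consecutive scans, and applies the abstract's core assumption that only people move by themselves, so a target whose range varies beyond sensor noise is classified as human. The function name and the noise threshold are hypothetical choices for illustration.

```python
def classify_sonar_target(ranges, threshold=0.15):
    """Classify a tracked sonar target from successive range readings (metres).

    `ranges` holds distances to the same target over consecutive scans.
    If the observed range variation exceeds `threshold` (a hypothetical
    noise margin), the target is assumed to have moved on its own and is
    labelled "human"; otherwise it is treated as a static "object".
    """
    if len(ranges) < 2:
        return "unknown"  # not enough scans to judge movement
    variation = max(ranges) - min(ranges)
    return "human" if variation > threshold else "object"


# A person walking toward the robot: range shrinks scan by scan.
print(classify_sonar_target([2.0, 1.8, 1.5, 1.2]))    # human
# A wall or chair: readings jitter only within sensor noise.
print(classify_sonar_target([2.0, 2.02, 1.99, 2.01]))  # object
```

The hidden Markov model variant mentioned in the abstract would replace this single threshold rule with per-class models of the range-change sequence, which can additionally distinguish different kinds of movement (e.g. approaching vs. passing by) rather than just moving vs. static.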

Collaboration


Dive into Kheng Lee Koay's collaborations.

Top Co-Authors

Kerstin Dautenhahn, University of Hertfordshire
Michael L. Walters, University of Hertfordshire
Dag Sverre Syrdal, University of Hertfordshire
Sarah Woods, University of Hertfordshire
Joe Saunders, University of Hertfordshire
Mick L. Walters, University of Hertfordshire
R. Te Boekhorst, University of Hertfordshire
Christina Kaouri, University of Hertfordshire
Maha Salem, University of Hertfordshire