
Publications

Featured research published by Dag Sverre Syrdal.


Autonomous Robots | 2008

Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion

Michael L. Walters; Dag Sverre Syrdal; Kerstin Dautenhahn; I. René J. A. te Boekhorst; Kheng Lee Koay

This article presents the results of video-based Human Robot Interaction (HRI) trials which investigated people’s perceptions of different robot appearances and the associated attention-seeking features and behaviors displayed by the robots. The HRI trials studied the participants’ preferences for various features of robot appearance and behavior, as well as their personality attributions towards the robots compared to their own personalities. Overall, participants tended to prefer robots with more human-like appearance and attributes. However, systematic individual differences in the dynamic appearance ratings are not consistent with a universal effect: introverts and participants with lower emotional stability tended to prefer the mechanical-looking appearance to a greater degree than other participants. It is also shown that it is possible to rate individual elements of a particular robot’s behavior and then assess the contribution, or otherwise, of that element to people’s overall perception of the robot. Relating participants’ dynamic appearance ratings of individual robots to independent static appearance ratings provided evidence that could be taken to support a portion of the left-hand side of Mori’s theoretically proposed ‘uncanny valley’ diagram. Suggestions for future work are outlined.


Robot and Human Interactive Communication | 2008

Human approach distances to a mechanical-looking robot with different robot voice styles

Mick L. Walters; Dag Sverre Syrdal; Kheng Lee Koay; Kerstin Dautenhahn; R. Te Boekhorst

Findings are presented from a Human Robot Interaction (HRI) Demonstration Trial where attendees approached a stationary mechanical looking robot to a comfortable distance. Instructions were given to participants by the robot using either a high quality male, a high quality female, a neutral synthesized voice, or by the experimenter (no robot voice). Approaches to the robot with synthesized voice were found to induce significantly further approach distances. Those who had experienced a previous encounter with the robot tended to approach closer to the robot. Possible reasons for this are discussed.


Robot and Human Interactive Communication | 2011

A long-term Human-Robot Proxemic study

Michael L. Walters; Mohammadreza Asghari Oskoei; Dag Sverre Syrdal; Kerstin Dautenhahn

A long-term Human-Robot Proxemic (HRP) study was performed using a newly developed Autonomous Proxemic System (APS) for a robot to measure and control the approach distances to the human participants. The main findings were that most HRP adaptation occurred in the first two interaction sessions, and for the remaining four weeks, approach distance preferences remained relatively steady, apart from some short periods of increased distances for some participants. There were indications that these were associated with episodes where the robot malfunctioned, which raises the possibility of users’ trust in the robot affecting HRP distance. The study also found that approach distances for humans approaching the robot and the robot approaching the human were comparable, though there were indications that humans preferred to approach the robot more closely than they allowed the robot to approach them in a physically restricted area. Two participants left the study prematurely, stating they were bored with the repetitive experimental procedures. This highlights issues related to the often incompatible demands of maintaining controlled experimental conditions vs. providing realistic, engaging and varied HRI trial scenarios.


PLOS ONE | 2013

Robot-Mediated Interviews - How Effective Is a Humanoid Robot as a Tool for Interviewing Young Children?

Luke Jai Wood; Kerstin Dautenhahn; Austen Rainer; Ben Robins; Hagen Lehmann; Dag Sverre Syrdal

Robots have been used in a variety of education, therapy or entertainment contexts. This paper introduces the novel application of humanoid robots for robot-mediated interviews. An experimental study examines how children’s responses towards the humanoid robot KASPAR in an interview context differ in comparison to their interaction with a human in a similar setting. Twenty-one children aged between 7 and 9 took part in this study. Each child participated in two interviews, one with an adult and one with a humanoid robot. Measures include the behavioural coding of the children’s behaviour during the interviews and questionnaire data. The questions in these interviews focused on a special event that had recently taken place in the school. The results reveal that the children interacted with KASPAR very similarly to how they interacted with a human interviewer. The quantitative behaviour analysis revealed that the most notable differences between the interviews with KASPAR and the human were the duration of the interviews, the eye gaze directed towards the different interviewers, and the response time of the interviewers. These results are discussed in light of future work towards developing KASPAR as an ‘interviewer’ for young children in application areas where a robot may have advantages over a human interviewer, e.g. in police, social services, or healthcare applications.


Artificial Life | 2013

Hey! There is someone at your door. A hearing robot using visual communication signals of hearing dogs to communicate intent

Kheng Lee Koay; Gabriella Lakatos; Dag Sverre Syrdal; Márta Gácsi; Boróka Bereczky; Kerstin Dautenhahn; Ádám Miklósi; Michael L. Walters

This paper presents a study of the readability of dog-inspired visual communication signals in a human-robot interaction scenario. This study was motivated by specially trained hearing dogs which provide assistance to their deaf owners by using visual communication signals to lead them to the sound source. For our human-robot interaction scenario, a robot was used in place of a hearing dog to lead participants to two different sound sources. The robot was preprogrammed with dog-inspired behaviors, controlled by a wizard who directly implemented the dog behavioral strategy on the robot during the trial. By using dog-inspired visual communication signals as a means of communication, the robot was able to lead participants to the sound sources (the microwave door, the front door). Findings indicate that untrained participants could correctly interpret the robot’s intentions. Head movements and gaze directions were important for communicating the robot’s intention using visual communication signals.


Robot and Human Interactive Communication | 2010

Video prototyping of dog-inspired non-verbal affective communication for an appearance constrained robot

Dag Sverre Syrdal; Kheng Lee Koay; Márta Gácsi; Michael L. Walters; Kerstin Dautenhahn

This paper presents results from a video human-robot interaction (VHRI) study in which participants viewed a video in which an appearance-constrained Pioneer robot used dog-inspired affective cues to communicate affinity and relationship with its owner and a guest using proxemics, body movement and orientation and camera orientation. The findings suggest that even with the limited modalities for non-verbal expression offered by a Pioneer robot, which does not have a dog-like appearance, these cues were effective for non-verbal affective communication.


Connection Science | 2010

Drum-mate: interaction dynamics and gestures in human-humanoid drumming experiments

Hatice Kose-Bagci; Kerstin Dautenhahn; Dag Sverre Syrdal; Chrystopher L. Nehaniv

This article investigates the role of interaction kinesics in human–robot interaction (HRI). We adopted a bottom-up, synthetic approach towards interactive competencies in robots using simple, minimal computational models underlying the robot’s interaction dynamics. We present two empirical, exploratory studies investigating a drumming experience with a humanoid robot (KASPAR) and a human. In the first experiment, the turn-taking behaviour of the humanoid is deterministic and the non-verbal gestures of the robot accompany its drumming to assess the impact of non-verbal gestures on the interaction. The second experiment studies a computational framework that facilitates emergent turn-taking dynamics, whereby the particular dynamics of turn-taking emerge from the social interaction between the human and the humanoid. The results from the HRI experiments are presented and analysed qualitatively (in terms of the participants’ subjective experiences) and quantitatively (concerning the drumming performance of the human–robot pair). The results point to a trade-off between the subjective evaluation of the drumming experience from the perspective of the participants and the objective evaluation of the drumming performance. A certain number of gestures was preferred as a motivational factor in the interaction. The participants preferred the models underlying the robot’s turn-taking which enable the robot and human to interact more and provide turn-taking closer to ‘natural’ human–human conversations, despite differences in objective measures of drumming behaviour. The results are consistent with the temporal behaviour matching hypothesis previously proposed in the literature, which concerns the effect whereby participants adapt their own interaction dynamics to the robot’s.


International Journal of Social Robotics | 2012

Evaluation of the Robot Assisted Sign Language Tutoring Using Video-Based Studies

Hatice Kose; Rabia Yorganci; Esra H. Algan; Dag Sverre Syrdal

The results are from an on-going study which aims to assist in the teaching of Sign Language (SL) to hearing-impaired children by means of non-verbal communication and imitation-based interaction games between a humanoid robot and the child. In this study, the robot expresses a word in the SL among a set of chosen words using hand movements, body and face gestures, and, having comprehended the word, the child gives relevant feedback to the robot. This paper reports the findings of such an evaluation on a subset of sample words chosen from Turkish Sign Language (TSL) via the comparison of their video representations carried out by human teachers and the Nao H25 robot. Within this study, several surveys and user studies were carried out to reveal the resemblance between the two types of videos involving the performance of the robot simulator and the human teacher for each chosen word. In order to investigate the perceived level of similarity between human and robot behavior, participants with different sign language acquaintance levels and age groups were asked to evaluate the videos using paper-based and online questionnaires. The results of these surveys have been summarized and the most significant factors affecting the comprehension of TSL words have been discussed.


Paladyn: Journal of Behavioral Robotics | 2013

Assistive technology design and development for acceptable robotics companions for ageing years

Farshid Amirabdollahian; R. op den Akker; Sandra Bedaf; Richard Bormann; Heather Draper; Vanessa Evers; J. Gallego Pérez; GertJan Gelderblom; C. Gutierrez Ruiz; David J. Hewson; Ninghang Hu; Ben J. A. Kröse; Hagen Lehmann; Patrizia Marti; H. Michel; H. Prevot-Huille; Ulrich Reiser; Joe Saunders; Tom Sorell; J. Stienstra; Dag Sverre Syrdal; Mick L. Walters; Kerstin Dautenhahn

A new stream of research and development responds to changes in life expectancy across the world. It includes technologies which enhance the well-being of individuals, specifically older people. The ACCOMPANY project focuses on home companion technologies and issues surrounding technology development for assistive purposes. The project responds to some overlooked aspects of technology design, divided into multiple areas such as empathic and social human-robot interaction, robot learning and memory visualisation, and monitoring persons’ activities at home. To bring these aspects together, a dedicated task is identified to ensure technological integration of these multiple approaches on an existing robotic platform, Care-O-Bot®3, in the context of a smart-home environment utilising a multitude of sensor arrays. Formative and summative evaluation cycles are then used to assess the emerging prototype towards identifying acceptable behaviours and roles for the robot, for example as a butler or a trainer, while also comparing user requirements to achieved progress. In a novel approach, the project considers ethical concerns and, by highlighting principles such as autonomy, independence, enablement, safety and privacy, it provides a discussion medium where user views on these principles, and the existing tensions between some of them, for example between privacy or autonomy and safety, can be captured and considered in design cycles and throughout project developments.


Archive | 2011

Companion Migration – Initial Participants’ Feedback from a Video-Based Prototyping Study

Kheng Lee Koay; Dag Sverre Syrdal; Kerstin Dautenhahn; K. Arent; Ł Małek; B. Kreczmer

This chapter presents findings from a user study which investigated users’ perceptions and acceptance of a Companion, and its associated ‘personality’, which migrated between different embodiments (i.e. avatar and robot) to accomplish its tasks. Various issues are discussed, such as the Companion’s migration decisions, retention of the Companion’s identity across different embodiments, personalisation of the Companion, and users’ privacy and control over the technology. Authorisation guidelines for Companions regarding migration, accessing an embodiment, and the data stored in the embodiment are proposed and discussed to inform the future design of migrating Companions.

Collaboration


Dive into Dag Sverre Syrdal's collaborations.

Top Co-Authors

Kerstin Dautenhahn
University of Hertfordshire

Kheng Lee Koay
University of Hertfordshire

Michael L. Walters
University of Hertfordshire

Hagen Lehmann
Istituto Italiano di Tecnologia

Ben Robins
University of Hertfordshire

Luke Jai Wood
University of Hertfordshire

Sandra Bedaf
Zuyd University of Applied Sciences