
Publications


Featured research published by Robin Read.


International Conference on Social Robotics | 2013

Child-Robot Interaction: Perspectives and Challenges

Tony Belpaeme; Paul Baxter; Joachim de Greeff; James Kennedy; Robin Read; Rosemarijn Looije; Mark A. Neerincx; Ilaria Baroni; Mattia Coti Zelati

Child-Robot Interaction (cHRI) is a promising point of entry into the rich challenge of social HRI. Drawing on three years of experience gained in a cHRI research project, this paper offers a view on the opportunities afforded by letting robots interact with children rather than adults, and by situating the interaction in real-world circumstances rather than lab settings. It identifies the main challenges facing the field of cHRI: the technical challenges, while tremendous, might be overcome by moving away from the classical perspective of seeing social cognition as residing inside an agent, towards seeing social cognition as a continuous and self-correcting interaction between two agents.


Human-Robot Interaction | 2012

How to use non-linguistic utterances to convey emotion in child-robot interaction

Robin Read; Tony Belpaeme

Vocal affective displays are vital for achieving engaging and effective Human-Robot Interaction. The same can be said for linguistic interaction; however, while emphasis may be placed upon linguistic interaction, it carries inherent risks: users are bound to a single language, and breakdowns are frequent due to current technical limitations. This work explores the potential of non-linguistic utterances. A recent study is briefly outlined in which school children were asked to rate a variety of non-linguistic utterances on an affective level using a facial gesture tool. Results suggest, for example, that utterance rhythm may be an influential independent factor, whilst the pitch contour of an utterance may have little importance. Evidence for categorical perception of emotions is also presented, an issue that may impact important areas of HRI beyond vocal displays of affect.


Human-Robot Interaction | 2014

Situational context directs how people affectively interpret robotic non-linguistic utterances

Robin Read; Tony Belpaeme

This paper presents an experiment investigating the influence that a situational context has upon how people affectively interpret Non-Linguistic Utterances made by a social robot. Subjects were presented five video conditions showing the robot making both a positive and negative utterance, the robot being subject to an action (e.g. receiving a kiss, or a slap), and then two videos showing the combination of the action and the robot reacting with both the positive and negative utterances. For each video an affective rating of valence was provided based upon how the subjects thought the robot felt given what had happened in the video. This was repeated for 5 different action scenarios. Results show that the affective interpretation of an action appears to override that of an utterance, regardless of the affective charge of the utterance. Furthermore, it is shown that if the meaning of the action and utterance are aligned, the overall interpretation is amplified. These findings are considered with respect to the practical use of utterances during social HRI.


Human-Robot Interaction | 2013

People interpret robotic non-linguistic utterances categorically

Robin Read; Tony Belpaeme

We present results of an experiment probing whether adults exhibit categorical perception when affectively rating robot-like sounds (Non-linguistic Utterances). The experimental design followed the traditional methodology from the psychology domain for measuring categorical perception: stimulus continua for robot sounds were presented to subjects, who were asked to complete a discrimination and an identification task. In the former subjects were asked to rate whether stimulus pairs were affectively different, while in the latter they were asked to rate single stimuli affectively. The experiment confirms that Non-linguistic Utterances can convey affect and that they are drawn towards prototypical emotions, confirming that people show categorical perception at a level of inferred affective meaning when hearing robot-like sounds. We speculate on how these insights can be used to automatically design and generate affect-laden robot-like utterances.


PLOS ONE | 2017

Robot education peers in a situated primary school study: Personalisation promotes child learning

Paul Baxter; Emily J Ashurst; Robin Read; James Kennedy; Tony Belpaeme

The benefit of social robots to support child learning in an educational context over an extended period of time is evaluated. Specifically, the effect of personalisation and adaptation of robot social behaviour is assessed. Two autonomous robots were embedded within two matched classrooms of a primary school for a continuous two week period without experimenter supervision to act as learning companions for the children for familiar and novel subjects. Results suggest that while children in both personalised and non-personalised conditions learned, there was increased child learning of a novel subject exhibited when interacting with a robot that personalised its behaviours, with indications that this benefit extended to other class-based performance. Additional evidence was obtained suggesting that there is increased acceptance of the personalised robot peer over a non-personalised version. These results provide the first evidence in support of peer-robot behavioural personalisation having a positive influence on learning when embedded in a learning environment for an extended period of time.


Human-Robot Interaction | 2014

Non-linguistic utterances should be used alongside language, rather than on their own or as a replacement

Robin Read; Tony Belpaeme

This paper presents the results of a small experiment aimed at determining whether people are comfortable with a social robot that uses robotic Non-Linguistic Utterances alongside natural language, rather than as a replacement for it. The results suggest that while people most prefer a robot that uses only natural language, a robot that combines NLUs and natural language is seen as preferable to a robot that only employs NLUs. This suggests that there is potential for NLUs to be used in combination with natural language. In light of this, potential utilities and motivations for using NLUs in such a manner are outlined.


Human-Robot Interaction | 2013

Using the AffectButton to measure affect in child and adult-robot interaction

Robin Read; Tony Belpaeme

This report presents data showing how the AffectButton, a visual tool for reporting affect, can be used reliably by both adults and children (6-7 years old). Users were asked to identify affective labels, such as scared or surprised, on the AffectButton. We report high inter-rater reliability between adults, between children, and between adults and children. Children perform as well as adults when using the AffectButton, making it an intuitive and reliable tool for letting a wide range of ages report affect.


Science Robotics | 2018

Children conform, adults resist: A robot group induced peer pressure on normative social conformity

Anna-Lisa Vollmer; Robin Read; Dries Trippas; Tony Belpaeme

Children increasingly yielded to social pressure exerted by a group of robots; however, adults resisted being influenced by our robots. People are known to change their behavior and decisions to conform to others, even for obviously incorrect facts. Because of recent developments in artificial intelligence and robotics, robots are increasingly found in human environments, and there, they form a novel social presence. It is as yet unclear whether and to what extent these social robots are able to exert pressure similar to human peers. This study used the Asch paradigm, which shows how participants conform to others while performing a visual judgment task. We first replicated the finding that adults are influenced by their peers but showed that they resist social pressure from a group of small humanoid robots. Next, we repeated the study with 7- to 9-year-old children and showed that children conform to the robots. This raises opportunities as well as concerns for the use of social robots with young and vulnerable cross-sections of society; although conforming can be beneficial, the potential for misuse and the potential impact of erroneous performance cannot be ignored.


Human-Robot Interaction | 2014

The chatbot strikes back

James Kennedy; Joachim de Greeff; Robin Read; Paul Baxter; Tony Belpaeme



Human-Robot Interaction | 2013

Multimodal child-robot interaction: building social bonds

Tony Belpaeme; Paul Baxter; Robin Read; Rachel Wood; Heriberto Cuayáhuitl; Bernd Kiefer; Stefania Racioppa; Ivana Kruijff-Korbayová; Georgios Athanasopoulos; Valentin Enescu; Rosemarijn Looije; Mark A. Neerincx; Yiannis Demiris; Raquel Ros-Espinoza; Aryel Beck; Lola Cañamero; Antoine Hiolle; Matthew Lewis; Ilaria Baroni; Marco Nalin; Piero Cosi; Giulio Paci; Fabio Tesser; Giacomo Sommavilla; Rémi Humbert

Collaboration


Robin Read's top co-authors.

Top Co-Authors

Tony Belpaeme — University of Plymouth

Paul Baxter — University of Plymouth

James Kennedy — University of Plymouth

Joachim de Greeff — Delft University of Technology

Mark A. Neerincx — Delft University of Technology

Ilaria Baroni — Vita-Salute San Raffaele University

Emily J Ashurst — University of Plymouth