Aidan Jones
University of Birmingham
Publication
Featured research published by Aidan Jones.
international conference on social robotics | 2015
Aidan Jones; Dennis Küster; Christina Anne Basedow; Patrícia Alves-Oliveira; Sofia Serholt; Helen Hastie; Lee J. Corrigan; Wolmet Barendregt; Arvid Kappas; Ana Paiva; Ginevra Castellano
Within any learning process, the formation of a socio-emotional relationship between learner and teacher is paramount to facilitating a good learning experience. The ability to form this relationship may come naturally to an attentive human teacher, but how do we endow an unemotional robot with it? In this paper, we build on insights from the literature and combine them with tools from user-centered design (UCD) and analyses of human-human interaction (HHI) as the basis of a multidisciplinary approach to developing an empathic robotic tutor. We discuss the lessons learned with respect to design principles for personalised learning with empathic robotic tutors.
intelligent virtual agents | 2014
Tiago Ribeiro; Eugenio Di Tullio; Lee J. Corrigan; Aidan Jones; Fotios Papadopoulos; Ruth Aylett; Ginevra Castellano; Ana Paiva
Developing interactive scenarios featuring embodied characters that interact with users through various types of media presents a challenge. Among the problems developers face are collaborating while developing remotely, integrating independently developed components, and developing a system incrementally in such a way that each component can be used from the moment it is incorporated, throughout the intermediate phases of development, and on to the final system. We describe how the Thalamus framework addresses these issues, and how it is being used on a large project that develops this type of scenario. A case study is presented, illustrating the actual development of one such scenario, which was then used for a Wizard-of-Oz study.
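The abstract does not describe Thalamus's actual API. Purely as an illustration of the integration style it refers to (independently developed components joining at any time and exchanging messages without direct references to one another), a minimal publish/subscribe hub might look like the following sketch; all names here are hypothetical and are not Thalamus code:

```python
from collections import defaultdict

class MessageHub:
    """Hypothetical sketch of a publish/subscribe hub: components
    subscribe to topics and publish messages without knowing about
    each other, so they can be developed and attached independently."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callback for every message on a topic."""
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver a message to all current subscribers of a topic."""
        for handler in self.subscribers[topic]:
            handler(message)

# Two stand-in components: a robot body subscribing to speech commands,
# and a Wizard-of-Oz console publishing an utterance for it to say.
hub = MessageHub()
log = []
hub.subscribe("robot/speech", lambda msg: log.append(f"robot says: {msg}"))
hub.publish("robot/speech", "Hello, class!")
print(log[0])
```

A new component (say, a logging service) could later subscribe to the same topic without any change to the publisher, which is the incremental-development property the abstract emphasises.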
international conference on social robotics | 2014
Aidan Jones; Ginevra Castellano; Susan Bull
In this paper we investigate the effect of different embodiments on the perception of skill-based feedback (a basic open learner model) from a robotic tutor. We describe a study with fifty-one 11-13 year old learners. Each learner carries out a geography-based activity on a touch table. A real-time model of the learner’s skill levels is built from the learner’s interactions with the activity. We explore three conditions in which the contents of this learner model are fed back to the learner with different levels of embodiment: (1) full embodiment, where skill levels are presented and explained solely by a robot; (2) mixed embodiment, where skill levels are presented on a screen and explained by a robot; and (3) no embodiment, where skill levels and explanations are presented on a screen with no robot. The findings suggest that embodiment can increase enjoyment, understanding, and trust in explanations of an open learner model.
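The abstract does not specify how the real-time skill model is computed. As an illustrative sketch only, a simple open learner model might hold a per-skill estimate in [0, 1] and update it after each scored interaction with an exponentially weighted moving average; every name and threshold below is a hypothetical choice, not the paper's method:

```python
class OpenLearnerModel:
    """Hypothetical sketch of an open learner model: one estimate
    per skill, nudged toward 1 on correct answers and 0 on errors."""

    def __init__(self, skills, alpha=0.3):
        self.alpha = alpha                         # weight of the newest observation
        self.estimates = {s: 0.5 for s in skills}  # neutral starting estimate

    def record(self, skill, correct):
        """Update one skill estimate from an observed answer (True/False)."""
        obs = 1.0 if correct else 0.0
        old = self.estimates[skill]
        self.estimates[skill] = (1 - self.alpha) * old + self.alpha * obs

    def feedback(self, skill):
        """Map the estimate to a coarse level a tutor could present."""
        level = self.estimates[skill]
        if level < 0.4:
            return "developing"
        if level < 0.7:
            return "competent"
        return "strong"

# Four answers on one skill, then the level the tutor would report.
olm = OpenLearnerModel(["map symbols", "grid references"])
for answer in [True, True, False, True]:
    olm.record("map symbols", answer)
print(olm.feedback("map symbols"))
```

Opening the `estimates` dictionary (or the coarse levels derived from it) to the learner is what makes the model "open": the same numbers drive both the tutor's feedback and the learner's own view of their progress.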
Ai & Society | 2017
Sofia Serholt; Wolmet Barendregt; Asimina Vasalou; Patrícia Alves-Oliveira; Aidan Jones; Sofia Petisca; Ana Paiva
Robots are increasingly being studied for use in education. It is expected that robots will have the potential to facilitate children’s learning and function autonomously within real classrooms in the near future. Previous research has raised the importance of designing acceptable robots for different practices. In parallel, scholars have raised ethical concerns surrounding children interacting with robots. Drawing on a Responsible Research and Innovation perspective, our goal is to move away from research concerned with designing features that will render robots more socially acceptable by end users toward a reflective dialogue whose goal is to consider the key ethical issues and long-term consequences of implementing classroom robots for teachers and children in primary education. This paper presents the results from several focus groups conducted with teachers in three European countries. Through a thematic analysis, we provide a theoretical account of teachers’ perspectives on classroom robots pertaining to privacy, the robot’s role, effects on children, and responsibility. Implications for the field of educational robotics are discussed.
human robot interaction | 2015
Amol Deshmukh; Aidan Jones; Srinivasan Chandrasekaran Janarthanam; Mary Ellen Foster; Tiago Ribeiro; Lee J. Corrigan; Ruth Aylett; Ana Paiva; Fotios Papadopoulos; Ginevra Castellano
In this demonstration we describe a scenario developed in the EMOTE project. The overall goal of the project is to develop an empathic robot tutor for 11-13 year old school students in an educational setting. In this scenario, the robot tutor teaches map-reading skills on a touch-screen device.
human robot interaction | 2015
Aidan Jones; Susan Bull; Ginevra Castellano
This paper describes research exploring how personalisation in a robot tutor, using an open learner model (OLM) based approach, impacts the effectiveness of children’s learning. An OLM is a visualisation of a learner’s inferred knowledge state. We address the feasibility of using social robotics to present an OLM to a learner. Results to date indicate that a robotic tutor can increase trust in explanations of an OLM over text-based representations. We outline the remaining work to create and evaluate an autonomous robotic tutor that will use an OLM to scaffold reflection.
International Journal of Social Robotics | 2018
Aidan Jones; Ginevra Castellano
Robots are increasingly being used to provide motivating, engaging and personalised support to learners. These robotic tutors have been able to increase student learning gain by providing personalised hints or problem selection. However, they have never been used to assist children in developing self-regulated learning (SRL) skills. SRL skills allow a learner to more effectively self-assess and guide their own learning; learners who engage these skills have been shown to perform better academically. This paper explores how personalised tutoring by a robot, achieved using an open learner model (OLM), promotes SRL processes, and how this can impact learning and SRL skills compared to personalised domain support alone. An OLM allows the learner to view the model that the system holds about them. We present a longer-term study in which participants take part in a geography-based task on a touch screen with adaptive feedback provided by the robot. In addition to domain support, the robotic tutor uses an OLM to prompt the learner to monitor their developing skills, set goals, and use appropriate tools. Results show that, when a robotic tutor personalises and adaptively scaffolds SRL behaviour based upon an OLM, greater indication of SRL behaviour can be observed over the control condition, where the robotic tutor provides only domain support and no SRL scaffolding.
International Journal of Social Robotics | 2018
Aidan Jones; Susan Bull; Ginevra Castellano
Robots are increasingly being used to provide motivating, engaging and personalised support to learners. Robotic tutors have been able to increase student learning gain by providing personalised hints or problem selection. However, they have never been used to assist children in developing self-regulated learning (SRL) skills. SRL skills allow a learner to more effectively self-assess and guide their own learning; learners who engage these skills have been shown to perform better academically. This paper explores how personalised tutoring by a robot, achieved using an open learner model (OLM), promotes SRL processes, and how this can impact learning. It presents a study where a robotic tutor supports reflection and SRL processes with an OLM. An OLM allows the learner to view the model that the system holds about them. In this study, participants take part in a geography-based task on a touch screen with different levels of adaptive feedback provided by the robot. The robotic tutor uses an OLM to prompt the learner to monitor their developing skills, set goals, and use appropriate tools. Results show that, when a robotic tutor personalises and adaptively scaffolds SRL behaviour based upon an OLM, greater indication of SRL behaviour and increased learning gain can be observed over control conditions where the robotic tutor does not provide SRL scaffolding. We also find that pressure and tension in the activity increase, and perception of the robot is less favourable, in conditions where the robotic tutor makes the learner aware that there are issues but does not provide specific help to address them.
What Social Robots Can and Should Do: Proceedings of Robophilosophy 2016 / TRANSOR 2016 | 2016
Sofia Serholt; Wolmet Barendregt; Dennis Küster; Aidan Jones; Patrícia Alves-Oliveira; Ana Paiva
As robots are becoming increasingly common in society and education, it is expected that autonomous and socially adaptive classroom robots may eventually be given responsible roles in primary education. In this paper, we present the results of a questionnaire study carried out with students enrolled in compulsory education in three European countries. The study aimed to explore students’ normative perspectives on classroom robots pertaining to roles and responsibilities, student-robot relationships, and perceptive and emotional capabilities in robots. The results suggest that, although students are generally positive toward the existence of classroom robots, certain aspects are deemed more acceptable than others.
human robot interaction | 2015
Tiago Ribeiro; Patrícia Alves-Oliveira; Eugenio Di Tullio; Sofia Petisca; Pedro Sequeira; Amol Deshmukh; Srinivasan Chandrasekaran Janarthanam; Mary Ellen Foster; Aidan Jones; Lee J. Corrigan; Fotios Papadopoulos; Helen Hastie; Ruth Aylett; Ginevra Castellano; Ana Paiva
We present an autonomous empathic robotic tutor to be used in classrooms as a peer in a virtual learning environment. The system merges a virtual agent design with HRI features, consisting of a robotic embodiment, a multimedia interactive learning application and perception sensors that are controlled by an artificial intelligence agent.