Jaeryoung Lee
Chubu University
Publications
Featured research published by Jaeryoung Lee.
International Journal of Advanced Robotic Systems | 2012
Jaeryoung Lee; Hiroki Takehashi; Chikara Nagai; Goro Obinata; Dimitar Stefanov
This study explores the response of autistic children to selected design features of robots for autism therapy and suggests which robot features have a stronger influence on the therapeutic process. First, we investigate the effect of selected robot features on the development of social communication skills in autistic children. The results indicate that the toys' "face" and "moving limbs" usually draw the children's attention and improve the children's facial expression skills, but do not contribute to the development of other social communication skills. Secondly, we study the response of children with low-functioning autism to robots with verbal communication functionalities. Test results show that the children interacted with the verbal-featured robot more intensively than with the experimenter. We conclude that robots with faces and moving limbs can engage autistic children more effectively, and that the facial expressions of the robots can elicit a greater response than prompting by humans.
human-robot interaction | 2017
Ryo Suzuki; Jaeryoung Lee; Ognjen Rudovic
Children with Autism Spectrum Disorder (ASD) have a very short attention span, so autism therapy must remain entertaining to sustain longer interactions. Robots draw the children's attention, helping them focus on the therapy and learn social skills. In this paper, the robot NAO is used as part of a dance therapy for children with ASD. To explore its effectiveness, we compared three settings involving NAO, a therapist, and/or an unfamiliar person during the dance. Our results indicate that a robot can be an effective educational agent for children with ASD, in particular as part of dance therapy.
international symposium on micro-nanomechatronics and human science | 2016
Ryo Suzuki; Jaeryoung Lee
Playing with robots has become an effective therapy method for children with Autism Spectrum Disorders (ASD). In this study, we explore the prosocial behaviour of children with ASD through musical play with a humanoid robot, NAO. The children participated in two different play platforms, Touch Body and Dancing with Me. Game scores and close proximity were measured to assess their understanding and prosocial behaviours. By merging musical therapy into a social robot system, children with ASD showed more playful and positive results in the therapy and increased prosocial behaviours.
human-robot interaction | 2014
Jaeryoung Lee; Goro Obinata; Hirofumi Aoki
It is an advantage to use robots in autism therapy since they provide repetitive stimuli during the learning of social skills. A detailed exploration is required to design more effective assistive robots for autism therapy. This study investigated the effectiveness of an interactive robot's feedback during autism therapy for improving specific social skills in children. In the experiment described in this study, children with autism engaged with an interactive system to perform affective touch behaviour and received the robot's visual and auditory feedback. Results showed that the presence of an interactive robot's feedback triggered more effective touch behaviour in the children.
Paladyn: Journal of Behavioral Robotics | 2014
Jaeryoung Lee; Hiroki Takehashi; Chikara Nagai; Goro Obinata; Dimitar Stefanov
Abstract Previous studies in the field of robot-assisted therapy demonstrated that robots engage autistic children's attention effectively. Interactive robots therefore appear to be a promising approach for improving the social interaction and communication skills of autistic children. However, most existing interactive robots use a very small number of communication variables, which narrows their effectiveness to a few aspects of autistic children's social communication behaviour. In the present work, we explore the effects of touching and colours on the communication effectiveness between a robot and an autistic child, and their potential for further adjusting the robot to the child's behaviour. Firstly, we investigated the touching patterns of autistic and non-autistic children in three different situations and validated their responses by comparing touching forces. Results showed that the touching patterns of non-autistic children have a certain consistency, while the reaction patterns of autistic children vary from person to person. Secondly, we studied the effect of colour feedback in autism therapy with the robot. Results showed that participants achieved a better completion rate when colour feedback was provided. These results could support the design of more effective therapeutic robots for children with autism.
Paladyn: Journal of Behavioral Robotics | 2013
Jaeryoung Lee; Goro Obinata; Dimitar Stefanov; Chikara Nagai
Abstract Interactive robots are seen as an efficient tool for improving the social communication skills of autistic children. Recent studies show that the effectiveness of the human-robot interaction can be improved further if the robot provides positive feedback when the child demonstrates the expected behaviour or social skills. However, there is no clear answer as to which visual stimuli, or which combination of visual stimuli, attracts attention best. In this paper we present initial results from our study of the responses of participants with autism traits to four visual stimuli. We conducted a series of experiments in which the experimental system provided a visual response to the user's actions and monitored the user's performance for each visual stimulus. The experiments were organised as a game and included four groups of participants with different levels of autism. The results showed that colour tended to be the most effective stimulus for robot interaction with autistic people. These results could inform the design of more effective assistive robots for supporting people with autism.
international symposium on micro-nanomechatronics and human science | 2015
Min-Gyu Kim; Hye Won Lee; Jaeryoung Lee; Sonya S. Kwak; Younghwan Joo
This study presents an initial investigation of the effectiveness and service quality of a robot museum through visitors' experience. The survey was conducted in the RoboLife Museum in South Korea, targeting teenagers, who make up most of the museum's visitors. Impression, intimacy, and attitude toward robots were measured to analyze changes in the perception of robots through the visiting experience. The results showed that impression and attitude changed significantly after viewing the robot exhibitions, while intimacy showed no significant change. We consider that impression and attitude can be affected by a single occasional experience, but that intimacy requires actual interactions between visitors and robots, such as making short conversation or playing games together. Service quality, customer satisfaction, and loyalty were further investigated; the overall evaluation of the service provided by the RoboLife Museum was highly scored.
human-robot interaction | 2013
Jaeryoung Lee; Goro Obinata
Previous studies have reported that autistic children improve their social interaction and communication skills through interacting with robots. Most studies in the field of robot-assisted autism therapy, however, have focused on limited communication skills and used non-validated methods to measure the effectiveness of the therapy. Thus, in the present study, a therapeutic robot is proposed to help autistic children improve the adjustability of interpersonal touch as a communication skill. The aim of this study is to investigate effective ways of providing feedback in robot-based autism therapy using colours in three different conditions. As a result, the participants showed better interaction when they saw the colour directly.
arXiv: Robotics | 2018
Ognjen Rudovic; Jaeryoung Lee; Miles Dai; Björn W. Schuller; Rosalind W. Picard
Personalized machine learning enables robot perception of children’s affective states and engagement during robot-assisted autism therapy. Robots have the potential to facilitate future therapies for children on the autism spectrum. However, existing robots are limited in their ability to automatically perceive and respond to human affect, which is necessary for establishing and maintaining engaging interactions. Their inference challenge is made even harder by the fact that many individuals with autism have atypical and unusually diverse styles of expressing their affective-cognitive states. To tackle the heterogeneity in children with autism, we used the latest advances in deep learning to formulate a personalized machine learning (ML) framework for automatic perception of the children’s affective states and engagement during robot-assisted autism therapy. Instead of using the traditional one-size-fits-all ML approach, we personalized our framework to each child using their contextual information (demographics and behavioral assessment scores) and individual characteristics. We evaluated this framework on a multimodal (audio, video, and autonomic physiology) data set of 35 children (ages 3 to 13) with autism, from two cultures (Asia and Europe), and achieved an average agreement (intraclass correlation) of ~60% with human experts in the estimation of affect and engagement, also outperforming nonpersonalized ML solutions. These results demonstrate the feasibility of robot perception of affect and engagement in children with autism and have implications for the design of future autism therapies.
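The ~60% agreement reported above is an intraclass correlation (ICC) between the model's estimates and human experts' ratings. As a minimal illustration of the statistic only (the abstract does not specify which ICC variant was used; ICC(3,1), a common choice for a fixed set of raters, is assumed here), it can be computed from a targets-by-raters matrix:

```python
def icc_3_1(ratings):
    """Two-way mixed, single-measure ICC(3,1) for an n-targets x k-raters
    matrix (assumed variant; not necessarily the one used in the paper)."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    # Mean squares for rows (targets) and residual error
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    sse = sum((ratings[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse)

# Perfectly agreeing "raters" give ICC = 1.0
print(icc_3_1([[1, 1], [2, 2], [3, 3]]))  # -> 1.0
```

An ICC of roughly 0.6 would thus indicate moderate-to-good agreement between the personalized models and the expert annotators.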
Paladyn: Journal of Behavioral Robotics | 2018
Nicholas de Bastos Melo; Jaeryoung Lee
Abstract Interest in robots for elderly care has grown in recent years, and systems that integrate robot interaction components with user activity recognition are increasing as well. This work presents an activity-aware intelligent system that supports users in their daily tasks. The proposed system integrates three important aspects into a smart-house application: environment monitoring, user activity recognition, and user-friendly interaction. The information gathered from sensors across the environment is structured into a compact representation of the state of the environment called an activity frame. This frame is used by a predictor, based on the decision tree method, to recognize the activities the user has performed inside his/her domestic environment. The recognized activity is then used by a user-interaction component, which takes the predicted behaviour as a guideline for its interaction planner. The activity recognition system was tested on data provided by different smart home projects, and the proposed predictor achieved a higher recognition rate than similar ones. The architecture of the sensory network allows the system to be easily implemented in real time in a smart-house context.
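The pipeline described above (sensor readings compacted into an activity frame, fed to a decision-tree predictor) can be sketched as follows. The sensor features, labels, and data here are hypothetical stand-ins, not the paper's actual dataset, and scikit-learn's DecisionTreeClassifier is assumed as a generic implementation of the decision tree method:

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical activity frames: each row is a compact snapshot of binary
# sensor states (kitchen motion, stove power, bed pressure, TV power).
frames = [
    [1, 1, 0, 0],  # motion in kitchen + stove on
    [1, 1, 0, 0],
    [0, 0, 1, 0],  # pressure on bed
    [0, 0, 1, 0],
    [0, 0, 0, 1],  # TV drawing power
]
activities = ["cooking", "cooking", "sleeping", "sleeping", "watching_tv"]

# Train the decision-tree predictor on labelled activity frames
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(frames, activities)

# A new frame from the sensor network is classified into an activity,
# which the interaction planner could then use as its guideline.
print(clf.predict([[1, 1, 0, 0]])[0])  # -> cooking
```

In the paper's architecture, the predicted label would be handed to the user-interaction component rather than printed.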