Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Jeonghye Han is active.

Publication


Featured research published by Jeonghye Han.


human-robot interaction | 2009

r-Learning services for elementary school students with a teaching assistant robot

Jeonghye Han; Dong-Ho Kim

The r-Learning paradigm with educational robots is emerging as a part of e-Learning, which means using technology for learning. This study on using a robot as a teaching assistant opened up the possibility of r-Learning for English in the classroom. We found that children like robot services that build personal relationships in class, while teachers prefer services that make lessons more convenient to manage. Robot services such as praising, cheering up, or calling the roll are effective ways of motivating children to learn and of enhancing the relationship between TIRO and the children. We are conducting further field trials on new scenarios and services that motivate children and help them concentrate in class, together with teachers, pre-service teachers, children, parents, robotics researchers, social scientists, and others.


robot and human interactive communication | 2005

Evolutionary role model and basic emotions of service robots originated from computers

Jeonghye Han; Jaeyeon Lee; Young-Jo Cho

This paper proposes an evolutionary role model for service robots with LCD touch panels, such as home robots, and suggests the need to reestablish robot emotions on the basis of the new role model. The typical HRI-based peer role model is appropriate for the android, the future intelligent robot, but it fails to take into account the state of contemporary robotics. In this study, a role model was proposed that allows for evolutionary aspects in step with the level of robot technology. The feasibility of the proposed evolutionary role model was demonstrated through experiments in which parents and children interacted with a home robot offering various functions for family members. The results showed that the expected roles of home robots are user, secretary, and peer. Most parents thought of them as human-like machines, while children regarded them as peers. Home robots' facial expressions depend on their roles, so we investigated the facial expression set of home robots at work and compared it with Ekman's. We also hypothesized that if the facial expression set of a home robot at work differs from the expected one, it has a significant impact on evaluation. The results suggested the need for a new model of facial expressions for home robots based on the evolutionary role model.


International Journal of Pedagogies and Learning | 2006

The Future of Robot-Assisted Learning in the Home

Vicki Jones; Jun Hyung Jo; Jeonghye Han

Abstract Imagine a home system where children can have an educational assistant with them at all times – a helper to ensure that they understand and are understood. The concept of robots interacting with humans is not new and was predicted in many movies and novels and on television long before the technology was available. With the robot revolution upon us, small-scale household robots are becoming more accepted and widespread. The majority of current household robot applications take the role of service robots in the home, undertaking menial tasks. However, their use in education has great potential. With human–robot interface (HRI) technology, this educational scenario is not only possible but also probable. In the future, household robots will provide the physical interface and mobility for these home-based e-learning systems. It is also envisaged that ubiquitous robots, which consist of embedded, mobile and software robots, will become essential in home network systems. In this paper we anticipate that the software robot, a type of virtual robot, will become the core of many robot-based e-learning systems which will be integrated with household robots. These e-learning software robots can traverse time and space, assist the child at any time and in any place, and connect to any device through a network. In this paper, we discuss the use of home robots, HRI and software robot-assisted learning, which together constitute an e-learning system for young children within the home environment.


human-robot interaction | 2010

A trial English class with a teaching assistant robot in elementary school

Jeonghye Han; Seungmin Lee; Bokhyun Kang; Sungju Park; Jungkwan Kim; Myungsook Kim; Mihee Kim

Various studies propose that robots can be an effective tool for language teaching and learning; in particular, they have been remarkably successful in elementary English classes [1][2][3][4]. The purpose of this study was to investigate the effects of a teaching assistant robot, Langbot, in elementary English classes in Korea. We adopted IROBIQ as Langbot for a pilot study. We designed activities for elementary English classes using Langbot: introduction, look and listen, listen and say, look and say, act out, and song and chant. The introduction includes the birth story of Langbot, since children want to know where the robot comes from, how old it is, why it came to their classroom, and so on, and since Hur and Han (2009) found that robot storytelling increased children's tolerance toward a robot's recognition failures [2].


Archive | 2007

What People Assume about Robots: Cross-Cultural Analysis between Japan, Korea, and the USA

Tatsuya Nomura; Tomohiro Suzuki; Takayuki Kanda; Jeonghye Han; Namin Shin; Jennifer L. Burke; Kensuke Kato

Tatsuya Nomura (Ryukoku University and ATR Intelligent Robotics and Communication Laboratories, Japan); Tomohiro Suzuki (JSPS Research Fellow, Toyo University, Japan); Takayuki Kanda (ATR Intelligent Robotics and Communication Laboratories, Japan); Jeonghye Han (Cheongju National University of Education, Korea); Namin Shin (Dongguk University, Korea); Jennifer L. Burke (University of South Florida, USA); Kensuke Kato (Kyushu University of Health and Welfare, Japan)


Journal of Information Processing Systems | 2006

Metaphor and Typeface Based on Children's Sensibilities for e-Learning

Miheon Jo; Jeonghye Han

Abstract: Children exhibit different behaviors, skills, and motivations. The main aim of this research was to investigate children's sensibility factors for icons, and to find the best typeface for application to Web-Based Instruction (WBI) for e-Learning. Three types of icons were used to assess children's sensibilities toward metaphors: text-image, representational, and spatial mapping. Through factor analysis, we found that children exhibited more diverse reactions to the text-image and representational icons than to the spatial-mapping icons. Children commonly showed higher sensibility to the aesthetic factor than to the familiarity or brevity factors. In addition, we propose a collaborative-typeface system, which recommends the best typeface for children with regard to readability and aesthetics in WBI. Based on these results, we offer some suggestions on icon design and typeface selection for e-Learning.

Keywords: e-Learning, Sensibility Factor, Metaphor, Typeface, Collaborative Recommending


Cluster Computing | 2016

Teachers' views on the use of robots and cloud services in education for sustainable development

Ill-Woo Park; Jeonghye Han

Various studies have shown the educational use of robots to be effective in science and mathematics education. However, such studies have not considered the psychological factors affecting users of the new technology, only external factors, such as the range of affordable robotic platforms and ready-for-lesson materials for a robot-assisted learning environment. It is necessary to extend the use of robots and cloud platforms to support education for sustainable development. To that end, this study first assessed the possibility of using robots in education for sustainable development by providing them to children from low-income families, since they often show abnormal behaviors and have few opportunities to access robots in education. The long-term changes in their behavior resulting from this outreach program were examined. Qualitative as well as quantitative methods were used to evaluate and discuss the changes in self-efficacy and learning attitudes of students during the year. Second, we proposed a technology acceptance model, termed RSAM, for teachers in robot-assisted learning environments with a cloud service platform. Acceptance factors were estimated using a weighted average method based on teacher focus group interviews. The challenges associated with robot-assisted learning considering cloud services are discussed.


human-robot interaction | 2014

Is a robot better than video for initiating remote social connections among children?

Nuri Kim; Jeonghye Han; Wendy Ju

To investigate how children interact differently when interactions are mediated by screen-based video communication versus robot-mediated communication, we conducted a study with elementary students in Korea, comparing the use of both technologies to introduce classroom students to peer-aged individuals in America. Our findings show that the classroom children displayed more positive emotion during certain tasks and exhibited more interest in the remote participants with robot-mediated communication than with video-mediated communication.


human-robot interaction | 2018

Social Proxemics of Human-Drone Interaction: Flying Altitude and Size

Jeonghye Han; Ilhan Bae

To what extent do humans comfortably approach hovering drones? In human-robot interaction, social proxemics is relatively well understood; Han and Bae showed that students usually stand as far from a tele-robot teacher as its height [1]. As commercial drone markets grow, social proximity in human-drone interaction is becoming an important issue, but research on measuring it is still at an early stage. Jane showed that Chinese participants approach a flying drone more closely than American participants do [2]. Abtahi experimented with an unsafe and a safe-to-touch drone to check whether participants instinctively use touch when interacting with the safe-to-touch drone [3]. We conducted, to our knowledge, the first study of how people respond when asked to approach hovering drones that differ in size and flying altitude, under conditions in which safety was sufficiently ensured. Two drones, one small and one large, were prepared. Each drone flew at 1.6 m (eye level) or 2.6 m (overhead). A total of 32 participants, with an average age of 22.64, were individually asked to stand 11.5 feet away from the hovering drone in 2x2 conditions: two sizes and two flying altitudes. Only one participant had experience operating drones. A transparent safety panel was installed between the hovering drone and the participant, and each participant was allowed to move forward from the standing point, 6.5 feet away from the safety panel. A remote operator, who controlled the hovering drone, held a short conversation with the participant behind the safety panel via a loudspeaker system connected to a cellular phone at the experiment site. After the participant recognized the drone as the extension of the remote operator, the participant was asked to move forward to hear the operator better. The results showed that participants approached closer when interacting with eye-level drones than with overhead drones.
Flight altitude matters in the social proximity of human-drone interaction at a significance level of α = 0.2. Females moved closer to the large, eye-level drone. Thirty-one participants entered social space to interact with the drones; only one moved less than two feet, remaining in public space. Gender and drone size did not make significant differences in the social proximity of human-drone interaction. This experiment has an evident limitation in that participants' proxemics were measured from behind an acrylic panel, which had to be installed for safety in any human-drone proximity experiment. Nonetheless, the results imply that most South Korean participants may be ready to comfortably enter social space to interact with drones, and hovering drones at eye-level altitude seem to promote this attitude.


human-robot interaction | 2016

Children's Perceptions of and Interactions with a Telepresence Robot

Kyoung Wan Cathy Shin; Jeonghye Han

The primary concern of this study is to explore children's engagement and learning experiences in distance learning environments using three different videoconferencing technologies: two traditional screen-based videoconferencing technologies and a remotely controlled telepresence robot. To faithfully capture the children's perspectives, we employed multiple data collection methods (rating scales, narratives, ranking scales, and one-on-one interviews), as well as observation of nonverbal cues. Our findings suggest that participants reacted more positively to interactions via the telepresence robot than to screen-based videoconferencing.

Collaboration


Dive into Jeonghye Han's collaboration.

Top Co-Authors

Eunja Hyun, Sungkyunkwan University
Miheon Jo, Cheongju National University of Education
Namgyu Kim, Pohang University of Science and Technology
Jennifer L. Burke, University of South Florida
Kensuke Kato, Kyushu University of Health and Welfare