Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where AJung Moon is active.

Publication


Featured research published by AJung Moon.


human-robot interaction | 2014

Meet me where I'm gazing: how shared attention gaze affects human-robot handover timing

AJung Moon; Daniel Troniak; Brian T. Gleeson; Matthew K. X. J. Pan; Minhua Zheng; Benjamin A. Blumer; Karon E. MacLean; Elizabeth A. Croft

In this paper we provide empirical evidence that using humanlike gaze cues during human-robot handovers can improve the timing and perceived quality of the handover event. Handovers serve as the foundation of many human-robot tasks. Fluent, legible handover interactions require appropriate nonverbal cues to signal handover intent, location and timing. Inspired by observations of human-human handovers, we implemented gaze behaviors on a PR2 humanoid robot. The robot handed over water bottles to a total of 102 naïve subjects while varying its gaze behavior: no gaze, gaze designed to elicit shared attention at the handover location, and the shared attention gaze complemented with a turn-taking cue. We compared subject perception of and reaction time to the robot-initiated handovers across the three gaze conditions. Results indicate that subjects reach for the offered object significantly earlier when a robot provides a shared attention gaze cue during a handover. We also observed a statistical trend of subjects preferring handovers with turn-taking gaze cues over the other conditions. Our work demonstrates that gaze can play a key role in improving user experience of human-robot handovers, and help make handovers fast and fluent.
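
The three gaze conditions lend themselves to a compact behavior specification. The sketch below is a minimal, hypothetical illustration of those conditions, not the authors' PR2 implementation; the GazeCondition names, target points, and return-value convention are all assumptions made for the example.

```python
from enum import Enum, auto

class GazeCondition(Enum):
    NO_GAZE = auto()           # robot keeps a neutral head pose throughout
    SHARED_ATTENTION = auto()  # robot gazes at the handover location
    TURN_TAKING = auto()       # shared attention, then a gaze up at the receiver's face

def gaze_targets(condition, handover_location, receiver_face):
    """Return the ordered gaze targets for one handover under a condition.

    The two arguments are 3D points in the robot's frame; in a real
    controller each fixation would be synchronized with the reach motion.
    """
    if condition is GazeCondition.NO_GAZE:
        return []                              # no gaze cue at all
    if condition is GazeCondition.SHARED_ATTENTION:
        return [handover_location]             # elicit shared attention
    return [handover_location, receiver_face]  # add the turn-taking cue

# The turn-taking condition fixates the handover point, then the face.
print(gaze_targets(GazeCondition.TURN_TAKING, (0.6, 0.0, 1.1), (0.9, 0.0, 1.6)))
```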


International Journal of Social Robotics | 2012

Survey-Based Discussions on Morally Contentious Applications of Interactive Robotics

AJung Moon; Peter Danielson; H. F. Machiel Van der Loos

Introduction: As applications of robotics extend to areas that directly impact human life, such as the military and eldercare, the deployment of autonomous and semi-autonomous robots increasingly requires the input of stakeholder opinions. Up to now, technological deployment has relied on the guidance of government/military policy and the healthcare system without specific incorporation of professional and lay opinion. Methods: This paper presents results from a roboethics study that uses the unique N-Reasons scenario-based survey instrument. The instrument collected Yes, No, and Neutral responses from more than 250 expert and lay respondents via the Internet, along with their ethics-content reasons for the answers, allowing respondents to agree with previously provided reasons or to write their own. Data from three questions relating to military and eldercare robots are analyzed qualitatively and quantitatively. Results: The survey reveals that respondents weigh the appropriateness of deploying robotics technology in concert with the level of autonomy conferred upon the robot. The accepted level of robot autonomy does not appear to depend solely on the perceived efficiency and effectiveness of the technology, but is subject to the robot's relationship with the public's principle-based reasons and the application field in focus. Conclusion: The N-Reasons instrument was effective in eliciting ethical commentary in a simple, online survey format and provides insights into the interactions between the issues that respondents consider across application and technology boundaries.


intelligent robots and systems | 2011

Did you see it hesitate? - Empirically grounded design of hesitation trajectories for collaborative robots

AJung Moon; Chris A. C. Parker; Elizabeth A. Croft; H. F. Machiel Van der Loos

Unwanted conflicts are inevitable between collaborating agents that share spaces and resources. Motivated by humans' use of nonverbal communication as a conflict resolution mechanism, this study investigates the communicative capabilities reflected in the trajectory characteristics of hesitation gestures during human-robot collaboration. Hesitation gestures and non-hesitation human arm motions were recorded from a series of reach-and-retract tasks and embodied on a 6-DOF robot arm. A total of 86 survey respondents watched and scored recordings of these motions according to whether they recognized hesitation gestures as exhibited by both the human and the robot. Using the survey's statistical evidence that hesitation trajectories embodied in an articulated robot arm can be recognized by human observers, we identified trajectory characteristics of hesitation gestures. The contribution of our work is an empirically grounded robot trajectory specification that provides communicative cues for conflict resolution during collaborative reaching scenarios.


human-robot interaction | 2013

Design and impact of hesitation gestures during human-robot resource conflicts

AJung Moon; Chris A. C. Parker; Elizabeth A. Croft; H. F. Machiel Van der Loos

In collaborative tasks, people often communicate using nonverbal gestures to coordinate actions. When two people reach for the same object at the same time, they often respond to an imminent potential collision with jerky halting hand motions that we term hesitation gestures. Successfully implementing such communicative conflict-response behaviour on robots can be useful: in the myriad of human-robot interaction contexts involving shared spaces and objects, this behaviour can provide a fast and effective means for robots to express awareness of conflict and cede right-of-way during collaborative work with users. Our previous work suggests that when a six-degree-of-freedom (6-DOF) robot traces a simplified trajectory of recorded human hesitation gestures, these robot motions are also perceived by humans as hesitation gestures. In this work, we present a characteristic motion profile derived from the recorded human hesitation motions, called the Acceleration-based Hesitation Profile (AHP). We test its efficacy to generate communicative hesitation responses by a robot in a fast-paced human-robot interaction experiment. Compared to traditional abrupt stopping behaviours, we did not find sufficient evidence that the AHP-based robot responses improve human perception of the robot or human-robot task completion time. However, results from our in situ experiment suggest that subjects can recognize AHP-based robot responses as hesitations and distinguish them from abrupt stopping behaviours.
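
The abstract does not give the functional form of the AHP, so the following is only a rough sketch of the underlying idea: replace an abrupt, constant-deceleration stop with a response that halts, briefly retracts, and settles at rest, which is what reads as hesitation. The piecewise profile, its constants, and the initial speed below are invented for illustration and are not the published AHP.

```python
def abrupt_stop_accel(t, a_max=4.0):
    """Constant maximum deceleration: the industrial-style collision response."""
    return -a_max

def hesitation_accel(t, a_max=4.0):
    """Hypothetical hesitation-shaped response (NOT the published AHP):
    decelerate hard, briefly retract, arrest the retraction, then rest."""
    if t < 0.15:
        return -a_max        # halt the reach
    if t < 0.30:
        return -0.5 * a_max  # brief retraction; velocity goes negative
    if t < 0.45:
        return +0.5 * a_max  # arrest the retraction
    return 0.0               # hold at rest

def simulate(accel_fn, v0=0.6, dt=0.01, horizon=0.6):
    """Forward-integrate hand speed under an acceleration profile."""
    v, trace = v0, []
    for i in range(int(horizon / dt)):
        v += accel_fn(i * dt) * dt
        trace.append(round(v, 3))
    return trace

# The hesitation profile dips below zero (a visible retraction) and ends at rest.
print(min(simulate(hesitation_accel)), simulate(hesitation_accel)[-1])
```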


human factors in computing systems | 2011

Now where was I?: physiologically-triggered bookmarking

Matthew K. X. J. Pan; Jih-Shiang Chang; Gokhan H. Himmetoglu; AJung Moon; Thomas W. Hazelton; Karon E. MacLean; Elizabeth A. Croft

This work explores a novel interaction paradigm driven by implicit, low-attention user control, accomplished by monitoring a user's physiological state. We have designed and prototyped this interaction for a first use case of bookmarking an audio stream, to holistically explore the implicit interaction concept. Here, a user's galvanic skin response (GSR) is monitored for orienting responses (ORs) to external interruptions; our prototype automatically bookmarks the media such that the user can attend to the interruption, then resume listening from the point at which he/she was interrupted. To test this approach's viability, we addressed questions such as: does GSR exhibit a detectable response to interruptions, and how should the interaction utilize this information? In evaluating this system in a controlled environment, we found an OR detection accuracy of 84%; users provided subjective feedback on its accuracy and utility.
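
The abstract reports the detection accuracy but not the detector itself, so the sketch below only illustrates the general idea under stated assumptions: watch the skin-conductance stream for a phasic rise shortly after a known interruption onset and, if one occurs, bookmark the current playback position. The sampling rate, window length, and rise threshold are invented placeholders.

```python
def detect_orienting_response(gsr, onset_idx, fs=32, window_s=4.0, rise_uS=0.05):
    """Return True if the skin-conductance trace rises by at least `rise_uS`
    microsiemens within `window_s` seconds after the interruption at sample
    `onset_idx`.  `gsr` is a list of samples at `fs` Hz; all constants here
    are illustrative, not from the paper.
    """
    baseline = gsr[onset_idx]
    window = gsr[onset_idx : onset_idx + int(window_s * fs)]
    return any(sample - baseline >= rise_uS for sample in window)

def maybe_bookmark(playback_position_s, gsr, onset_idx):
    """Bookmark the audio at the interruption moment if an OR is detected."""
    if detect_orienting_response(gsr, onset_idx):
        return playback_position_s  # resume point after the interruption
    return None
```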


intelligent robots and systems | 2015

Exploring the effect of robot hand configurations in directional gestures for human-robot interaction

Sara Sheikholeslami; AJung Moon; Elizabeth A. Croft

In this work we explore the effectiveness of a three-fingered robotic gripper in accurately expressing directional instructions (move up, down, left, right) as gestures emulating human hand gestures. Such gestures can be necessary in noisy manufacturing environments where verbal communication is ineffective. Three studies are conducted. In Study 1 we explore the hand configurations that human dyads use for nonverbal instruction (n = 17). In Study 2 we examine which hand configurations from Study 1 are most accurately understood by observers (n = 140). In Study 3 we compare recognition of a robot arm performing motions similar to those of the human participants with either an unposed or a posed three-fingered robotic gripper (n = 100), to observe the importance of the hand's pose. Recognition rates of directional gestures for both the human and the robot are examined. Results indicate that most gestures are better and more confidently recognized when displayed with the posed robot hand.
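
Conceptually, the posed-hand condition pairs each directional instruction with both an arm motion and a held hand configuration, while the unposed condition keeps a default gripper pose. A minimal, hypothetical encoding of that pairing follows; the motion and pose names are placeholders, not the configurations identified in the studies.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Gesture:
    arm_motion: str  # direction of the arm sweep
    hand_pose: str   # finger configuration held during the motion

# Hypothetical mapping: the actual configurations were derived from the
# human gestures observed in Study 1 and are not given in the abstract.
DIRECTIONAL_GESTURES = {
    "up":    Gesture("sweep_up",    "flat_palm_up"),
    "down":  Gesture("sweep_down",  "flat_palm_down"),
    "left":  Gesture("sweep_left",  "point_left"),
    "right": Gesture("sweep_right", "point_right"),
}

def command(direction: str, posed: bool) -> Gesture:
    """In the unposed condition only the arm motion carries the message."""
    g = DIRECTIONAL_GESTURES[direction]
    return g if posed else Gesture(g.arm_motion, "unposed")

print(command("up", posed=True), command("up", posed=False))
```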


human factors in computing systems | 2011

Galvanic skin response-derived bookmarking of an audio stream

Matthew K. X. J. Pan; Gordon Jih-Shiang Chang; Gokhan H. Himmetoglu; AJung Moon; Thomas W. Hazelton; Karon E. MacLean; Elizabeth A. Croft

We demonstrate a novel interaction paradigm driven by implicit, low-attention user control, accomplished by monitoring a user's physiological state. We have designed and prototyped this interaction for a first use case of bookmarking an audio stream, to holistically explore the implicit interaction concept. A listener's galvanic skin response (GSR) is monitored for orienting responses (ORs) to external interruptions; our research prototype then automatically bookmarks the media such that the user can attend to the interruption, then resume listening from the point at which he/she was interrupted.


International Journal of Social Robotics | 2018

Impacts of Visual Occlusion and Its Resolution in Robot-Mediated Social Collaborations

Sina Radmard; AJung Moon; Elizabeth A. Croft

In this work, we contribute to the current understanding of human behaviors in telepresence when visual occlusions are introduced in a remote collaboration context. Occlusions can occur when users in remote locations are engaged in physical collaborative tasks, and can lead to frustration and inefficient collaboration. We aim to design a better user interface to improve the remote collaboration experience. We conducted two human-subjects experiments to investigate the following interlinked research questions: (a) what are the impacts of occlusion on remote collaborations, and (b) can autonomous handling of occlusions improve the telepresence collaboration experience for remote users? Results from our preliminary experiment demonstrate that occlusions introduce a significant social interference that requires collaborators to reorient or reposition themselves. Subsequently, we conducted a main experiment to evaluate the efficacy of autonomous occlusion handling for remote users. Results from this experiment indicate that the use of an autonomous controller yields a remote user experience that is more comparable (in terms of vocal non-verbal behavior ["The vocal non-verbal behaviour includes all spoken cues that surround the verbal message and influence its actual meaning." (Vinciarelli in Image Vis Comput 27(12):1743–1759, 2009)], task performance, and perceived workload) to collaborations performed by two co-located parties. Finally, we discuss the implications of a better controller design for similar robot-mediated social interactions.
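
The abstract does not describe the controller internals; purely as an illustration of what autonomous occlusion handling could look like, the sketch below scores candidate viewpoints by their expected visibility of the task region and repositions only when occlusion exceeds a threshold. Every name and number here is an assumption.

```python
from dataclasses import dataclass

@dataclass
class Viewpoint:
    name: str
    predicted_visibility: float  # expected visible fraction of the task region

def occlusion_ratio(visible_px: int, task_region_px: int) -> float:
    """Fraction of the task region currently hidden from the remote user."""
    return 1.0 - visible_px / task_region_px

def choose_viewpoint(visible_px, task_region_px, candidates, threshold=0.3):
    """Reposition only when occlusion is severe; otherwise leave the camera
    alone so the remote user's view stays stable."""
    if occlusion_ratio(visible_px, task_region_px) <= threshold:
        return None  # view is good enough; no autonomous motion
    return max(candidates, key=lambda v: v.predicted_visibility)

# 60% of the task region is occluded, so the controller picks a new viewpoint.
candidates = [Viewpoint("left_of_occluder", 0.7), Viewpoint("right_of_occluder", 0.9)]
print(choose_viewpoint(visible_px=40, task_region_px=100, candidates=candidates))
```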


The International Journal of Robotics Research | 2017

Cooperative gestures for industry

Sara Sheikholeslami; AJung Moon; Elizabeth A. Croft

Fast and reliable communication between human worker(s) and robotic assistants is essential for successful collaboration between the agents. This is especially true for typically noisy manufacturing environments that render verbal communication less effective. In this work, we investigate the efficacy of the nonverbal communication capabilities of robotic manipulators that have poseable, three-fingered end-effectors (hands). We explore the extent to which different poses of a typical robotic gripper can effectively communicate instructional messages during human–robot collaboration. Within the context of a collaborative car door assembly task, we conducted a series of three studies. We first observed the types of hand configurations that humans use to nonverbally instruct another person (Study 1, N = 17); based on these observations, we examined how well human gestures with the frequently used hand configurations are understood by recipients of the message (Study 2, N = 140). Finally, we implemented the best-recognized human hand configurations on a seven-degree-of-freedom robotic manipulator to investigate the efficacy of human-inspired hand poses on a robotic hand compared to an unposed hand (Study 3, N = 100). Contributions of this work include a set of hand configurations humans commonly use to instruct another person in a collaborative assembly scenario, as well as recognition rate and recognition confidence measures for the gestures that humans and robots express using different hand configurations. Results indicate that most gestures are better recognized, with a higher level of confidence, when displayed with a posed robot hand.


Archive | 2012

What should a robot do?: design and implementation of human-like hesitation gestures as a response mechanism for human-robot resource conflicts

AJung Moon

Resource conflict arises when people share spaces and objects with each other. People easily resolve such conflicts using verbal/nonverbal communication. With the advent of robots entering homes and offices, this thesis builds a framework to develop a natural means of managing shared resources in human-robot collaboration contexts. In this thesis, hesitation gestures are developed as a communicative mechanism for robots to respond to human-robot resource conflicts. In the first of the three studies presented in this thesis (Study I), a pilot experiment and six online surveys provided empirical demonstrations that humans perceive hesitation from robot trajectories mimicking human hesitation motions. Using the set of human motions recorded in Study I, a characteristic acceleration profile of hesitation gestures was extracted and distilled into a trajectory design specification representing hesitation, namely the Acceleration-based Hesitation Profile (AHP). In Study II, the efficacy of the AHP was tested and validated. In Study III, the impact of AHP-based robot motions was investigated in a Human-Robot Shared-Task (HRST) experiment. The results from these studies indicate that AHP-based robot responses are perceived by human observers to convey hesitation, both in observational and in situ contexts. The results also demonstrate that AHP-based responses, when compared with the abrupt collision avoidance responses typical of industrial robots, do not significantly improve or hinder human perception of the robot or human-robot team performance. The main contribution of this work is an empirically validated trajectory design that can be used to convey a robot's state of hesitation in real time to human observers, while achieving the same collision avoidance function as a traditional abrupt stopping response.

Collaboration


Dive into AJung Moon's collaborations.

Top Co-Authors

Elizabeth A. Croft, University of British Columbia
Karon E. MacLean, University of British Columbia
Matthew K. X. J. Pan, University of British Columbia
Chris A. C. Parker, University of British Columbia
Gokhan H. Himmetoglu, University of British Columbia
Sara Sheikholeslami, University of British Columbia
Thomas W. Hazelton, University of British Columbia
Benjamin A. Blumer, University of British Columbia
Brian T. Gleeson, University of British Columbia