Aaron B. St. Clair
University of Southern California
Publication
Featured research published by Aaron B. St. Clair.
robot and human interactive communication | 2010
Ross Mead; Eric Wade; Pierre Johnson; Aaron B. St. Clair; Shuya Chen; Maja J. Matarić
New approaches to rehabilitation and health care have developed due to advances in technology and human-robot interaction (HRI). Socially assistive robotics (SAR) is a subcategory of HRI that focuses on providing assistance through hands-off interactions. We have developed a SAR architecture that facilitates multiple task-oriented interactions between a user and a robot agent. The architecture accommodates a variety of inputs, tasks, and interaction modalities that are used to provide relevant, real-time feedback to the participant. We have implemented the architecture and validated its technological feasibility in a small pilot study in which a SAR agent led three post-stroke individuals through an exercise scenario. In the following, we present our architecture design and the results of the feasibility study.
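To make the idea of such a hands-off feedback loop concrete, a minimal sketch is given below. It is not the authors' implementation; the sensor, speech interface, and session logic (WristSensorInput, SpeechOutput, run_exercise_session) are hypothetical stand-ins for the kinds of inputs and interaction modalities the abstract describes.

# Minimal sketch of a hands-off assistive exercise loop (all names hypothetical).
import time

class WristSensorInput:
    """Stand-in for an exercise sensor; returns repetitions observed so far."""
    def read(self):
        return 0  # replace with real sensor data

class SpeechOutput:
    """Stand-in for the robot's speech modality."""
    def say(self, text):
        print("ROBOT SAYS:", text)

def run_exercise_session(sensor, speech, target_reps=10, duration_s=20.0):
    """Monitor task progress and issue real-time verbal feedback."""
    start = time.time()
    while time.time() - start < duration_s:
        reps = sensor.read()
        if reps >= target_reps:
            speech.say("Great job, you reached your goal!")
            break
        speech.say(f"You're at {reps} of {target_reps} repetitions. Keep going!")
        time.sleep(5.0)

if __name__ == "__main__":
    run_exercise_session(WristSensorInput(), SpeechOutput())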
robot and human interactive communication | 2011
Aaron B. St. Clair; Ross Mead; Maja J. Matarić
In many collocated human-robot interaction scenarios, robots are required to accurately and unambiguously indicate an object or point of interest in the environment. Realistic, cluttered environments containing many visually salient targets can present a challenge for the observer of such pointing behavior. In this paper, we describe an experiment and results detailing the effects of visual saliency and pointing modality on human perceptual accuracy of a robot's deictic gestures (head and arm pointing), and compare the results to the perception of human pointing.
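As a rough illustration of the geometry involved in interpreting a pointing gesture among cluttered targets, the sketch below resolves a pointing ray to the target with the smallest angular error, optionally weighted by saliency. This is a toy model under stated assumptions, not the experimental method, and the function names are hypothetical.

# Hypothetical sketch: resolving which target a pointing ray most likely indicates.
import numpy as np

def angular_error(origin, direction, target):
    """Angle (radians) between the pointing ray and the ray from origin to target."""
    d = direction / np.linalg.norm(direction)
    t = (target - origin) / np.linalg.norm(target - origin)
    return np.arccos(np.clip(np.dot(d, t), -1.0, 1.0))

def resolve_pointing(origin, direction, targets, saliency=None):
    """Pick the target with the smallest angular error, optionally weighted by saliency."""
    errors = np.array([angular_error(origin, direction, t) for t in targets])
    if saliency is not None:
        errors = errors / np.asarray(saliency)  # a more salient target tolerates more error
    return int(np.argmin(errors))

origin = np.array([0.0, 0.0, 1.2])       # e.g., position of the pointing effector
direction = np.array([1.0, 0.1, -0.3])   # pointing direction
targets = [np.array([2.0, 0.3, 0.5]), np.array([2.0, -0.8, 0.5])]
print("Indicated target index:", resolve_pointing(origin, direction, targets))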
human-robot interaction | 2015
Aaron B. St. Clair; Maja J. Matarić
We detail an approach to planning effective verbal feedback during pairwise human-robot task collaboration. The approach is motivated by social science literature as well as existing work in robotics and is applicable to a variety of task scenarios. It was evaluated on a dynamic, synthetic task implemented in an augmented reality environment, combining robot task control and speech production so that the robot can actively participate in the task and communicate with its teammate. A user study was conducted to experimentally validate the efficacy of the approach on a task in which a single user collaborates with an autonomous robot. The results demonstrate that the approach improves both objective measures of team performance and the user's subjective evaluation of the task and of the robot as a teammate.
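A minimal sketch of what state-driven feedback selection could look like is given below. The rules, thresholds, and the plan_feedback function are illustrative assumptions, not the planner described in the paper.

# Hypothetical sketch: choosing verbal feedback from the current task state.
def plan_feedback(robot_intent, predicted_human_action, task_progress):
    """Return a feedback utterance given the robot's intended action and a prediction
    of the human teammate's next action (names and thresholds are illustrative)."""
    if predicted_human_action == robot_intent:
        return f"I'll handle {robot_intent}, could you take something else?"
    if task_progress > 0.8:
        return "Nice work, we're almost done!"
    return f"I'm going to work on {robot_intent} next."

print(plan_feedback("the red zone", "the blue zone", 0.5))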
collaboration technologies and systems | 2013
Aaron B. St. Clair; Maja J. Matarić
This short summary paper describes a method for using a robot's embodied social communication capabilities to achieve and enhance coordination in human-robot task collaboration scenarios. The approach plans coordinating social behaviors using the formalism of roles, allowing the robot to produce and interpret communicative feedback that expresses a desired allocation of duties, and to issue positive or negative reinforcement to a person as the task progresses and in response to the collaborating partner's inferred future activity.
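The sketch below illustrates, with purely hypothetical role names and message templates, how a role-based allocation and simple verbal reinforcement might be expressed in code; it is not the paper's formalism.

# Hypothetical sketch: expressing a desired allocation of duties via roles.
ROLES = {
    "collector": {"duties": ["pick up items"], "assigned_to": None},
    "transporter": {"duties": ["deliver items"], "assigned_to": None},
}

def announce_allocation(roles, robot_role):
    """Generate utterances that claim a role for the robot and suggest the rest."""
    utterances = [f"I'll take the {robot_role} role."]
    for name in roles:
        if name != robot_role:
            utterances.append(f"Could you be the {name}?")
    return utterances

def reinforce(observed_role, expected_role):
    """Positive or negative reinforcement based on the partner's inferred activity."""
    if observed_role == expected_role:
        return "Great, that's exactly what we need."
    return f"Actually, it would help more if you acted as the {expected_role}."

for line in announce_allocation(ROLES, "transporter"):
    print(line)
print(reinforce("collector", "collector"))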
collaboration technologies and systems | 2011
Aaron B. St. Clair; Maja J. Matarić
In most environments, task collaboration requires efficient, flexible communication between collaborators. In the case of tasks involving human-robot collaboration, the robot must effectively convey and interpret communicative actions about the current and intended state of the task environment, and coordinate its behavior with that of its collaborators, human and otherwise. A framework for collaborative communication should not only allow the robot to reason about its actions in relation to those of others, but should also support reasoning about the robot's own intentions and those ascribed to it by its collaborators. We present such a framework, inspired by Theory of Mind and making use of perspective taking, and show how it can be used to support several collaborative functions, including the detection of opportunities to assist.
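As a toy illustration of perspective taking for detecting an opportunity to assist, the sketch below checks whether the robot can perceive an object that its human partner cannot. The coarse visibility test and all names are simplifying assumptions, not the framework itself.

# Hypothetical sketch: perspective taking to detect an opportunity to assist.
import numpy as np

def visible_from(viewpoint, obj_pos, occluders, fov_radius=3.0):
    """Very coarse visibility test: within range and not blocked by an occluder."""
    if np.linalg.norm(obj_pos - viewpoint) > fov_radius:
        return False
    to_obj = obj_pos - viewpoint
    for occ in occluders:
        # Treat an occluder as blocking if it lies close to the line of sight.
        to_occ = occ - viewpoint
        proj = np.dot(to_occ, to_obj) / np.dot(to_obj, to_obj)
        if 0.0 < proj < 1.0 and np.linalg.norm(to_occ - proj * to_obj) < 0.2:
            return False
    return True

def opportunity_to_assist(robot_pos, human_pos, needed_obj, occluders):
    """Offer help if the robot can see an object the human needs but cannot see."""
    return (visible_from(robot_pos, needed_obj, occluders)
            and not visible_from(human_pos, needed_obj, occluders))

robot, human = np.array([0.0, 0.0]), np.array([4.0, 0.0])
obj, wall = np.array([1.0, 1.0]), [np.array([2.5, 0.5])]
if opportunity_to_assist(robot, human, obj, wall):
    print("Robot: 'The part you need is over here, to my left.'")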
collaboration technologies and systems | 2014
Aaron B. St. Clair; Maja J. Matarić
This short paper motivates the use of robot verbal feedback in human-robot task collaboration scenarios and presents results from a pilot study aimed at identifying how people use speech to coordinate their actions with each other on a dynamic, collaborative task. From these results, three types of verbal feedback are identified, along with the requirements a robot must meet to correctly employ these speech patterns while collaborating with a person.
Archive | 2009
David J. Feil-Seifer; Matthew P. Black; Elisa Flores; Aaron B. St. Clair; Emily Mower; Chi-Chun Lee; Maja J. Matarić; Shrikanth Narayanan; Clara M. Lajonchere; Peter Mundy; Marian E. Williams
international conference on robotics and automation | 2010
Aaron B. St. Clair; Ross Mead; Maja J. Matarić
national conference on artificial intelligence | 2011
Aaron B. St. Clair; Amin Atrash; Ross Mead; Maja J. Matarić
human robot interaction | 2014
Ross Mead; Amin Atrash; Edward Kaszubski; Aaron B. St. Clair; Jillian Greczek; Caitlyn Clabaugh; Brian Kohan; Maja J. Matarić