Publications


Featured research published by Jesse Gray.


International Journal of Humanoid Robotics | 2004

Tutelage and Collaboration for Humanoid Robots

Cynthia Breazeal; Andrew G. Brooks; Jesse Gray; Guy Hoffman; Cory D. Kidd; Hans Lee; Jeff Lieberman; Andrea Lockerd; David Chilongo

This paper presents an overview of our work towards building socially intelligent, cooperative humanoid robots that can work and learn in partnership with people. People understand each other in social terms, allowing them to engage others in a variety of complex social interactions including communication, social learning, and cooperation. We present our theoretical framework, a novel combination of Joint Intention Theory and Situated Learning Theory, and demonstrate how this framework can be applied to develop our sociable humanoid robot, Leonardo. We demonstrate the robot's ability to learn quickly and effectively from natural human instruction using gesture and dialog, and then cooperate to perform a learned task jointly with a person. Such issues must be addressed to enable many new and exciting applications for robots that require them to play a long-term role in people's daily lives.


International Conference on Computer Graphics and Interactive Techniques | 2006

The huggable: a therapeutic robotic companion for relational, affective touch

Walter Dan Stiehl; Cynthia Breazeal; Kuk-hyun Han; Jeff Lieberman; Levi Lalla; Allan Z. Maymin; Jonathan Salinas; Daniel Fuentes; Robert Lopez Toscano; Cheng Hau Tong; Aseem Kishore; Matt Berlin; Jesse Gray

Numerous studies have shown the positive benefits of companion animal therapy. Unfortunately, companion animals are not always available. The Huggable is a new type of robotic companion being designed specifically for such cases. It features a full-body “sensitive skin” for relational, affective touch; silent, muscle-like voice coil actuators; and an embedded PC with data collection and networking capabilities. In this paper we briefly describe the Huggable and propose a live demonstration of the robot.


Robotics and Autonomous Systems | 2006

Using perspective taking to learn from ambiguous demonstrations

Cynthia Breazeal; Matt Berlin; Andrew G. Brooks; Jesse Gray; Andrea Lockerd Thomaz

This paper addresses an important issue in learning from demonstrations that are provided by “naive” human teachers—people who do not have expertise in the machine learning algorithms used by the robot. We therefore entertain the possibility that, whereas the average human user may provide sensible demonstrations from a human’s perspective, these same demonstrations may be insufficient, incomplete, ambiguous, or otherwise “flawed” from the perspective of the training set needed by the learning algorithm to generalize properly. To address this issue, we present a system where the robot is modeled as a socially engaged and socially cognitive learner. We illustrate the merits of this approach through an example where the robot is able to correctly learn from “flawed” demonstrations by taking the visual perspective of the human instructor to clarify potential ambiguities.


The International Journal of Robotics Research | 2009

An Embodied Cognition Approach to Mindreading Skills for Socially Intelligent Robots

Cynthia Breazeal; Jesse Gray; Matt Berlin

Future applications for personal robots motivate research into developing robots that are intelligent in their interactions with people. Toward this goal, in this paper we present an integrated socio-cognitive architecture to endow an anthropomorphic robot with the ability to infer mental states such as beliefs, intents, and desires from the observable behavior of its human partner. The design of our architecture is informed by recent findings from neuroscience and embodied cognition that reveal how living systems leverage their physical and cognitive embodiment through simulation-theoretic mechanisms to infer the mental states of others. We assess the robot's mindreading skills on a suite of benchmark tasks where the robot interacts with a human partner in a cooperative scenario and a learning scenario. In addition, we have conducted human subjects experiments using the same task scenarios to assess human performance on these tasks and to compare the robot's performance with that of people. In the process, our human subject studies also reveal some interesting insights into human behavior.


Robot and Human Interactive Communication | 2005

Action parsing and goal inference using self as simulator

Jesse Gray; Cynthia Breazeal; Matt Berlin; Andrew G. Brooks; Jeff Lieberman

The ability to understand a teammate's actions in terms of goals and other mental states is an important element of cooperative behavior. Simulation theory argues in favor of an embodied approach whereby humans reuse parts of their cognitive structure not only for generating behavior, but also for simulating the mental states responsible for generating that behavior in others. We present our simulation-theoretic approach and demonstrate its performance in a collaborative task scenario. The robot offers its human teammate assistance by either inferring the human's belief states to anticipate their informational needs, or inferring the human's goal states to physically help the human achieve those goals.


IEEE-RAS International Conference on Humanoid Robots | 2004

Working collaboratively with humanoid robots

Cynthia Breazeal; Andrew G. Brooks; David Chilongo; Jesse Gray; Guy Hoffman; Cory D. Kidd; Hans Lee; Jeff Lieberman; Andrea Lockerd

This paper presents an overview of our work towards building humanoid robots that can work alongside people as cooperative teammates. We present our theoretical framework based on a novel combination of joint intention theory and collaborative discourse theory, and demonstrate how it can be applied to allow a human to work cooperatively with a humanoid robot on a joint task using speech, gesture, and expressive cues. Such issues must be addressed to enable many new and exciting applications for humanoid robots that require them to assist ordinary people in daily activities or to work as capable members of human-robot teams.


International Conference on Computer Graphics and Interactive Techniques | 2008

Mobile, dexterous, social robots for mobile manipulation and human-robot interaction

Cynthia Breazeal; Michael Siegel; Matt Berlin; Jesse Gray; Roderic A. Grupen; Patrick Deegan; Jeff Weber; Kailas Narendran; John McBean

The purpose of this platform is to support research and education goals in human-robot interaction and mobile manipulation with applications that require the integration of these abilities. In particular, our research aims to develop personal robots that work with people as capable teammates to assist in eldercare, healthcare, domestic chores, and other physical tasks that require robots to serve as competent members of human-robot teams. The robot’s small, agile design is particularly well suited to human-robot interaction and coordination in human living spaces. Our collaborators include the Laboratory for Perceptual Robotics at the University of Massachusetts at Amherst, Xitome Design, Meka Robotics, and digitROBOTICS.


Intelligent Robots and Systems | 2003

Interactive robot theatre

Cynthia Breazeal; Andrew G. Brooks; Jesse Gray; Matt Hancher; Cory D. Kidd; John McBean; Dan Stiehl; Joshua Strickon

This work motivates interactive robot theatre as an interesting test bed to explore research issues in the development of sociable robots and to investigate the relationship between autonomous robots and intelligent environments. We present the implementation of our initial exploration in this area highlighting three core technologies. First, an integrated show control software development platform for the design and control of an intelligent stage. Second, a stereo vision system that tracks multiple features on multiple audience participants in real-time. Third, an interactive, autonomous robot performer with natural and expressive movement that combines techniques from character animation and robot control.


Human-Robot Interaction | 2016

Tega: A Social Robot

Jacqueline Kory Westlund; Jin Joo Lee; Luke Plummer; Fardad Faridi; Jesse Gray; Matt Berlin; Harald Quintus-Bosz; Robert Hartmann; Mike Hess; Stacy Dyer; Kristopher dos Santos; Sigurdur Orn Adalgeirsson; Goren Gordon; Samuel Spaulding; Marayna Martinez; Madhurima Das; Maryam Archie; Sooyeon Jeong; Cynthia Breazeal

Tega is a new, expressive, “squash and stretch,” Android-based social robot platform designed to enable long-term interactions with children.


Tangible and Embedded Interaction | 2011

Exploring mixed reality robot gaming

David Robert; Ryan Wistort; Jesse Gray; Cynthia Breazeal

We describe an interactive, mixed reality (MR) robot gaming platform in which the user controls a tangible, physically embodied character. Miso, an expressive tele-operated robot, plays with its virtual peers by passing a graphical object back and forth seamlessly through an integrated physical and virtual environment. Special emphasis is placed on the importance of maintaining perceptual continuity by closely coupling the simulated world's physical laws to our material reality. We present our implemented MR robot gaming environment and describe the design of an interreality portal at the boundary of the physical and virtual realities.

Collaboration


Dive into Jesse Gray's collaborations.

Top Co-Authors

Cynthia Breazeal (Massachusetts Institute of Technology)
Andrew G. Brooks (Massachusetts Institute of Technology)
Matt Berlin (Massachusetts Institute of Technology)
Jeff Lieberman (Massachusetts Institute of Technology)
Matthew R. Berlin (Massachusetts Institute of Technology)
Bill Tomlinson (University of California)
Cory D. Kidd (Massachusetts Institute of Technology)
Andrea Lockerd (Massachusetts Institute of Technology)
Bruce Blumberg (Massachusetts Institute of Technology)