
Publication


Featured research published by Jeff Lieberman.


IEEE Transactions on Robotics | 2007

TIKL: Development of a Wearable Vibrotactile Feedback Suit for Improved Human Motor Learning

Jeff Lieberman; Cynthia Breazeal

When humans learn a new motor skill from a teacher, they learn using multiple channels. They receive high-level information aurally about the skill, visual information about how another performs the skill, and at times, tactile information from the teacher's physical guidance. This research proposes a novel approach where the student receives real-time tactile feedback, simultaneously over all joints, delivered through a wearable robotic system. This tactile feedback can supplement the visual or auditory feedback from the teacher. Our results using a 5-DOF robotic suit show a 27% improvement in accuracy while performing the target motion, and an accelerated learning rate of up to 23%. We report both of these results with high statistical significance (p ≤ 0.01). This research is intended for use in a diverse set of applications including sports training, motor rehabilitation after neurological damage, dance, postural retraining for health, and many others. We call this system tactile interaction for kinesthetic learning (TIKL).
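
The abstract does not give the feedback law, but the core idea, per-joint vibration whose strength tracks the student's deviation from the teacher, is easy to sketch. The function below is a minimal illustration in Python; the proportional mapping, gain, and PWM scaling are assumptions, not the paper's actual controller.

```python
import numpy as np

def vibrotactile_feedback(student_joints, teacher_joints, gain=1.0, max_pwm=255):
    """Map per-joint angle error to vibration intensity (illustrative sketch).

    student_joints, teacher_joints: arrays of joint angles in radians,
    one entry per tracked degree of freedom (5 for the suit described above).
    Returns one PWM duty value per joint actuator, plus the error sign.
    """
    error = teacher_joints - student_joints        # signed tracking error
    intensity = gain * np.abs(error)               # error magnitude drives vibration
    pwm = np.clip(intensity / np.pi * max_pwm, 0, max_pwm)  # assumed scaling
    # The sign could select which of two actuators on a joint fires,
    # telling the student which direction to correct toward.
    return pwm.astype(int), np.sign(error)

# Example: a 5-DOF suit where the student lags the teacher on two joints.
teacher = np.array([0.50, 1.20, -0.30, 0.00, 0.80])
student = np.array([0.45, 0.90, -0.30, 0.10, 0.80])
pwm, direction = vibrotactile_feedback(student, teacher, gain=2.0)
print(pwm, direction)
```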


Robot and Human Interactive Communication | 2005

Design of a therapeutic robotic companion for relational, affective touch

Walter Dan Stiehl; Jeff Lieberman; Cynthia Breazeal; Louis Basel; Levi Lalla; Michael M. Wolf

Much research has shown the positive health benefits of companion animals. Unfortunately, these animals are not always available to patients due to allergies, risk of disease, or other reasons. Recently, this application domain has attracted the attention of robotics researchers. The Huggable is a new type of robotic companion capable of active relational and affective touch-based interactions with a person. It features a high number of somatic sensors (electric field, temperature, and force) over the entire surface of the robot, underneath a soft silicone skin and fur fabric covering. This paper describes the design and early results in recognizing the affective content of touch for this robot.
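
The sensing pipeline suggested by the abstract, a grid of somatic sensors feeding a touch classifier, can be sketched as follows. The window shape, the four features, and the nearest-centroid classifier are illustrative assumptions; the paper's actual recognition method is not described here.

```python
import numpy as np

def touch_features(window):
    """Summarize a window of readings from one sensor modality.

    window: (T, S) array, T time steps for S sensors (e.g. force taxels).
    """
    return np.array([
        window.mean(),                            # overall contact level
        window.max(),                             # peak contact
        np.abs(np.diff(window, axis=0)).mean(),   # temporal change (tickle vs. hold)
        (window > window.mean()).mean(),          # spatial extent of contact
    ])

def classify(features, centroids):
    """Nearest-centroid labeling over touch classes such as 'petting'."""
    labels = list(centroids)
    dists = [np.linalg.norm(features - centroids[l]) for l in labels]
    return labels[int(np.argmin(dists))]

# Toy centroids that a training phase might produce (invented values).
centroids = {
    "petting": np.array([0.4, 0.8, 0.05, 0.3]),
    "tickling": np.array([0.3, 0.7, 0.30, 0.2]),
    "holding": np.array([0.7, 0.9, 0.02, 0.8]),
}
window = np.random.default_rng(0).uniform(0, 1, size=(50, 16))
print(classify(touch_features(window), centroids))
```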


International Journal of Humanoid Robotics | 2004

Tutelage and Collaboration for Humanoid Robots

Cynthia Breazeal; Andrew G. Brooks; Jesse Gray; Guy Hoffman; Cory D. Kidd; Hans Lee; Jeff Lieberman; Andrea Lockerd; David Chilongo

This paper presents an overview of our work towards building socially intelligent, cooperative humanoid robots that can work and learn in partnership with people. People understand each other in social terms, allowing them to engage others in a variety of complex social interactions including communication, social learning, and cooperation. We present our theoretical framework, a novel combination of Joint Intention Theory and Situated Learning Theory, and demonstrate how this framework can be applied to develop our sociable humanoid robot, Leonardo. We demonstrate the robot's ability to learn quickly and effectively from natural human instruction using gesture and dialog, and then cooperate to perform a learned task jointly with a person. Such issues must be addressed to enable many new and exciting applications for robots that require them to play a long-term role in people's daily lives.
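
Joint Intention Theory, one half of the framework named above, centers on a commitment rule: a member of a team pursuing a joint goal may not silently drop it after privately concluding it is achieved or impossible; it must first make that status known to its partners. A minimal sketch of that single rule, with invented class names and a hypothetical notification channel:

```python
from enum import Enum, auto

class GoalStatus(Enum):
    ACTIVE = auto()
    ACHIEVED = auto()
    IMPOSSIBLE = auto()

class JointIntentionAgent:
    """One-rule sketch of Joint Intention Theory's commitment obligation:
    before abandoning a joint goal, inform the teammate, so the goal's
    status becomes mutually believed rather than privately held."""

    def __init__(self, name, notify):
        self.name = name
        self.notify = notify                 # channel to the teammate (assumed)
        self.status = GoalStatus.ACTIVE

    def update_belief(self, new_status):
        if new_status != GoalStatus.ACTIVE and self.status == GoalStatus.ACTIVE:
            # Commitment rule: communicate before dropping the joint goal.
            self.notify(f"{self.name}: joint goal is {new_status.name}")
        self.status = new_status

robot = JointIntentionAgent("Leonardo", notify=print)
robot.update_belief(GoalStatus.ACHIEVED)
```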


Human Factors in Computing Systems | 2006

TapTap: a haptic wearable for asynchronous distributed touch therapy

Leonardo Bonanni; Cati Vaucelle; Jeff Lieberman; Orit Zuckerman

TapTap is a wearable haptic system that allows nurturing human touch to be recorded, broadcast and played back for emotional therapy. Haptic input/output modules in a convenient modular scarf provide affectionate touch that can be personalized. We present a working prototype informed by a pilot study.
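
A sketch of the record/playback loop implied by the abstract: touch is captured as a time-stamped intensity trace and later replayed on a vibrotactile actuator, preserving its rhythm. The module API below is hypothetical.

```python
import time

class TapTapModule:
    """Illustrative sketch of TapTap's record/playback idea: store touch as
    a time-stamped intensity trace, then replay it later on an actuator."""

    def __init__(self):
        self.trace = []               # list of (t_offset, intensity) samples

    def record(self, samples, period=0.05):
        """samples: iterable of touch intensities in [0, 1] from the input pad."""
        self.trace = [(i * period, s) for i, s in enumerate(samples)]

    def play(self, actuator):
        """Replay the stored touch; actuator(intensity) drives the motor."""
        start = time.monotonic()
        for t_offset, intensity in self.trace:
            # Sleep until this sample is due, preserving the recorded rhythm.
            delay = start + t_offset - time.monotonic()
            if delay > 0:
                time.sleep(delay)
            actuator(intensity)

module = TapTapModule()
module.record([0.2, 0.6, 0.9, 0.4, 0.0])   # e.g. a brief caress
module.play(lambda level: print(f"vibrate at {level:.1f}"))
```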


Robot and Human Interactive Communication | 2005

Action parsing and goal inference using self as simulator

Jesse Gray; Cynthia Breazeal; Matt Berlin; Andrew G. Brooks; Jeff Lieberman

The ability to understand a teammate's actions in terms of goals and other mental states is an important element of cooperative behavior. Simulation theory argues in favor of an embodied approach whereby humans reuse parts of their cognitive structure not only for generating behavior, but also for simulating the mental states responsible for generating that behavior in others. We present our simulation-theoretic approach and demonstrate its performance in a collaborative task scenario. The robot offers its human teammate assistance by either inferring the human's belief states to anticipate their informational needs, or inferring the human's goal states to physically help the human achieve those goals.
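
The simulation-theoretic idea, reusing one's own action-generation machinery to interpret another's motion, can be illustrated with a toy goal-inference loop: for each candidate goal, ask the robot's own forward model what it would do, and score how well that matches what the human actually did. Everything below (the point-mass forward model, the cosine score) is an assumption for illustration.

```python
import numpy as np

def simulate_action(state, goal):
    """Forward model the robot already uses to act: step toward the goal.
    Reused here to *interpret* someone else's motion (simulation theory)."""
    direction = goal - state
    norm = np.linalg.norm(direction)
    return direction / norm if norm > 0 else direction

def infer_goal(observed_states, candidate_goals):
    """Score each candidate goal by how well self-simulation reproduces the
    observed movement; return the index of the best match."""
    scores = []
    for goal in candidate_goals:
        total = 0.0
        for s0, s1 in zip(observed_states, observed_states[1:]):
            predicted = simulate_action(s0, goal)
            actual = s1 - s0
            n = np.linalg.norm(actual)
            if n > 0:
                total += float(predicted @ (actual / n))  # cosine agreement
        scores.append(total)
    return int(np.argmax(scores))

# Toy example: the human moves from (0,0) toward (1,0); goal 0 should win.
states = [np.array([0.0, 0.0]), np.array([0.3, 0.0]), np.array([0.6, 0.05])]
goals = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
print(infer_goal(states, goals))  # -> 0
```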


IEEE-RAS International Conference on Humanoid Robots | 2004

Working collaboratively with humanoid robots

Cynthia Breazeal; Andrew G. Brooks; David Chilongo; Jesse Gray; Guy Hoffman; Cory D. Kidd; Hans Lee; Jeff Lieberman; Andrea Lockerd

This paper presents an overview of our work towards building humanoid robots that can work alongside people as cooperative teammates. We present our theoretical framework based on a novel combination of joint intention theory and collaborative discourse theory, and demonstrate how it can be applied to allow a human to work cooperatively with a humanoid robot on a joint task using speech, gesture, and expressive cues. Such issues must be addressed to enable many new and exciting applications for humanoid robots that require them to assist ordinary people in daily activities or to work as capable members of human-robot teams.


International Conference on Robotics and Automation | 2007

Development of a Wearable Vibrotactile Feedback Suit for Accelerated Human Motor Learning

Jeff Lieberman; Cynthia Breazeal

When a human learns a new motor skill from a teacher, they learn using multiple channels: they receive high-level information aurally about the skill, visual information about how another performs the skill, and at times, tactile information from a teacher's physical guidance of the student. This research proposes a novel approach: applying this tactile feedback through a wearable robotic system while a student tries to learn from a teacher. Initial tests on a 5-DOF robotic suit show a decrease in motion errors of over 20% and an accelerated learning rate of 7%, both conservative estimates given the system setup and both statistically very significant (p ≤ 0.01). This research is intended for use in sports training, motor rehabilitation after neurological damage, dance, postural retraining for health, and many other contexts.


IEEE-RAS International Conference on Humanoid Robots | 2004

Improvements on action parsing and action interpolation for learning through demonstration

Jeff Lieberman; Cynthia Breazeal

Programming humanoid robots with new motor skills through human demonstration is a promising approach to endowing humanoids with new capabilities in a relatively quick and intuitive manner. This paper presents an automated software system that enables our humanoid robot to learn a generalized dexterous motor skill from relatively few demonstrations provided by a human operator wearing a telemetry suit. Movement, end-effector, stereovision, and tactile information are analyzed to automatically segment movement streams along goal-directed boundaries. Further combinatorial selection of subsets of markers allows final episodic boundary selection and time alignment of tasks. The task trials are then analyzed spatially using radial basis functions (RBFs) to interpolate between demonstrations, using the position of the target object as the motion blending parameter. A secondary RBF solution, using end-effector paths in the object coordinate frame, provides precise end-effector positioning and orientation relative to the object. Blending these two solutions is shown to preserve quality of motion while increasing the accuracy and robustness of object manipulation.
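
The RBF interpolation step is the most concrete part of the pipeline, so here is a minimal sketch of it: given time-aligned demonstration trajectories indexed by the target object's position, a new motion for an unseen object position is a normalized Gaussian-RBF blend of the demonstrations. The kernel choice and its width are assumptions, not the paper's reported parameters.

```python
import numpy as np

def rbf_weights(query, centers, sigma=0.1):
    """Gaussian RBF weights of a query object position against the object
    positions seen in the demonstrations, normalized to sum to 1."""
    d2 = np.sum((centers - query) ** 2, axis=1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return w / w.sum()

def blend_trajectories(query_obj, demo_objs, demo_trajs, sigma=0.1):
    """Interpolate a motion for an unseen object position by RBF-blending
    time-aligned demonstration trajectories.

    demo_objs:  (N, D) object positions, one per demonstration
    demo_trajs: (N, T, J) time-aligned joint trajectories
    """
    w = rbf_weights(query_obj, demo_objs, sigma)      # (N,) blend weights
    return np.einsum("n,ntj->tj", w, demo_trajs)      # weighted average motion

# Toy example: two 1-DOF demonstrations for objects at x=0.0 and x=1.0.
demo_objs = np.array([[0.0], [1.0]])
demo_trajs = np.array([
    np.linspace(0.0, 0.5, 10)[:, None],   # reach for the near object
    np.linspace(0.0, 1.0, 10)[:, None],   # reach for the far object
])
new_motion = blend_trajectories(np.array([0.5]), demo_objs, demo_trajs, sigma=0.5)
print(new_motion[-1])  # end pose lies between the two demonstrated end poses
```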


International Conference on Computer Graphics and Interactive Techniques | 2006

The huggable: a new type of therapeutic robotic companion

Walter Dan Stiehl; Cynthia Breazeal; Kuk-hyun Han; Jeff Lieberman; Levi Lalla; Allan Z. Maymin; Jonathan Salinas; Daniel Fuentes; Robert Lopez Toscano; Cheng Hau Tong; Aseem Kishore

Much research has shown the many positive benefits of companion animal therapy in improving the lives of people in hospitals and nursing home facilities (Allen, Blascovich et al. 1991). Unfortunately, in many facilities companion animal therapy is not offered due to fears of allergies, bites, or disease. Even in facilities that do offer this form of therapy, it is only offered for a few hours each day, once or twice a week, with a trained professional present at all times. In response to these restrictions, robot-assisted therapy, using robots such as Sony's AIBO and the Paro (Wada, Shibata et al. 2002), has emerged for cases in which companion animals are not available. These current robotic companions lack a full-body sense of touch capable of understanding the relational and affective content provided to the robot, such as whether the robot is held in someone's arms, tickled, or petted. These aspects of touch are among the ways in which companion animals provide comfort.


Archive | 2004

Humanoid Robots as Cooperative Partners for People

Cynthia Breazeal; Andrew G. Brooks; Jesse Gray; Guy Hoffman; Cory D. Kidd; Hans Lee; Jeff Lieberman; Andrea Lockerd; David Mulanda

Collaboration


Dive into Jeff Lieberman's collaborations.

Top Co-Authors

Cynthia Breazeal, Massachusetts Institute of Technology
Andrew G. Brooks, Massachusetts Institute of Technology
Jesse Gray, Massachusetts Institute of Technology
Andrea Lockerd, Massachusetts Institute of Technology
Cory D. Kidd, Massachusetts Institute of Technology
Hans Lee, Massachusetts Institute of Technology
Levi Lalla, Massachusetts Institute of Technology
Walter Dan Stiehl, Massachusetts Institute of Technology
Allan Z. Maymin, Massachusetts Institute of Technology