Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Walter Dan Stiehl is active.

Publication


Featured research published by Walter Dan Stiehl.


robot and human interactive communication | 2005

Design of a therapeutic robotic companion for relational, affective touch

Walter Dan Stiehl; Jeff Lieberman; Cynthia Breazeal; Louis Basel; Levi Lalla; Michael M. Wolf

Much research has shown the positive health benefits of companion animals. Unfortunately, these animals are not always available to patients due to allergies, risk of disease, or other reasons. Recently, this application domain has attracted the attention of robotics researchers. The Huggable is a new type of robotic companion capable of active relational and affective touch-based interactions with a person. It features a high number of somatic sensors (electric field, temperature, and force) over the entire surface of the robot, underneath a soft silicone skin and fur fabric covering. This paper describes the design of the robot and early results in recognizing the affective content of touch.
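To illustrate the kind of full-body somatic sensing the abstract describes, the sketch below shows one plausible way to represent a frame of electric field, temperature, and force readings grouped by body region. The region names, data layout, and threshold are assumptions for the example, not details taken from the Huggable implementation.

```python
# Minimal sketch (not the authors' code): one way to represent a frame of
# somatic sensor readings combining electric field, temperature, and force
# values per body region. Region names and the threshold are hypothetical.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class SomaticReading:
    electric_field: float  # proximity/contact sensing
    temperature: float     # degrees Celsius
    force: float           # normalized 0..1

@dataclass
class SensorFrame:
    timestamp: float
    regions: Dict[str, List[SomaticReading]]  # several sensors per region

    def touched_regions(self, force_threshold: float = 0.1) -> List[str]:
        """Return the regions where any force sensor exceeds the threshold."""
        return [
            name
            for name, readings in self.regions.items()
            if any(r.force > force_threshold for r in readings)
        ]

frame = SensorFrame(
    timestamp=0.0,
    regions={"back": [SomaticReading(0.2, 30.5, 0.4)],
             "head": [SomaticReading(0.0, 29.0, 0.0)]},
)
print(frame.touched_regions())  # ['back']
```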


affective computing and intelligent interaction | 2005

Affective touch for robotic companions

Walter Dan Stiehl; Cynthia Breazeal

As robotic platforms are designed for human-robot interaction applications, a full-body sense of touch, or “sensitive skin,” becomes important. The Huggable is a new type of therapeutic robotic companion based upon relational touch interactions. The initial use of neural networks to classify the affective content of touch is described.
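As a minimal, hypothetical sketch of the general approach named in the abstract (a small feed-forward network labelling the affective class of a touch), the example below uses randomly initialized weights in place of trained parameters; the class names and layer sizes are assumptions, not details from the paper.

```python
# Illustrative only: a tiny feed-forward classifier mapping a flattened
# vector of sensor activations to an affective touch label. Weights are
# random stand-ins for trained parameters; classes and sizes are made up.
import numpy as np

TOUCH_CLASSES = ["pet", "tickle", "poke", "hold"]
N_SENSORS, N_HIDDEN = 64, 16

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(N_SENSORS, N_HIDDEN))
W2 = rng.normal(scale=0.1, size=(N_HIDDEN, len(TOUCH_CLASSES)))

def classify_touch(sensor_vector: np.ndarray) -> str:
    """Forward pass: sensor vector -> hidden layer -> class scores -> label."""
    hidden = np.tanh(sensor_vector @ W1)
    scores = hidden @ W2
    return TOUCH_CLASSES[int(np.argmax(scores))]

print(classify_touch(rng.random(N_SENSORS)))
```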


interaction design and children | 2009

The huggable: a platform for research in robotic companions for pediatric care

Walter Dan Stiehl; Jun Ki Lee; Cynthia Breazeal; Marco Nalin; Angelica Morandi; Alberto Sanna

Robotic companions offer a unique combination of embodiment and computation that opens up many new and interesting opportunities in the field of pediatric care. As these new technologies are developed, we must consider the central research questions of how such systems should be designed and what the appropriate applications for such systems are. In this paper we present the Huggable, a robotic companion in the form factor of a teddy bear, and outline a series of studies we are planning to run using the Huggable in a pediatric care unit.


intelligent robots and systems | 2009

Real-time social touch gesture recognition for sensate robots

Heather Knight; Robert Lopez Toscano; Walter Dan Stiehl; Angela Chang; Yi Wang; Cynthia Breazeal

This paper describes the hardware and algorithms for a real-time social touch gesture recognition system. Early experiments involve a Sensate Bear test rig with full-body touch sensing, sensor visualization, and gesture recognition capabilities. The algorithms are based on real humans interacting with a plush bear. In developing a preliminary gesture library with thirteen Symbolic Gestures and eight Touch Subtypes, we have taken the first steps toward a Robotic Touch API, showing that the Huggable robot behavior system will be able to stream currently active sensors to detect regional social gestures and local sub-gestures in real time. The system demonstrates the infrastructure to detect three types of touching: social touch, local touch, and sensor-level touch.
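The three detection layers mentioned in the abstract (sensor-level touch, local sub-gestures, and regional social gestures) could be wired together roughly as in the sketch below; the function names, thresholds, and gesture labels are hypothetical and do not reflect the actual Huggable or Sensate Bear software.

```python
# Rough sketch of a layered, streaming touch pipeline: raw sensor-level
# touch, per-region sub-gestures, and a symbolic social gesture over a
# short history window. All names and thresholds are hypothetical.
from collections import deque
from typing import Deque, Dict, List

def read_active_sensors() -> Dict[str, float]:
    """Stub standing in for a stream of currently active sensors (region -> force)."""
    return {"back": 0.4, "head": 0.0}

def detect_local_subgesture(region: str, history: List[Dict[str, float]]) -> str:
    """Classify a per-region sub-gesture from recent force history."""
    values = [frame.get(region, 0.0) for frame in history]
    return "stroke" if sum(v > 0.2 for v in values) > len(values) // 2 else "tap"

def detect_social_gesture(subgestures: Dict[str, str]) -> str:
    """Map regional sub-gestures to a symbolic social gesture."""
    return "petting" if subgestures.get("back") == "stroke" else "unknown"

history: Deque[Dict[str, float]] = deque(maxlen=30)
for _ in range(3):  # stand-in for the real-time loop
    frame = read_active_sensors()
    history.append(frame)
    active = [r for r, f in frame.items() if f > 0.1]                      # sensor-level touch
    subs = {r: detect_local_subgesture(r, list(history)) for r in active}  # local touch
    print(detect_social_gesture(subs))                                     # social touch
```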


robot and human interactive communication | 2008

The design of a semi-autonomous robot avatar for family communication and education

Jun Ki Lee; Robert Lopez Toscano; Walter Dan Stiehl; Cynthia Breazeal

Robots as an embodied, multi-modal technology have great potential to be used as a new type of communication device. In this paper we outline our development of the Huggable robot as a semi-autonomous robot avatar for two specific types of remote interaction - family communication and education. Through our discussion we highlight how we have applied six important elements in our system to allow for the robot to function as a richly embodied communication channel.


Advanced Robotics | 2009

Semi-Autonomous Robot Avatar as a Medium for Family Communication and Education

Jun Ki Lee; Walter Dan Stiehl; Robert Lopez Toscano; Cynthia Breazeal

Robots as an embodied, multi-modal technology have great potential to be used as a new type of communication device. In this paper we outline our development of the Huggable robot as a semi-autonomous robot avatar for two specific types of remote interaction — family communication and education. We also describe three different operator control interfaces (Web Interface, Wearable Interface and Sympathetic Interface) being developed to explore how these systems will impact the experience. Furthermore, through our discussion we highlight how we have applied five important elements in our system to allow the robot to function as a richly embodied communication channel. These five elements include sharing and directing attention, situational awareness through real-time sensor feedback, alleviating the cognitive load of a user, conveying the personality and character of the robot and global accessibility. Lastly, we provide results from a pilot study of the Web Interface.


international conference on computer graphics and interactive techniques | 2006

The huggable: a new type of therapeutic robotic companion

Walter Dan Stiehl; Cynthia Breazeal; Kuk-hyun Han; Jeff Lieberman; Levi Lalla; Allan Z. Maymin; Jonathan Salinas; Daniel Fuentes; Robert Lopez Toscano; Cheng Hau Tong; Aseem Kishore

Much research has shown the many positive benefits of companion animal therapy in improving the lives of people in hospitals and nursing home facilities (Allen, Blascovich et al. 1991). Unfortunately, in many facilities companion animal therapy is not offered due to fears of allergies, bites, or disease. Even in facilities that do offer this form of therapy, it is only offered for a few hours each day, once or twice a week, with a trained professional present at all times. In response to these restrictions, robot-assisted therapy, using robots such as Sony’s AIBO and Paro (Wada, Shibata et al. 2002), has emerged for cases in which companion animals are not available. These current robotic companions lack a full-body sense of touch capable of understanding the relational and affective content provided to the robot, such as whether the robot is held in someone’s arms, tickled, or petted. These aspects of touch are one of the ways in which companion animals provide comfort.


Archive | 2008

Interactive systems employing robotic companions

Walter Dan Stiehl; Cynthia Breazeal; Jun Ki Lee; Allan Z. Maymin; Heather Knight; Robert Lopez Toscano; Iris M. Cheung


consumer communications and networking conference | 2006

The huggable: a therapeutic robotic companion for relational, affective touch

Walter Dan Stiehl; Jeff Lieberman; Cynthia Breazeal; L. Basel; R. Cooper; Heather Knight; Levi Lalla; Allan Z. Maymin; S. Purchase


international conference on robotics and automation | 2004

A "somatic alphabet" approach to "sensitive skin"

Walter Dan Stiehl; Levi Lalla; Cynthia Breazeal

Collaboration


Dive into Walter Dan Stiehl's collaborations.

Top Co-Authors

Cynthia Breazeal, Massachusetts Institute of Technology
Robert Lopez Toscano, Massachusetts Institute of Technology
Jun Ki Lee, Massachusetts Institute of Technology
Levi Lalla, Massachusetts Institute of Technology
Allan Z. Maymin, Massachusetts Institute of Technology
Heather Knight, Massachusetts Institute of Technology
Jeff Lieberman, Massachusetts Institute of Technology
Angela Chang, Massachusetts Institute of Technology
Aseem Kishore, Massachusetts Institute of Technology
Cheng Hau Tong, Massachusetts Institute of Technology