Brian Scassellati
Yale University
Publications
Featured research published by Brian Scassellati.
Computation for metaphors, analogy, and agents | 1999
Rodney A. Brooks; Cynthia Breazeal; Matthew Marjanović; Brian Scassellati; Matthew M. Williamson
To explore issues of developmental structure, physical embodiment, integration of multiple sensory and motor systems, and social interaction, we have constructed an upper-torso humanoid robot called Cog. The robot has twenty-one degrees of freedom and a variety of sensory systems, including visual, auditory, vestibular, kinesthetic, and tactile senses. This chapter gives a background on the methodology that we have used in our investigations, highlights the research issues that have been raised during this project, and provides a summary of both the current state of the project and our long-term goals. We report on a variety of implemented visual-motor routines (smooth-pursuit tracking, saccades, binocular vergence, and vestibulo-ocular and optokinetic reflexes), orientation behaviors, motor control techniques, and social behaviors (pointing to a visual target, recognizing joint attention through face and eye finding, imitation of head nods, and regulating interaction through expressive feedback). We further outline a number of areas for future research that will be necessary to build a complete embodied system.
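The smooth-pursuit routine listed above can be captured in a few lines. Below is a minimal sketch of a proportional pursuit controller, assuming a pan/tilt eye driven by velocity commands; the gain and the pixel-coordinate interface are illustrative assumptions, not details of the Cog implementation.

```python
# A minimal sketch of smooth-pursuit tracking: drive eye velocity in
# proportion to the target's retinal slip. Gains and coordinate
# conventions are hypothetical, not taken from Cog.

class SmoothPursuit:
    """Command eye velocity proportional to the target's offset from the fovea."""

    def __init__(self, gain: float = 0.8):
        self.gain = gain  # proportional gain on retinal error

    def command(self, target_px, image_center_px):
        # Retinal error: offset of the target from the image center (fovea).
        ex = target_px[0] - image_center_px[0]
        ey = target_px[1] - image_center_px[1]
        # Velocity command that reduces the error on each control step.
        return (self.gain * ex, self.gain * ey)


pursuit = SmoothPursuit(gain=0.8)
pan_vel, tilt_vel = pursuit.command(target_px=(340, 250), image_center_px=(320, 240))
print(pan_vel, tilt_vel)  # small corrective velocities toward the target
```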
Adaptive Behavior | 2000
Cynthia Breazeal; Brian Scassellati
From birth, human infants are immersed in a social environment that allows them to learn by leveraging the skills and capabilities of their caregivers. A critical precursor to this type of social learning is the ability to maintain interaction levels that are neither overwhelming nor under-stimulating. In this paper, we present a mechanism for an autonomous robot to regulate the intensity of its social interactions with a human. Similar to the feedback from infant to caregiver, the robot uses expressive displays to modulate the interaction intensity. This mechanism is integrated within a general framework that combines perception, attention, drives, emotions, behavior selection, and motor acts. We present a specific implementation of this architecture that enables the robot to react appropriately to both social stimuli (faces) and non-social stimuli (moving toys) while maintaining a suitable interaction intensity. We present results from both face-to-face interactions and interactions mediated through a toy. Note: This paper was submitted in June 1998.
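The regulation mechanism can be pictured as a homeostatic loop. Below is a minimal sketch assuming a single scalar "social drive" with illustrative thresholds; the architecture described in the paper integrates many drives and emotions, so treat the names and numbers here as assumptions.

```python
# A minimal sketch of interaction-intensity regulation: a social drive
# is pushed by stimulus intensity, decays toward a homeostatic set
# point, and out-of-range values trigger expressive displays.
# Thresholds and display names are illustrative assumptions.

SET_POINT = 0.0
OVERWHELMED, UNDERSTIMULATED = 0.6, -0.6

def update_drive(drive: float, stimulus: float, decay: float = 0.1) -> float:
    """Integrate stimulus intensity, decaying back toward the set point."""
    drive += stimulus - decay * (drive - SET_POINT)
    return max(-1.0, min(1.0, drive))

def expressive_display(drive: float) -> str:
    if drive > OVERWHELMED:
        return "withdraw"   # signal the caregiver to ease off
    if drive < UNDERSTIMULATED:
        return "solicit"    # signal the caregiver to engage more
    return "engaged"

drive = 0.0
for stimulus in (0.3, 0.4, 0.3, 0.0, 0.0):
    drive = update_drive(drive, stimulus)
    print(round(drive, 2), expressive_display(drive))
```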
Trends in Cognitive Sciences | 2002
Cynthia Breazeal; Brian Scassellati
The study of social learning in robotics has been motivated by both scientific interest in the learning process and practical desires to produce machines that are useful, flexible, and easy to use. In this review, we introduce the social and task-oriented aspects of robot imitation. We focus on methodologies for addressing two fundamental problems. First, how does the robot know what to imitate? And second, how does the robot map that perception onto its own action repertoire to replicate it? In the future, programming humanoid robots to perform new tasks might be as simple as showing them.
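The second problem the review poses, mapping perception onto the robot's own action repertoire, is often reduced to matching an observed movement against a library of stored motor primitives. Below is a minimal sketch under that assumption; the primitive library and the Euclidean distance metric are illustrative, not a method from the review.

```python
# A minimal sketch of perception-to-action mapping for imitation:
# match an observed joint-angle trajectory to the nearest stored motor
# primitive. The primitives and metric are illustrative assumptions.

import math

PRIMITIVES = {
    "reach":   [0.0, 0.2, 0.5, 0.8, 1.0],
    "wave":    [0.0, 0.5, 0.0, 0.5, 0.0],
    "retract": [1.0, 0.7, 0.4, 0.2, 0.0],
}

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def imitate(observed):
    """Return the stored primitive closest to the observed trajectory."""
    return min(PRIMITIVES, key=lambda name: distance(PRIMITIVES[name], observed))

print(imitate([0.1, 0.3, 0.4, 0.9, 0.9]))  # -> "reach"
```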
Annual Review of Biomedical Engineering | 2012
Brian Scassellati; Henny Admoni; Maja J. Matarić
Autism spectrum disorders are a group of lifelong disabilities that affect people's ability to communicate and to understand social cues. Research into applying robots as therapy tools has shown that robots seem to improve engagement and elicit novel social behaviors from people (particularly children and teenagers) with autism. Robot therapy for autism has been explored as one of the first application domains in the field of socially assistive robotics (SAR), which aims to develop robots that assist people with special needs through social interactions. In this review, we discuss the past decade's work in SAR systems designed for autism therapy by analyzing robot design decisions, human-robot interactions, and system evaluations. We conclude by discussing challenges and future trends for this young but rapidly developing research area.
Intelligent Robots and Systems | 1999
Cynthia Breazeal; Brian Scassellati
In order to interact socially with a human, a robot must convey intentionality, that is, the human must believe that the robot has beliefs, desires, and intentions. We have constructed a robot which exploits natural human social tendencies to convey intentionality through motor actions and facial expressions. We present results on the integration of perception, attention, motivation, behavior, and motor systems which allow the robot to engage in infant-like interactions with a human caregiver.
Autonomous Robots | 2002
Brian Scassellati
If we are to build human-like robots that can interact naturally with people, our robots must know not only about the properties of objects but also the properties of animate agents in the world. One of the fundamental social skills for humans is the attribution of beliefs, goals, and desires to other people. This set of skills has often been called a “theory of mind.” This paper presents the theories of Leslie (1994) and Baron-Cohen (1995) on the development of theory of mind in human children and discusses the potential application of both of these theories to building robots with similar capabilities. Initial implementation details and basic skills (such as finding faces and eyes and distinguishing animate from inanimate stimuli) are introduced. I further speculate on the usefulness of a robotic implementation in evaluating and comparing these two models.
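One concrete cue behind the animate/inanimate distinction mentioned above is self-propelled motion: in Leslie's account, an object that changes velocity without any external contact is a candidate agent. Below is a minimal sketch of that test; the trajectory format and threshold are illustrative assumptions, not the paper's implementation.

```python
# A minimal sketch of one animacy cue: flag a tracked stimulus as
# animate if its velocity changes abruptly without external contact
# (self-propelled motion). Threshold and data format are assumptions.

def is_animate(velocities, accel_threshold: float = 0.5) -> bool:
    """Return True if the stimulus shows self-propelled acceleration."""
    for (vx0, vy0), (vx1, vy1) in zip(velocities, velocities[1:]):
        accel = ((vx1 - vx0) ** 2 + (vy1 - vy0) ** 2) ** 0.5
        if accel > accel_threshold:  # abrupt speed/direction change
            return True
    return False

ball_rolling = [(1.0, 0.0), (0.9, 0.0), (0.8, 0.0)]    # smooth deceleration
person_moving = [(1.0, 0.0), (0.2, 0.9), (1.0, -0.5)]  # abrupt direction changes
print(is_animate(ball_rolling), is_animate(person_moving))  # False True
```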
ISRR | 2007
Brian Scassellati
Autism is a pervasive developmental disorder that is characterized by social and communicative impairments. Social robots recognize and respond to human social cues with appropriate behaviors. Social robots, and the technology used in their construction, can be unique tools in the study of autism. Based on three years of integration and immersion with a clinical research group, this paper discusses how social robots will make an impact on the ways in which we diagnose, treat, and understand autism.
IEEE Intelligent Systems & Their Applications | 2000
Bryan Adams; Cynthia Breazeal; Rodney A. Brooks; Brian Scassellati
Aside from their traditional roles, humanoid robots can be used to explore theories of human intelligence. The authors discuss their project aimed at developing robots that can behave like and interact with humans.
Systems, Man, and Cybernetics | 2001
Cynthia Breazeal; Aaron Edsinger; Paul Fitzpatrick; Brian Scassellati
Ballard (1991) described the implications of having a visual system that could actively position the camera coordinates in response to physical stimuli. In humanoid robotic systems, or in any animate vision system that interacts with people, social dynamics provide additional levels of constraint and additional opportunities for processing economy. In this paper, we describe an integrated visual-motor system that was implemented on a humanoid robot to negotiate the robot's physical constraints, the perceptual needs of the robot's behavioral and motivational systems, and the social implications of the motor acts.
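The interplay of perceptual and motivational constraints can be sketched as motivation-weighted saliency: bottom-up feature scores are scaled by gains set from the robot's current drives, and gaze is directed to the winner. The feature names and gain values below are illustrative assumptions, not the implemented system.

```python
# A minimal sketch of motivation-weighted visual attention: each
# stimulus carries bottom-up feature saliencies, which are scaled by
# gains reflecting the robot's current motivational state. Names and
# numbers are illustrative assumptions.

def attention_target(stimuli, gains):
    """Pick the stimulus with the highest motivation-weighted saliency."""
    def score(stim):
        return sum(gains.get(feat, 0.0) * val
                   for feat, val in stim["features"].items())
    return max(stimuli, key=score)

stimuli = [
    {"name": "face", "features": {"skin_tone": 0.9, "motion": 0.2}},
    {"name": "toy",  "features": {"color": 0.8, "motion": 0.7}},
]

# A "social" drive boosts face-related features over toy-related ones.
social_gains = {"skin_tone": 1.0, "motion": 0.3, "color": 0.2}
print(attention_target(stimuli, social_gains)["name"])  # -> "face"
```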
International Journal of Social Robotics | 2011
Wilma Bainbridge; Justin W. Hart; Elizabeth S. Kim; Brian Scassellati
This paper explores how a robot's physical presence affects human judgments of the robot as a social partner. For this experiment, participants collaborated on simple book-moving tasks with a humanoid robot that was either physically present or displayed via a live video feed. Multiple tasks individually examined the following aspects of social interaction: greetings, cooperation, trust, and personal space. Participants readily greeted and cooperated with the robot whether it was physically present or shown on live video. However, participants were more likely both to fulfill an unusual request and to afford greater personal space to the robot when it was physically present than when it was shown on live video. This remained true even when the gestures of the robot shown on live video were augmented with disambiguating 3-D information. Questionnaire data support these behavioral findings and also show that participants had an overall more positive interaction with the physically present robot.