Derek McColl
University of Toronto
Publication
Featured research published by Derek McColl.
Assistive Technology | 2014
Wing-Yue Geoffrey Louie; Derek McColl; Goldie Nejat
Recent studies have shown that cognitive and social interventions are crucial to the overall health of older adults, including their psychological, cognitive, and physical well-being. However, due to the rapidly growing elderly population of the world, the resources and people to provide these interventions are lacking. Our work focuses on the use of social robotic technologies to provide person-centered cognitive interventions. In this article, we investigate the acceptance and attitudes of older adults toward the human-like expressive socially assistive robot Brian 2.1 in order to determine if the robot’s human-like assistive and social characteristics would promote the use of the robot as a cognitive and social interaction tool to aid with activities of daily living. The results of a robot acceptance questionnaire administered during a robot demonstration session with a group of 46 elderly adults showed that the majority of the individuals had positive attitudes toward the socially assistive robot and its intended applications.
IEEE Robotics & Automation Magazine | 2013
Derek McColl; Wing-Yue Geoffrey Louie; Goldie Nejat
As the world's elderly population continues to grow, so does the number of individuals diagnosed with cognitive impairments. It is estimated that 115 million people will have age-related memory loss by 2050 [1]. The number of older adults who have difficulties performing self-care and independent-living activities increases significantly with the prevalence of cognitive impairment. This is especially true for the population over 70 years of age [2]. Cognitive impairment, as a result of dementia, severely affects a person's ability to independently initiate and perform daily activities, as cognitive abilities can be diminished [3]. If a person is incapable of performing these activities, continuous assistance from others is necessary. In 2010, the total worldwide cost of dementia (including medical, social, and informal care costs) was estimated to be US$604 billion [1].
human robot interaction | 2013
Derek McColl; Goldie Nejat
International Journal of Social Robotics | 2011
Derek McColl; Zhe Zhang; Goldie Nejat
As people get older, their ability to perform basic self-maintenance activities can be diminished due to the prevalence of cognitive and physical impairments or as a result of social isolation. The objective of our work is to design socially assistive robots capable of providing cognitive assistance, targeted engagement, and motivation to elderly individuals, in order to promote participation in self-maintenance activities of daily living. In this paper, we present the design and implementation of the expressive human-like robot, Brian 2.1, as a social motivator for the important activity of eating meals. An exploratory study was conducted at an elderly care facility with the robot and eight individuals, aged 82–93, to investigate user engagement and compliance during meal-time interactions with the robot along with overall acceptance and attitudes towards the robot. Results of the study show that the individuals were both engaged in the interactions and complied with the robot during two different meal-eating scenarios. A post-study robot acceptance questionnaire also determined that, in general, the participants enjoyed interacting with Brian 2.1 and had positive attitudes towards the robot for the intended activity.
International Journal of Social Robotics | 2014
Derek McColl; Goldie Nejat
A novel breed of robots known as socially assistive robots is emerging. These robots are capable of providing assistance to individuals through social and cognitive interaction. However, there are a number of research issues that need to be addressed in order to design such robots. In this paper, we address one main challenge in the development of intelligent socially assistive robots: The robot’s ability to identify human non-verbal communication during assistive interactions. In particular, we present a unique non-contact and non-restricting automated sensor-based approach for identification and categorization of human upper body language in determining how accessible a person is to the robot during natural real-time human-robot interaction (HRI). This classification will allow a robot to effectively determine its own reactive task-driven behavior during assistive interactions. Human body language is an important aspect of communicative nonverbal behavior. Body pose and position can play a vital role in conveying human intent, moods, attitudes and affect. Preliminary experiments show the potential of integrating the proposed body language recognition and classification technique into socially assistive robotic systems partaking in HRI scenarios.
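As a rough illustration of the kind of body-language classification described above (and not the authors' actual system), the sketch below trains a standard supervised classifier to map a handful of hypothetical static upper-body pose features to discrete accessibility levels. The feature names, level labels, and data are placeholders invented for this example.

```python
# Illustrative sketch (not the authors' implementation): classifying coarse
# upper-body pose features into discrete "accessibility" levels with a
# standard supervised classifier. Features, labels, and data are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical static pose features per interaction frame:
# [trunk lean angle (deg), shoulder rotation (deg), arm openness (0-1), head yaw (deg)]
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))        # stand-in for extracted pose features
y = rng.integers(0, 4, size=200)     # stand-in for four accessibility levels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
print("accessibility-level accuracy:", accuracy_score(y_test, pred))
```

In the setting the abstract describes, the inputs would presumably come from a non-contact sensor tracking trunk, arm, and head configuration, and the labels from human-coded accessibility levels rather than the random placeholders used here.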
Journal of Intelligent and Robotic Systems | 2016
Derek McColl; Alexander Hong; Naoaki Hatakeyama; Goldie Nejat; Beno Benhabib
Natural social human–robot interactions (HRIs) require that robots have the ability to perceive and identify complex human social behaviors and, in turn, be able to also display their own behaviors using similar communication modes. Recently, it has been found that body language plays an important role in conveying information about changes in human emotions during human–human interactions. Our work focuses on extending this concept to robotic affective communication during social HRI. Namely, in this paper, we explore the design of emotional body language for our human-like social robot, Brian 2.0. We develop emotional body language for the robot using a variety of body postures and movements identified in human emotion research. To date, only a handful of researchers have focused on the use of robotic body language to display emotions, with a significant emphasis being on the display of emotions through dance. Such emotional dance can be effective for small robots with large workspaces; however, it is not as appropriate for life-sized robots such as Brian 2.0 engaging in one-on-one interpersonal social interactions with a person. Experiments are presented to evaluate the feasibility of the robot’s emotional body language based on human recognition rates. Furthermore, a unique comparison study is presented to investigate the perception of human body language features displayed by the robot with respect to the same body language features displayed by a human actor.
IEEE Transactions on Systems, Man, and Cybernetics | 2017
Derek McColl; Chuan Jiang; Goldie Nejat
In Human-Robot Interactions (HRI), robots should be socially intelligent. They should be able to respond appropriately to human affective and social cues in order to effectively engage in bi-directional communications. Social intelligence would allow a robot to relate to, understand, and interact and share information with people in real-world human-centered environments. This survey paper presents an encompassing review of existing automated affect recognition and classification systems for social robots engaged in various HRI settings. Human-affect detection from facial expressions, body language, voice, and physiological signals are investigated, as well as from a combination of the aforementioned modes. The automated systems are described by their corresponding robotic and HRI applications, the sensors they employ, and the feature detection techniques and affect classification strategies utilized. This paper also discusses pertinent future research directions for promoting the development of socially intelligent robots capable of recognizing, classifying and responding to human affective states during real-time HRI.
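To make the "combination of the aforementioned modes" concrete, here is a minimal, hypothetical sketch of decision-level (late) fusion, one common way multimodal affect recognition systems combine per-modality estimates. The modalities, class labels, probabilities, and weights below are invented for illustration and do not come from the survey.

```python
# Illustrative sketch only: decision-level (late) fusion of per-modality
# affect class probabilities. All values here are hypothetical placeholders.
import numpy as np

# Hypothetical per-modality probabilities over affect classes
# (e.g., [negative, neutral, positive]).
modality_probs = {
    "face": np.array([0.10, 0.30, 0.60]),
    "body": np.array([0.20, 0.50, 0.30]),
    "voice": np.array([0.15, 0.25, 0.60]),
}
# Hypothetical reliability weights per modality.
weights = {"face": 0.5, "body": 0.2, "voice": 0.3}

fused = sum(weights[m] * p for m, p in modality_probs.items())
fused /= fused.sum()  # renormalize to a probability distribution

labels = ["negative", "neutral", "positive"]
print("fused affect estimate:", labels[int(np.argmax(fused))], fused.round(2))
```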
intelligent robots and systems | 2014
Derek McColl; Goldie Nejat
For social robots to be successfully integrated and accepted within society, they need to be able to interpret human social cues that are displayed through natural modes of communication. In particular, a key challenge in the design of social robots is developing the robot's ability to recognize a person's affective states (emotions, moods, and attitudes) in order to respond appropriately during social human-robot interactions (HRIs). In this paper, we present and discuss social HRI experiments we have conducted to investigate the development of an accessibility-aware social robot able to autonomously determine a person's degree of accessibility (rapport, openness) toward the robot based on the person's natural static body language. Specifically, we present two one-on-one HRI experiments to: 1) determine the performance of our automated system in being able to recognize and classify a person's accessibility levels and 2) investigate how people interact with an accessibility-aware robot which determines its own behaviors based on a person's speech and accessibility levels.
robot and human interactive communication | 2012
Wing-Yue Geoffrey Louie; Derek McColl; Goldie Nejat
Our research focuses on the development of a socially assistive robot to provide cognitive and social stimulation during meal-time scenarios in order to promote proper nutrition amongst the elderly. In this paper, we present the design of a novel automated affect recognition and classification system that will allow the robot to interpret natural displays of affective human body language during such one-on-one assistive scenarios. Namely, we identify appropriate body language features and learning-based classifiers that can be utilized for accurate affect estimation. A robot can then utilize this information in order to determine its own appropriate responsive behaviors to keep people engaged in this crucial activity. One-on-one assistive meal-time experiments were conducted with the robot Brian 2.1 and elderly participants at a long-term care facility. The results showed the potential of utilizing the automated affect recognition and classification system to identify and classify natural affective body language features into valence and arousal values using learning-based classifiers. The elderly users displayed a number of affective states, further motivating the use of the affect estimation system.
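As a hedged illustration of mapping body-language features to valence and arousal values with learning-based models (not the system used in the paper), the following sketch fits a multi-output support-vector regressor on placeholder features and targets; the feature set and the valence/arousal scale are assumptions made only for this example.

```python
# Illustrative sketch (not the paper's system): estimating valence and arousal
# from body-language features with learning-based regressors.
# Features and targets below are hypothetical stand-ins.
import numpy as np
from sklearn.svm import SVR
from sklearn.multioutput import MultiOutputRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 6))               # e.g., lean, arm position, head orientation features
y = rng.uniform(-1.0, 1.0, size=(150, 2))   # columns: [valence, arousal] in [-1, 1]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

model = MultiOutputRegressor(SVR(kernel="rbf", C=1.0))
model.fit(X_train, y_train)

valence, arousal = model.predict(X_test[:1])[0]
print(f"estimated valence={valence:.2f}, arousal={arousal:.2f}")
```

In practice, the robot would use such per-frame valence and arousal estimates to select its own responsive behaviors, as the abstract describes.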
ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference | 2011
Derek McColl; Goldie Nejat
Studies have shown that cognitive and social stimulation is crucial to the overall health of older adults, including psychological, cognitive, and physical well-being. However, activities to promote such stimulation are often lacking in long-term care facilities. Our work focuses on the use of social robotic technologies to provide person-centered cognitive interventions. Namely, this paper presents an HRI study with the unique human-like socially assistive robot Brian 2.1, in order to investigate the use and acceptability of the expressive human-like robot by older adults living in a long-term care center. Current studies with social robots for the elderly have been mainly directed towards collecting data on the acceptance and use of animal-like robots. Herein, we aim to determine if the robot's human-like assistive and social characteristics result in the elderly having positive attitudes towards the robot as well as accepting it as an interactive cognitive training tool.