Network

Latest external collaborations at the country level.

Hotspot

Research topics where Marissa McCoy is active.

Publication

Featured research published by Marissa McCoy.


human-robot interaction | 2015

Comparing Models of Disengagement in Individual and Group Interactions

Iolanda Leite; Marissa McCoy; Daniel Ullman; Nicole Salomons; Brian Scassellati

Changes in the type of interaction (e.g., individual vs. group interactions) can potentially impact data-driven models developed for social robots. In this paper, we provide a first investigation into the effects of changing group size in data-driven models for HRI by analyzing how a model trained on data collected from participants interacting individually performs on test data collected from group interactions, and vice-versa. Another model combining data from both individual and group interactions is also investigated. We perform these experiments in the context of predicting disengagement behaviors in children interacting with two social robots. Our results show that a model trained with group data generalizes better to individual participants than the other way around. The mixed model seems a good compromise, but it does not achieve the performance levels of the models trained for a specific type of interaction.
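
The cross-condition evaluation described above can be pictured with a minimal sketch (not the authors' code): train a disengagement classifier on data from individual interactions, test it on group-interaction data, and vice versa, with a third model fit on the combined data. The feature set and classifier choice below are illustrative assumptions.

from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score

def evaluate(train_X, train_y, test_X, test_y):
    # Fit a classifier on one interaction type and score it on the other.
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(train_X, train_y)
    return f1_score(test_y, clf.predict(test_X))

# Hypothetical usage: individual_X / group_X hold per-frame features
# (e.g., gaze and posture cues) and *_y hold disengagement labels.
# ind_to_group = evaluate(individual_X, individual_y, group_X, group_y)
# group_to_ind = evaluate(group_X, group_y, individual_X, individual_y)
# The mixed model concatenates both training sets before fitting.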


human-robot interaction | 2015

Emotional Storytelling in the Classroom: Individual versus Group Interaction between Children and Robots

Iolanda Leite; Marissa McCoy; Monika Lohani; Daniel Ullman; Nicole Salomons; Charlene K. Stokes; Susan E. Rivers; Brian Scassellati

Robot assistive technology is becoming increasingly prevalent. Despite the growing body of research in this area, the role of the type of interaction (i.e., small groups versus individual interactions) on the effectiveness of interventions is still unclear. In this paper, we explore a new direction for socially assistive robotics, where multiple robotic characters interact with children in an interactive storytelling scenario. We conducted a between-subjects repeated interaction study where a single child or a group of three children interacted with the robots in an interactive narrative scenario. Results show that although the individual condition increased participants' story recall abilities compared to the group condition, the emotional interpretation of the story content seemed to depend more on the difficulty level than on the study condition. Our findings suggest that, regardless of the type of interaction, interactive narratives with multiple robots are a promising approach to foster children's development of social-related skills.


Comparative Education Review | 2015

Cumulative Risk and Teacher Well-Being in the Democratic Republic of the Congo

Sharon Wolf; Catalina Torrente; Marissa McCoy; Damira Rasheed; J. Lawrence Aber

Remarkably little systematic research has examined the living and working conditions of teachers in sub-Saharan Africa and how such conditions predict teacher well-being. This study assesses how various risks across several domains of teachers' lives, measured as a cumulative risk index, predict motivation, burnout, and job dissatisfaction in the Katanga province of the Democratic Republic of the Congo. Cumulative risk is related to lower motivation and higher burnout levels, and the relationship between cumulative risk and burnout is moderated by years of teaching experience. Specifically, less experienced teachers report the highest levels of burnout regardless of their level of cumulative risk. Experienced teachers with low cumulative risk scores report the lowest levels of burnout, and burnout increases with higher levels of cumulative risk, suggesting that burnout decreases with experience, but not for teachers who experience more risk factors. Implications for research and education policy in low-income and conflict-affected countries are discussed.
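
As a hedged illustration of the moderation analysis described above (not the study's actual code or variable names), the cumulative risk index, years of teaching experience, and their interaction can be entered into an ordinary least squares regression predicting burnout:

import statsmodels.formula.api as smf

def fit_moderation_model(df):
    # Burnout predicted by cumulative risk, moderated by teaching experience;
    # column names are illustrative assumptions.
    model = smf.ols("burnout ~ cumulative_risk * years_experience", data=df)
    return model.fit()

# A significant cumulative_risk:years_experience term would indicate that the
# risk-burnout relationship depends on experience, as the abstract reports.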


Archive | 2017

A Framework for Human-Agent Social Systems: The Role of Non-technical Factors in Operation Success

Monika Lohani; Charlene K. Stokes; Natalia Dashan; Marissa McCoy; Christopher A. Bailey; Susan E. Rivers

We present a comprehensive framework identifying the human, agent, and environmental factors that impact human-agent team building. This framework integrates existing empirical work in organizational behavior, non-technical training, and human-agent interaction to support successful human-agent operations. We conclude by discussing implications and next steps to evaluate and expand our framework, with the aim of guiding future attempts to create efficient human-agent teams and improve mission outcomes.


human-robot interaction | 2016

Social Interaction Moderates Human-Robot Trust-Reliance Relationship and Improves Stress Coping

Monika Lohani; Charlene K. Stokes; Marissa McCoy; Christopher A. Bailey; Susan E. Rivers

Previous work on non-social human-robot interaction has found no link between trust and reliance [1]. The current study tested the question: can social interactions moderate the trust-reliance relationship? Human-robot interactions may share characteristics with social and emotional interactions between humans. We investigated how social and emotional human-robot interactions moderate the trust-reliance relationship and impact perceived stress-coping abilities. In the experimental condition, social and emotional interactions were used to guide the dialogue between a participant and a virtual robot in order to promote team building. In the matched control condition, the interactions were information-focused, without social or emotional content. We show that social interaction moderated the effect of trust on reliance such that higher trust led to greater reliance on the robot. The experimental condition also yielded higher perceived stress-coping abilities. These findings contribute to the existing literature and suggest that creating deeper social and emotional interactions with a robot teammate can facilitate human-robot partnership.


Frontiers in Robotics and AI | 2017

Narratives with Robots: The Impact of Interaction Context and Individual Differences on Story Recall and Emotional Understanding

Iolanda Leite; Marissa McCoy; Monika Lohani; Daniel Ullman; Nicole Salomons; Charlene K. Stokes; Susan E. Rivers; Brian Scassellati

Role-play scenarios have been considered a successful learning space for children to develop their social and emotional abilities. In this paper, we investigate whether socially assistive robots in role-playing settings are as effective with small groups of children as they are with a single child, and whether individual factors such as gender, grade level (first vs. second), perception of the robots (peer vs. adult), and empathy level (low vs. high) play a role in these two interaction contexts. We conducted a three-week repeated exposure experiment in which 40 children interacted with socially assistive robotic characters that acted out interactive stories around words that expand children's emotional vocabulary. Our results showed that although participants who interacted alone with the robots recalled the stories better than participants in the group condition, no significant differences were found in children's emotional interpretation of the narratives. With regard to individual differences, we found that a single-child setting appeared more appropriate for first graders than a group setting, that empathy level is an important predictor of emotional understanding of the narratives, and that children's performance varies depending on their perception of the robots (peer vs. adult) in the two conditions.


robot and human interactive communication | 2016

Autonomous disengagement classification and repair in multiparty child-robot interaction

Iolanda Leite; Marissa McCoy; Monika Lohani; Nicole Salomons; Kara McElvaine; Charlene K. Stokes; Susan E. Rivers; Brian Scassellati

As research on robotic tutors increases, it becomes more relevant to understand whether and how robots will be able to keep students engaged over time. In this paper, we propose an algorithm to monitor engagement in small groups of children and trigger disengagement repair interventions when necessary. We implemented this algorithm in a scenario where two robot actors play out interactive narratives around emotional words and conducted a field study where 72 children interacted with the robots three times in one of the following conditions: control (no disengagement repair), targeted (interventions addressing the child with the highest disengagement level) and general (interventions addressing the whole group). Surprisingly, children in the control condition had higher narrative recall than in the two experimental conditions, but no significant differences were found in the emotional interpretation of the narratives. When comparing the two different types of disengagement repair strategies, participants who received targeted interventions had higher story recall and emotional understanding, and their valence after disengagement repair interventions increased over time.
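
The abstract does not give the algorithm's details, but the trigger logic it describes can be sketched roughly as follows, assuming per-child disengagement scores on a 0-1 scale and an arbitrary threshold (both assumptions):

from typing import Dict, Optional

DISENGAGEMENT_THRESHOLD = 0.6  # assumed scale: 0 (engaged) to 1 (disengaged)

def choose_repair(scores: Dict[str, float], condition: str) -> Optional[str]:
    # Return a repair intervention, or None if every child is still engaged
    # or the session runs in the control condition.
    if condition == "control" or max(scores.values()) < DISENGAGEMENT_THRESHOLD:
        return None
    if condition == "targeted":
        most_disengaged = max(scores, key=scores.get)
        return f"address {most_disengaged} directly"
    return "address the whole group"  # "general" condition

# choose_repair({"child_a": 0.2, "child_b": 0.8}, "targeted")
# -> "address child_b directly"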


intelligent virtual agents | 2017

Do We Need Emotionally Intelligent Artificial Agents? First Results of Human Perceptions of Emotional Intelligence in Humans Compared to Robots

Lisa Fan; Matthias Scheutz; Monika Lohani; Marissa McCoy; Charlene K. Stokes

Humans are very apt at reading emotional signals in other humans and even in artificial agents, which raises the question of whether artificial agents need to be emotionally intelligent to ensure effective social interactions. Artificial agents without emotional intelligence might generate behavior that is misinterpreted, unexpected, and confusing to humans, violating human expectations and possibly causing emotional harm. Surprisingly, there is a dearth of investigations aimed at understanding the extent to which artificial agents need emotional intelligence for successful interactions. Here, we present the first study of the perception of emotional intelligence (EI) in robots vs. humans. The objective was to determine whether people view robots as more or less emotionally intelligent when exhibiting behaviors similar to those of humans, and to investigate which verbal and nonverbal communication methods were most crucial for human observational judgments. Study participants were shown a scene in which either a robot or a human behaved with either high or low empathy, and were then asked to evaluate the agent's emotional intelligence and trustworthiness. The results showed that participants could consistently distinguish the high-EI condition from the low-EI condition regardless of which communication methods were observed, and that whether the agent was a robot or a human had no effect on this perception. We also found that, relative to low-EI conditions, high-EI conditions led to greater trust in the agent, which implies that we must design robots to be emotionally intelligent if we wish users to trust them.


human-robot interaction | 2017

The Impact of Non-Technical Skills on Trust and Stress

Monika Lohani; Charlene K. Stokes; Kevin Oden; Spencer J. Frazier; Kevin J. Landers; Patrick L. Craven; Durrell V. Lawton; Marissa McCoy; David J. Macannuco

We present a case study that examined the impact of infusing non-technical skills into the interactions between an expert analyst population and humanoid virtual robot teammates while executing a stressful real-world mission. In the experimental group, a virtual robot employed non-technical skills in addition to the technical skills that were the only skills present in the control condition. We show that integration of these non-technical skills (rapport, cooperation, and collaboration) led to higher trustworthiness and trust in the virtual robot in the experimental group than in the control group. Furthermore, the experimental group perceived lower threat-related appraisals than the control group. These findings contribute to the existing literature by suggesting that fostering non-technical skills with a virtual robot can serve as a bonding mechanism for human-robot teams, promoting trustworthiness and trust and reducing stress in high-stakes scenarios.


robot and human interactive communication | 2016

Perceived role of physiological sensors impacts trust and reliance on robots

Monika Lohani; Charlene K. Stokes; Marissa McCoy; Christopher A. Bailey; Aditi Joshi; Susan E. Rivers

Collaboration


Dive into Marissa McCoy's collaborations.

Top Co-Authors

Charlene K. Stokes

Air Force Research Laboratory
