Vincent Mancuso
Massachusetts Institute of Technology
Publications
Featured research published by Vincent Mancuso.
European Journal of Work and Organizational Psychology | 2015
Susan Mohammed; Katherine Hamilton; Rachel Tesler; Vincent Mancuso; Michael D. McNeese
Although often ignored, establishing and maintaining congruence in team members’ temporal perceptions are consequential tasks that deserve research attention. Integrating research on team cognition and temporality, this study operationalized the notion of a temporal team mental model (TMM) at two points in time using two measurement methods. Ninety-eight three-person teams participated in a computerized team simulation designed to mimic emergency crisis management situations in a distributed team environment. The results showed that temporal TMMs measured via concept maps and pairwise ratings each contributed uniquely and positively to team performance beyond traditionally measured taskwork and teamwork content domains. In addition, temporal TMMs assessed later in teams’ development exerted stronger effects on team performance than those assessed earlier. The results provide support for the continued examination of temporal TMM similarity in future research.
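The pairwise-rating approach to team mental models is a standard technique in this literature; as a rough illustration only (not necessarily this study's exact analysis), a similarity score can be obtained by correlating each pair of members' relatedness ratings and averaging across dyads. The concept pairs, rating scale, and use of Pearson correlation below are assumptions made for the sketch.

```python
# Illustrative sketch (not the study's exact procedure): team mental model
# similarity from pairwise relatedness ratings. Each member rates how related
# every pair of task concepts is; similarity is the mean Pearson correlation
# across all member dyads.
from itertools import combinations
import numpy as np

def tmm_similarity(ratings):
    """ratings: list of 1-D arrays, one per team member, each holding that
    member's relatedness ratings for the same ordered set of concept pairs."""
    correlations = [
        np.corrcoef(a, b)[0, 1] for a, b in combinations(ratings, 2)
    ]
    return float(np.mean(correlations))

# Hypothetical three-person team rating five concept pairs on a 1-5 scale.
team = [
    np.array([5, 4, 2, 1, 3]),
    np.array([4, 4, 1, 2, 3]),
    np.array([5, 3, 2, 1, 4]),
]
print(f"Mean dyadic similarity: {tmm_similarity(team):.2f}")
```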
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2012
Vincent Mancuso; Michael D. McNeese
This paper describes an experiment to study the effects of varying knowledge structures on distributed team cognition. Using the teamNETS simulation, integrated and differentiated knowledge structures were manipulated by varying the reference materials the participants received during training. While the two knowledge structures had no direct effect on team performance, effects did emerge for collaborative processes and team perceptions. Specifically, the results showed that teams with differentiated structures worked more independently of one another, relying on simple coordination of their actions and minimal communication, whereas teams with integrated structures worked more interdependently, with much tighter collaboration and frequent communication.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2010
Katherine Hamilton; Vincent Mancuso; Dev Minotra; Rachel Hoult; Susan Mohammed; Alissa Parr; Gaurav Dubey; Eric McMillan; Michael D. McNeese
This paper provides a detailed explanation of the link between NeoCITIES, a crisis management simulation of emergency response teams, and team cognition. Descriptions of the NeoCITIES simulation structure, interface, and modifications are provided, along with its functionality in effectively studying team cognition. The paper focuses on three commonly examined constructs within the team cognition literature, namely, team situation awareness, team mental models, and information sharing.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2016
Alex Z. Vieane; Gregory J. Funke; Robert S. Gutzwiller; Vincent Mancuso; Ben D. Sawyer; Christopher D. Wickens
Cyber security is a high-ranking national priority that is only likely to grow as we become more dependent on cyber systems. From a research perspective, currently available work often focuses solely on technological aspects of cyber, acknowledging the human in passing, if at all. In recent years, the Human Factors community has begun to address human-centered issues in cyber operations, but in comparison to technological communities, we have only begun to scratch the surface. Even with publications on cyber human factors gaining momentum, there remains a major gap between understanding of the domain and the research currently available to address relevant issues. The purpose of this panel is to continue to expand the role of human factors in cyber research by introducing the community to current work being done, and to facilitate collaborations to drive future research. We have assembled a panel of scientists across multiple specializations in the human factors community to have an open discussion regarding how to leverage previous human factors research and current work in cyber operations to continue to push the bounds of the field.
EAI Endorsed Transactions on Security and Safety | 2013
Michael Tyworth; Nicklaus A. Giacobe; Vincent Mancuso; Michael D. McNeese; David L. Hall
In this paper we argue for a human-in-the-loop approach to the study of situation awareness in computer defence analysis (CDA). The cognitive phenomenon of situation awareness (SA) has received significant attention in cybersecurity/CDA research. Yet little of this work has attended to the cognitive aspects of situation awareness in the CDA context; instead, the human operator has been treated as an abstraction within the larger human-technology system. A more human-centric approach that seeks to understand the socio-cognitive work of human operators as they perform CDA will yield greater insights into the design of tools and interfaces for CDA. As support for this argument, we present our own work employing the Living Lab Framework through which we ground our experimental findings in contextual knowledge of real-world practice.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2016
Alex Z. Vieane; Gregory J. Funke; Vincent Mancuso; Eric T. Greenlee; Gregory Dye; Brett J. Borghetti; Brent Miller; Lauren Menke; Rebecca Brown
Cyber network analysts must gather evidence from multiple sources and ultimately decide whether or not suspicious activity represents a threat to network security. Information relevant to this task is usually presented in an uncoordinated fashion, meaning analysts must manually correlate data across multiple databases. The current experiment examined whether analyst performance efficiency would be improved by coordinated displays, i.e., displays that automatically link relevant information across databases. We found that coordinated displays nearly doubled performance efficiency, in contrast to the standard uncoordinated displays, and coordinated displays resulted in a modest increase in threat detections. These results demonstrate that the benefits of coordinated displays are significant enough to recommend their inclusion in future cyber defense software.
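The abstract does not specify how the coordinated displays were implemented; as a hypothetical sketch of the underlying idea, the snippet below links records across two data sources on a shared key (a source IP address) so that selecting an alert surfaces the related entries automatically rather than requiring a manual query of each database. The table names, fields, and values are invented for illustration.

```python
# Hypothetical sketch of the linkage a coordinated display automates:
# given a selected alert, pull the matching rows from another data source
# by a shared key (source IP), instead of querying each database by hand.
import pandas as pd

alerts = pd.DataFrame({
    "alert_id": [101, 102],
    "src_ip": ["10.0.0.5", "10.0.0.9"],
    "signature": ["port scan", "malware beacon"],
})
firewall_log = pd.DataFrame({
    "src_ip": ["10.0.0.5", "10.0.0.5", "10.0.0.7"],
    "action": ["deny", "deny", "allow"],
    "dst_port": [22, 445, 80],
})

def linked_view(alert_id):
    """Return the selected alert joined with its related firewall entries."""
    selected = alerts[alerts["alert_id"] == alert_id]
    return selected.merge(firewall_log, on="src_ip", how="left")

print(linked_view(101))
```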
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2014
Michael D. McNeese; Vincent Mancuso; Nathan J. McNeese; Tristan Endsley; Pete Forster
Teamwork has become one of the hallmarks of emergency crisis management (ECM). Success in managing emergency situations is highly dependent on teams working together to accomplish prioritized goals. Given the importance of teamwork, team cognition has therefore come to be recognized as an important component in addressing the emerging complexity, extreme workload, and uncertain conditions that can underlie emergency response. Many variables affect teams and their subsequent cognition. Understanding the effects of awareness, attention, temporality, common ground, team mental model development, and culture on team cognition provides insight into effective and efficient management of emergencies. As a research group, for more than a decade, we have studied team cognition within the context of ECM through simulations built on the NeoCITIES platform. The purpose of this paper is to share our experiences using the NeoCITIES platform to conduct basic team cognition research and to share our visions for future research trajectories for the greater Human Factors community.
International Conference on Human-Computer Interaction | 2013
Gregory J. Funke; Benjamin A. Knott; Vincent Mancuso; Adam J. Strang; Justin Estepp; Lauren Menke; Rebecca Brown; Allen W. Dukes; Brent Miller
Assessment of mental workload is an important aspect of many human factors and HCI applications. Not surprisingly, a number of workload measures have been proposed. This study examined the sensitivity, convergent validity, and concurrent validity of several subjective self-report and EEG workload measures. Most measures displayed adequate sensitivity to task difficulty manipulations, but relatively modest convergent and concurrent validity. Overall, we believe these results will aid human factors practitioners in selecting measures of workload for varied applications.
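Convergent validity in this kind of study is typically assessed by intercorrelating the candidate workload measures across participants or conditions; the sketch below shows that general approach with invented scores. The specific measures, column names, and data are assumptions for illustration, not the study's own.

```python
# Illustrative only: convergent validity as the intercorrelation of several
# workload measures collected from the same participants. Hypothetical data.
import pandas as pd

scores = pd.DataFrame({
    "subjective_a": [30, 55, 70, 45, 80],        # e.g., one self-report scale
    "subjective_b": [28, 60, 65, 50, 75],        # a second self-report scale
    "eeg_index":    [0.2, 0.5, 0.7, 0.4, 0.8],   # an EEG-derived workload index
})

# Pairwise Pearson correlations; higher off-diagonal values indicate stronger
# convergent validity among the measures.
print(scores.corr(method="pearson").round(2))
```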
Proceedings of SPIE | 2013
Michael D. McNeese; Vincent Mancuso; Nathan J. McNeese; Tristan Endsley; Pete Forster
The preparation of next generation analyst work requires alternative levels of understanding and new methodological departures from the way current work transpires. Current work practices typically do not provide a comprehensive approach that emphasizes the role of and interplay between (a) cognition, (b) emergent activities in a shared situated context, and (c) collaborative teamwork. In turn, effective and efficient problem solving fails to take place, and practice is often composed of piecemeal, techno-centric tools that isolate analysts by providing rigid, limited levels of situation awareness. This, coupled with the fact that many analyst activities are classified, produces a challenging situation for researching such phenomena and for designing and evaluating systems to support analyst cognition and teamwork. Through our work with cyber, image, and intelligence analysts we have realized that more is required of researchers studying human-centered design if analysts’ needs are to be met in a timely fashion. This paper identifies and describes how The Living Laboratory Framework can be utilized as a means to develop a comprehensive, human-centric, and problem-focused approach to next generation analyst work, design, and training. We explain how the framework is utilized for specific cases in various applied settings (e.g., crisis management analysis, image analysis, and cyber analysis) to demonstrate its value and power in addressing an area of utmost importance to our national security. Attributes of analyst work settings are delineated to suggest potential design affordances that could help improve cognitive activities and awareness. Finally, the paper puts forth a research agenda for applying the framework in future work so that the analyst profession can viably address the concerns identified.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2011
Vincent Mancuso; Katherine Hamilton; Eric McMillan; Rachel Tesler; Susan Mohammed; Michael D. McNeese
In this paper we describe a methodology for utilizing team mental models as a basis for evaluating the usability and utility of collaborative systems. We present a case study of the evaluation of team mental models within the NeoCITIES 3.1 simulation. Paired comparison ratings, one of the most widely used methods of team mental model assessment, were used to capture the team members’ taskwork-related knowledge, which was then compared to ratings from subject matter experts. These analyses were the driving force behind several design modifications in the NeoCITIES interface. We discuss the limitations of the method and its implications within the scope of collaborative systems evaluation and the field of HCI.
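As a hedged illustration of the evaluation logic described above (not the authors' exact analysis), the sketch below compares a team's averaged pairwise ratings against a subject-matter-expert referent and flags the concept pairs with the largest disagreement as candidate usability issues. The pair labels, ratings, and threshold of two flagged pairs are all hypothetical.

```python
# Hypothetical sketch: compare team-aggregate pairwise ratings to an SME
# referent and flag the concept pairs where the team diverges most.
import numpy as np

pair_labels = ["dispatch-triage", "dispatch-report", "triage-report",
               "report-map", "map-dispatch"]
team_mean = np.array([4.3, 2.0, 3.7, 1.5, 4.0])    # team's averaged ratings
sme_ratings = np.array([4.0, 4.0, 3.5, 1.0, 4.5])  # expert referent ratings

# Overall agreement between the team's shared understanding and the experts'.
accuracy = np.corrcoef(team_mean, sme_ratings)[0, 1]
print(f"Team-SME agreement (Pearson r): {accuracy:.2f}")

# Largest absolute disagreements point at task relationships the interface
# may not be conveying clearly.
gaps = np.abs(team_mean - sme_ratings)
for i in np.argsort(gaps)[::-1][:2]:
    print(f"Review: {pair_labels[i]} (gap = {gaps[i]:.1f})")
```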