Eduardo Salas
Old Dominion University
Publications
Featured research published by Eduardo Salas.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1990
Donald L. Lassiter; Jeremy S. Vaughn; Virginia E. Smaltz; Ben B. Morgan; Eduardo Salas
The purpose of this research was to examine the effects of different types of training interventions on team communication. Forty-five two-person teams viewed one of three training videotapes (control, knowledge, or skills) and then performed a low-fidelity helicopter simulation exercise. Trained raters evaluated each team's level of communication and mission performance. Finally, attitudes concerning aircrew coordination were measured before and after training using the Cockpit Management Attitude Questionnaire (CMAQ) developed by Helmreich, Wilhelm, and Gregorich (1988). Results indicated that team communication skills were affected by the type of training intervention the team received; specifically, the communication of teams in the skills group was significantly better than that of the other two groups. No significant effect of training intervention was found for the mission performance or attitude data, although a significant correlation was found between team communication performance and team mission performance. These results are discussed in terms of their relevance to the design of interventions for enhancing team training and performance.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1993
Renée J. Stout; Eduardo Salas
Critical decisions are made every day by teams of individuals who must coordinate their activities to achieve effectiveness. Recently, researchers have suggested that shared mental models among team members may help them to make successful decisions. Several avenues for training shared mental models in teams exist, one of which is training in planning behaviors. The relationship between team planning, team shared mental models, and coordinated team decision making and performance is explored.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1989
Renée J. Stout; Jan Cannon-Bowers; Ben B. Morgan; Eduardo Salas
Operational studies have revealed a need to focus attention on team training and on the effective teamwork skills required for successful performance in training. The present study was designed to develop an assessment scale that instructors in a variety of training situations can use to measure the degree of teamwork their situations require. Data obtained with the scale show that it is psychometrically sound (high internal consistency and high item-total correlations) and provide initial evidence of its validity (the ability to distinguish training situations by the extent of teamwork required). Recommendations for future research are also discussed.
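As a rough illustration of the psychometric checks mentioned above, the sketch below (a hypothetical example, not the paper's analysis; the data and item counts are invented) computes Cronbach's alpha as an internal-consistency index and corrected item-total correlations for a small matrix of instructor ratings.

```python
# Hypothetical sketch: internal consistency and corrected item-total
# correlations for a teamwork-requirement rating scale. Data are simulated;
# nothing here reproduces the paper's actual scale or sample.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scale ratings."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the sum of the remaining items."""
    k = items.shape[1]
    out = np.empty(k)
    for j in range(k):
        rest = np.delete(items, j, axis=1).sum(axis=1)
        out[j] = np.corrcoef(items[:, j], rest)[0, 1]
    return out

# Example: 10 instructors rate 6 items on the degree of teamwork required.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(10, 6)).astype(float)
print(f"alpha = {cronbach_alpha(ratings):.2f}")
print("item-total r:", np.round(corrected_item_total(ratings), 2))
```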
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1987
Eduardo Salas; Ben B. Morgan; Albert S. Glickman
Several models of team development were synthesized from the team performance/team training literature as the basis for a working model of Team Evolution And Maturation (TEAM). The TEAM methodology is designed to investigate the development of teamwork during training of operational teams. The TEAM model suggests that the life cycle of a team consists of as many as seven developmental stages. The theoretical foundations and description of the model are discussed as well as its relevance to team training.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1999
Maureen L. Bergondy; Eduardo Salas
Several important questions and challenges face the cognitive and training research communities in developing effective strategies for training large distributed teams: (1) developing a conceptual framework or model of distributed team performance (are there unique features of distributed teams, and how do we represent the behavioral and cognitive aspects of these complex entities?); (2) developing effective training strategies for large distributed teams (how should training opportunities be structured to maximize the development of team skills and knowledge?); (3) constructing an effective distributed training environment; and (4) approaching performance measurement and feedback in distributed environments (what are the methodological, logistical, and technological challenges?). The panel's objective is to raise important issues about distributed team training and to discuss what is new, what is different, and what we already know. We will discuss the challenges faced in developing conceptual frameworks, team models, training strategies, training environments, and performance measurement for distributed teams.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1984
Elizabeth W. Pitts; Eduardo Salas; Michele Terranova; Gary L. Allen; Ben B. Morgan
The Armed Forces and other large organizations often use scores from standardized aptitude batteries as indicators of cognitive aptitude. However, aptitude may also be demonstrated by the learning that occurs during training and measured by parameters such as initial ability level and the time needed to acquire information or skills. Learning rate parameters derived from computer-administered Complex Experimental Learning Tasks (CELTs) have recently proved to be pragmatic as well as theoretical indicators of final performance. Potential advantages of this approach include the ease and economy of computer administration, examinee acceptance of job-relevant tests, and the potential benefits of shortened training schedules. The current research compared rate measures derived from learning on four CELTs with a paper-and-pencil battery designed to include static aptitude measures of the same domains. Performance measures were computed from stimulus display times, subject response times, and item accuracy, and overall final performance was computed as the average of the last five minutes. Correlational and regression analyses indicate that, with some qualifications, learning rate measures are predictors of individual and overall levels of performance on each CELT. These findings imply that the current practice of using static aptitude tests for selection into training programs may not provide the most accurate picture of an individual's potential success or failure in that program, and that, given the trend towards new computer-assisted training technologies, individuals may be selected on the basis of their potential for rapid learning, making use of the least expensive and most efficient training methods possible.
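To make the rate-based idea concrete, here is a minimal sketch under stated assumptions: the paper does not specify its model, so the example simply takes an individual's learning rate to be the slope of accuracy against the log of trial number and checks how well that rate predicts end-of-session performance. The data are simulated and all names are hypothetical.

```python
# Hypothetical sketch of a rate-based predictor on a CELT-like task:
# estimate each examinee's learning rate from trial-by-trial accuracy,
# then correlate the rate with final performance (mean of the last trials,
# standing in for "the average of the last five minutes").
import numpy as np

def learning_rate(accuracy_by_trial: np.ndarray) -> float:
    """Slope of accuracy against log(trial number): a simple rate-of-learning index."""
    trials = np.arange(1, len(accuracy_by_trial) + 1)
    slope, _intercept = np.polyfit(np.log(trials), accuracy_by_trial, deg=1)
    return slope

rng = np.random.default_rng(1)
n_people, n_trials = 30, 60
true_rates = rng.uniform(0.02, 0.12, n_people)             # individual differences in learning speed
accuracy = (0.4 + true_rates[:, None] * np.log(np.arange(1, n_trials + 1))
            + rng.normal(0, 0.05, (n_people, n_trials)))   # noisy simulated learning curves

rates = np.array([learning_rate(a) for a in accuracy])
final_perf = accuracy[:, -10:].mean(axis=1)                 # end-of-session performance
r = np.corrcoef(rates, final_perf)[0, 1]
print(f"learning rate vs. final performance: r = {r:.2f}")
```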
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1983
Robert J. Jones; Eduardo Salas; Elizabeth W. Pitts; Gary L. Allen; Ben B. Morgan
The increasing technological sophistication of organizations and the concomitant requirement for individuals to process large quantities of information have raised questions concerning the efficacy of traditional approaches to performance assessment within the personnel subsystem. Traditional approaches to aptitude testing have focused on relating psychometric profiles to subsequent job performance. These approaches are subject to criticism in that they use static measures (test scores) to predict the dynamic processes of learning and are heavily influenced by prior experience without reflecting the ability to acquire new information. An alternative to traditional psychometric approaches to personnel selection is the use of rate parameters (reflecting change in performance over time) derived from relatively complex cognitive tasks as predictors of training success. The use of microprocessor-administered tasks to assess cognitive skills and abilities is an integral part of this alternative approach, and the use of computer-administered Complex Experimental Learning Tasks (CELTs) for the assessment of learning abilities illustrates the computerized, rate-based approach to performance assessment. The purpose of this paper is to describe an initial study of the predictive value of learning rate measures for performance assessment and to suggest needed future developments in training as well as job performance research.
Archive | 1989
Randall L. Oser; G. A. McCallum; Eduardo Salas; Ben B. Morgan
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1998
Renée J. Stout; Eduardo Salas; Gary Klein; Marvin S. Cohen; Judith Orasanu
Archive | 1992
Renée J. Stout; Carolyn Prince; David P. Baker; Maureen L. Bergondy; Eduardo Salas