Publications


Featured research published by Diane Mitchell.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2009

Workload Warriors: Lessons Learned from a Decade of Mental Workload Prediction Using Human Performance Modeling

Diane Mitchell; Charneta Samms

For at least a decade, researchers at the Army Research Laboratory (ARL) have predicted mental workload using human performance modeling (HPM) tools, primarily IMPRINT. During this time, their projects have matured from simple models of human behavior to complex analyses of the interactions between system design and human behavior. As part of this maturation process, the researchers learned: 1) to develop a modeling question that incorporates all aspects of workload, 2) to determine when workload is most likely to affect performance, 3) to build multiple models to represent experimental conditions, 4) to connect performance predictions to an overall mission or system capability, and 5) to present results in a clear, concise format. By implementing the techniques they developed from these lessons learned, the researchers have had an impact on major Army programs with their workload predictions. Specifically, they have successfully changed design requirements for future concept Army vehicles, substantiated manpower requirements for fielded Army vehicles, and made Soldier workload the number one item during preliminary design review for a major Army future concept vehicle program. The effective techniques the ARL researchers developed for their IMPRINT projects are applicable to other HPM tools. In addition, students and researchers confronted with similar problems in their own human performance modeling projects can use these techniques to achieve project success.
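
The core idea behind this kind of workload prediction can be illustrated with a small sketch: sum the demand values of tasks that are active at the same time and flag combinations that exceed a threshold. This is only a hedged illustration of the general technique; the task names, demand values, and threshold below are hypothetical and are not taken from IMPRINT.

```python
# Hedged sketch: flag moments of potential mental overload by summing
# the workload demands of tasks that are active at the same time.
# Task timings, demand values, and the threshold are illustrative only.

tasks = [
    # (name, start time, end time, workload demand on an arbitrary scale)
    ("drive",          0, 60, 3.0),
    ("monitor radio", 10, 40, 2.5),
    ("scan terrain",  20, 50, 3.5),
]

OVERLOAD_THRESHOLD = 7.0  # hypothetical limit, not an IMPRINT value

def overload_intervals(tasks, threshold):
    """Return (time, total demand, active task names) at each event
    time where the summed demand of concurrent tasks exceeds the
    threshold."""
    events = sorted({t for _, s, e, _ in tasks for t in (s, e)})
    flagged = []
    for t in events:
        active = [(n, w) for n, s, e, w in tasks if s <= t < e]
        total = sum(w for _, w in active)
        if total > threshold:
            flagged.append((t, total, [n for n, _ in active]))
    return flagged

for time, demand, names in overload_intervals(tasks, OVERLOAD_THRESHOLD):
    print(f"t={time}: demand {demand} from {names}")
```

In this toy timeline, all three tasks overlap between t=20 and t=40, so the combined demand exceeds the threshold there and that combination is flagged for redesign attention.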


Collaboration Technologies and Systems | 2008

Using modeling as a lens to focus testing

Diane Mitchell; Kaleb McDowell

Designers of collaborative unmanned systems assume that they have designed their systems to achieve the goals of reduced Soldier workload and higher-level situation awareness (SA). Whether this assumption is valid, however, depends upon how Soldiers interact with the systems as they accomplish their military missions. To evaluate their system designs, designers of collaborative unmanned systems use field experiments in which Soldiers interact with the systems. These experiments, however, are expensive, and obtaining all the technologies and Soldiers required to perform an entire military mission is challenging. Researchers at the Army Research Laboratory (ARL) and the Tank Automotive Research, Development and Engineering Center (TARDEC) have established an approach that is effective in overcoming these constraints. To represent the complete military mission, they use human performance modeling. To evaluate the impact of interface specifications on Soldiers, they conduct experiments that incorporate issues identified by the mission modeling. ARL and TARDEC demonstrated the effectiveness of this approach when they used human performance modeling of future concept combat vehicles to focus a series of TARDEC autonomous vehicle experiments on critical Soldier survivability issues. The experiments, in turn, demonstrated a way of mitigating some of the Soldier performance issues related to unmanned asset operations that the human performance modeling had identified.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2011

Both Sides of the Coin Technique for Integrating Human Factors and Systems Engineering in System Development

Diane Mitchell; Kevin Agan; Charneta Samms

Army analysts have developed an integrated technique, called the Human Availability Technique (HAT), for optimizing system design early in the system development cycle. HAT combines a systems engineering technique, the System Capabilities Analysis Process (SCAP), with a human factors engineering technique, the Cognitive Building Block (CBB) Approach. Analysts implement HAT by creating human performance models that include both common Soldier functions from the CBB and system capabilities from SCAP. To demonstrate the benefits of this approach for a complex system, analysts completed a proof-of-concept HAT analysis of a conceptual 2-Soldier utility vehicle design. This case study demonstrated the benefits of including a detailed representation of both Soldier and system functions in one analysis.


International Conference on Engineering Psychology and Cognitive Ergonomics | 2009

Harnessing the Power of Multiple Tools to Predict and Mitigate Mental Overload

Charneta Samms; David Jones; Kelly S. Hale; Diane Mitchell

Predicting the effect of system design decisions on operator performance is challenging, particularly when a system is in the early stages of development. Tools such as the Improved Performance Research Integration Tool (IMPRINT) have been used successfully to predict operator performance by identifying task/design combinations that lead to potential mental overload. Another human performance modeling tool, the Multimodal Interface Design Support (MIDS) tool, allows system designers to input their system specifications to identify points of mental overload, and provides multi-modal design guidelines that could help mitigate the overload identified. Army Research Laboratory (ARL) analysts recognized the complementary nature of the two tools. The ability of IMPRINT to stochastically identify task combinations leading to overload, combined with the power of MIDS to address overload conditions with workload mitigation strategies, led ARL to sponsor a proof-of-concept integration of the two tools. This paper demonstrates the value of low-cost prototyping to combine associated technologies and amplify the utility of both systems. The added capabilities of the integrated IMPRINT/MIDS system are presented along with future development plans.
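
The "stochastic identification" idea mentioned here can be sketched with a simple Monte Carlo simulation: when task durations are uncertain, repeated sampling estimates how often two tasks overlap and push combined demand past a limit. This is a hedged illustration only; the durations, demands, and threshold are invented for the example and do not reflect IMPRINT's actual model.

```python
# Hedged sketch of the stochastic idea behind IMPRINT-style analysis:
# a Monte Carlo simulation over uncertain task durations estimates how
# often two tasks overlap and push total demand past a limit.
# All numbers are illustrative; this is not IMPRINT's actual model.
import random

random.seed(42)  # reproducible runs

def overload_probability(trials=10_000, threshold=7.0):
    """Estimate the probability that two tasks overlap long enough
    for their combined demand to exceed the threshold."""
    overloads = 0
    for _ in range(trials):
        # Task A starts at t=0; its duration varies around 30 s.
        a_end = random.gauss(30, 5)
        # Task B starts at a fixed 20 s; demands are fixed per task.
        b_start = 20
        demand_a, demand_b = 4.0, 3.5  # combined 7.5 > threshold
        if a_end > b_start:            # tasks overlap -> demands add
            overloads += (demand_a + demand_b) > threshold
    return overloads / trials

print(f"estimated overload probability: {overload_probability():.3f}")
```

A designer could then target the high-probability overload window with the kind of multi-modal mitigation guidance MIDS provides, for example by moving one task's cue to a different sensory channel.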


International Conference of Design, User Experience, and Usability | 2014

Designing the User Experience for C4ISR Systems in the U.S. Army

Pamela Savage-Knepshield; Jeffrey Thomas; Christopher Paulillo; James Davis; Diane Quarles; Diane Mitchell

A unique set of challenges exists for implementing user-centered design principles in the context of military acquisition, over and above those typically encountered by user experience designers. This paper focuses on the tools and techniques that we have used to help ensure a positive user experience (UX) when Soldiers and systems interact under harsh battlefield conditions. Insights gained from applying these techniques to system design and evaluation early in the acquisition process, and the impact their use has had on training and system design, are discussed.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2014

The U.S. Army Acquisition Process and Close MANPRINT Encounters of the "Early" & "Agile" Kind

Pamela Savage-Knepshield; Christopher Paulillo; Jeffrey Thomas; James Davis; Diane Quarles; Napoleon Gaither; Diane Mitchell

Fiscal reform initiatives and a desire to increase human-system performance and effectiveness at the user interface have not only created challenges for ARL HRED human factors/ergonomics (HF/E) practitioners, but have also created opportunities to demonstrate the impact and return-on-investment (ROI) that earlier engagement brings to the acquisition process. A panel of multidisciplinary HF/E practitioners discusses the tools, techniques, and environments in which they have worked when engaging in “early” acquisition activities prior to Milestone (MS) B, which marks the formal start of an Army system’s acquisition program. They will take a retrospective look at what has worked well and what has not when influencing design decisions post-MS B and how these lessons learned may be applied prior to MS B. They will also share insights gleaned from a program that was initiated to apply HF/E early in the deliberate acquisition process that is currently in its second year of a 2-year program as well as their involvement in the Agile process. We seek audience participation as we discuss the extent to which we have been successful in meeting our objectives and how we can best frame what we have learned in terms of ROI. Our panelists, who have previous experience in industry and military services other than the Army, draw from their expertise in HF/E and varied experience with a wide range of commodity-oriented systems. Panel discussions will reveal critical insights that are generalizable across a wide range of industry sectors as well as those that warrant future implementation and further investigation.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2011

The “Right Stuff” for Testing, Evaluating, and Analyzing Human Performance

Pamela Savage-Knepshield; Sage Jessee; Richard Kozycki; Michael LaFiandra; Diane Mitchell; Angelique Scharine

A great deal of military equipment is ill-fitting, uncomfortable, and difficult to learn, operate, and maintain. For example, Soldiers report difficulty during vehicle egress and ingress, as well as pain, fatigue, and discomfort from vehicle vibration, seats, restraint systems, and the sheer weight of their equipment load. They also report that mission-critical systems are located outside their reach and field of view. Clearly, systems exhibiting these characteristics have not benefited from the incorporation of human factors/ergonomics or human systems integration during their design and development. The panel, composed of multidisciplinary practitioners and researchers from the Army Research Laboratory, discusses tools, techniques, and test environments that they have used to study the effects of military equipment on Soldier performance. The overall goal of this panel is to highlight a variety of methods and facilities that they have found useful and effective early in the design process, before software is coded and metal is bent. These panelists have previous experience in industry, academia, and military services other than the Army, and will draw from their varied areas of expertise in cognition and biomechanics to reveal critical factors and insights that are generalizable across a wide range of products and industry sectors, as well as those that warrant further investigation.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2007

Please Don't Abuse the Models: The Use of Experimental Design in Model Building

Diane Mitchell; Charneta Samms

Human factors professionals strive to create effective system designs by ensuring that the capabilities and limitations of the human operator are a primary design consideration. By demonstrating early in the development cycle how design decisions concerning the human operator affect system performance, the resulting recommendations may be adopted to enhance system performance. To achieve this goal, human performance modeling tools are used. One common misconception about modeling is that a model alone will impact design. A model is merely a building block within an analysis. An influential analysis incorporates the principles of experimental design, the power of predictive modeling, and the varying system design requirements to produce results that empower program managers to make informed decisions regarding system design. Analysts at the U.S. Army Research Laboratory have successfully influenced the conceptual design of several manned ground vehicles by recognizing this difference.
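
The experimental-design principle described here, crossing design factors so that each model run corresponds to one condition, can be sketched as a full-factorial run matrix. The factor names and levels below are hypothetical examples, not factors from the paper.

```python
# Hedged sketch: build a full-factorial run matrix for model-based
# design factors, a basic principle of experimental design.
# Factor names and levels are hypothetical examples.
from itertools import product

factors = {
    "crew_size":      [2, 3],
    "automation":     ["manual", "aided"],
    "display_config": ["single", "dual"],
}

def run_matrix(factors):
    """Cross every factor level to enumerate the conditions an analyst
    would model, one human performance model run per condition."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

conditions = run_matrix(factors)
print(f"{len(conditions)} model runs required")  # 2 * 2 * 2 = 8
for c in conditions:
    print(c)
```

Running the model once per row of this matrix, rather than once overall, is what lets the analysis attribute performance differences to specific design decisions.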


Archive | 2003

Trade Study: A Two- Versus Three-Soldier Crew for the Mounted Combat System (MCS) and Other Future Combat System Platforms

Diane Mitchell; Charneta Samms; Thomas J. Henthorn; Josephine Wojciechowski


Archive | 2012

Predicting the Impacts of Intravehicular Displays on Driving Performance with Human Performance Modeling

Diane Mitchell; Josephine Wojciechowski; Charneta Samms

Collaboration


Dive into Diane Mitchell's collaborations.

Top Co-Authors

David Jones
University of North Carolina at Chapel Hill

Kelly S. Hale
University of Central Florida