Publication


Featured research published by Jennifer Fowlkes.


Cognition, Technology & Work | 2009

Using cognitive task analysis to explore issues in the procurement of intelligent decision support systems

Robert R. Hoffman; Kelly Neville; Jennifer Fowlkes

Government statistics and various news reports suggest that upwards of half of all large-scale information technology (IT) development projects fail to meet expectations for facilitating cognitive work. Many of the failures point to the neglect of human-centering considerations during the development of sociotechnical systems. The groups of people who create IT themselves constitute a sociotechnical system. Therefore, laws of cognitive work apply to the cognitive work of IT development, and these laws include the “reductive tendency” for people to form simplified understandings when confronted with domains of dynamics and complexity. In this article, we report a study in which we “turned the tables” on IT systems development. Rather than using cognitive task analysis to study some work domain for which an envisioned IT system would be developed, we used cognitive task analysis to study the work domain of IT systems development itself. Through documentation analysis and critical decision method procedures, we sought to reveal specific challenges with regard to human-centering, and ways in which principles, methods, and tools of ergonomics (human factors, cognitive systems engineering) might help the developers of IT systems better address the human and social aspects of cognitive work. The findings highlight the outstanding challenges and barriers to the procurement and development of usable, useful, and understandable IT for sociotechnical systems. Challenges include the following: the need for better coordination mechanisms; the need to locate cognitive systems engineers, as advocates for workers, in key leadership roles; the need to reconceive concepts and methods of requirements and requirements specification; and the need for better negotiation of the trade-offs of cost/schedule considerations with human-centering considerations.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2006

Evaluating the Effectiveness of a Joint Cognitive System: Metrics, Techniques, and Frameworks

Scott S. Potter; David D. Woods; Emilie M. Roth; Jennifer Fowlkes; Robert R. Hoffman

An implication of Cognitive Systems Engineering is that joint cognitive systems (JCS; also known as complex socio-technical systems) need to be evaluated for their effectiveness in performing complex cognitive work. This requires measures that go well beyond "typical" performance metrics, such as the number of subtask goals achieved per person per unit of time, and the corresponding simple baseline comparisons or workload assessment metrics. The JCS perspective implies that the system must be designed and evaluated from the perspective of the shift in the role of the human supervisor, which imposes new types of requirements on the human operator. Previous research in CSE and our own experience have led us to identify a set of generic JCS support requirements that apply to cognitive work by any cognitive agent or any set of cognitive agents, including teams of people and machine agents. Metrics will have to reflect phenomena such as the "teamwork" or "resilience" of a JCS. This places new burdens on evaluation techniques and frameworks, since metrics should be generated from a principled approach and grounded in the fundamental principles of interest to the designers of the JCS. An implication of the JCS perspective is that complex cognitive systems need to be evaluated for usability, usefulness, and understandability, each of which goes well beyond raw performance. However, conceptually grounded evaluation frameworks, corresponding operational techniques, and corresponding measures are limited. Therefore, in order to advance the state of the field, we have gathered a set of researchers and practitioners to present recent evaluation work and stimulate discussion.


Theoretical Issues in Ergonomics Science | 2009

Challenges to the development of pedagogically driven engineering requirements for complex training systems

Jennifer Fowlkes; Kelly Neville; Jerry M. Owens; Amanda Hafich

Complex operating environments are becoming prevalent within industry and the military. Concomitantly, the development of advanced technologies is accelerating, enabling the simulation of complex environments and presenting exciting training opportunities. What is lacking is strong guidance from training science on how to focus and enhance the use of simulation technologies in training system design. The focus of this paper is to continue the discussion of this long-standing problem by examining three causes: the lack of integration of training research, training practice, and technology development; the difficulty of translating research findings into useful design artefacts; and fundamental, culturally based differences between the stakeholders involved in training system development, including programme managers, engineers, psychologists, and domain experts. The causes are addressed both to raise awareness and to begin envisioning solutions.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2006

Multidisciplinary Systems Development: Challenges and Lessons Learned

Kelly Neville; Jennifer Fowlkes; Jerry M. Owens; Jack Ennis

Systems developed by multidisciplinary teams should benefit from the varied backgrounds and diverse contributions of their developers. However, multidisciplinary systems development teams face challenges that can impair team effectiveness and limit the contributions of participating disciplines. These challenges may be major contributing factors to the long-term struggle of the cognitive and human factors engineering disciplines to contribute meaningfully to systems development. To gain insight into these challenges, we reviewed four of our own recent multidisciplinary systems development projects. In each project, cognitive engineers participated with systems and software engineers across the entire development effort. By describing challenges we faced, this paper is intended to draw attention to types of change that may facilitate the participation of multiple disciplines, and of cognitive and human factors engineering in particular, in future systems development efforts.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2005

Constraint-Directed Performance Measurement for Large Tactical Teams

Jennifer Fowlkes; Jerry M. Owens; Corbin Hughes; Joan H. Johnston; Michael Stiso; Amanda Hafich; Kevin Bracken

Large tactical teams must demonstrate integrative performance as tens to thousands of operators perform within highly dynamic, complex, and unpredictable environments. The development of methods for capturing integrated performance and the achievement of team goals, while also allowing for and even embracing adaptive performance, is challenging. However, as Distributed Mission Training (DMT) systems continue to mature and are increasingly representative of important training opportunities in the military, diagnostic performance assessment systems are needed to ensure training quality. In this paper, we propose a methodological framework for team performance that is responsive to the performance measurement challenges found within DMT systems. The approach is illustrated within a U.S. Navy research and development program called Debriefing Distributed Simulation-Based Exercises (DDSBE).


AIAA Modeling and Simulation Technologies Conference and Exhibit | 2004

Instructor Operator Functions for Training Distributed Teams: Instruction, Collaboration, and Communication

Jennifer Fowlkes; Jerry M. Owens; Michael Stiso; Amanda Hafich; Susan M. Eitelman; Melissa M. Walwanis Nelson; David G. Smith

Distributed Mission Training (DMT) is needed to provide naval aircrews the opportunity to interact and train together in a simulated environment that more closely approximates the demands of real-world naval aviation combat. However, DMT scenarios will be larger and more complex than predecessor scenarios because of the number of entities (both real and artificial) and the multitude of information transactions among platforms, entities, and agencies necessary to execute missions. Instructor personnel once challenged in a single-platform training environment will find added dimensionality and difficulty in their new roles that will impose increased demands for monitoring, assessing, and managing training events in a dynamic, highly interactive, multi-platform environment. A major challenge in the design of such systems will be to provide instructors the support they need to carry out their training responsibilities efficiently and effectively across all phases of simulation-based training. The purpose of this paper is to describe an integrated design framework for a Common Distributed Mission Training System (C-DMTS) that supports instructors in the conduct of distributed exercises by emphasizing three functions: instruction, to support the training value of the distributed training system; collaboration, to enhance the coordination that must occur among a distributed team of instructors and trainees; and communication, to provide required communication functionality as seamlessly as possible.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2003

A Cognitive Task Analysis of Coordination in a Distributed Tactical Team: Implications for Expertise Acquisition

Kelly Neville; Jennifer Fowlkes; Melissa M. Walwanis Nelson; Maureen L. Bergondy-Wilhelm

A Cognitive Task Analysis (CTA) was conducted to examine distributed team coordination expertise. Knowledge associated with air wing strike team coordination challenges was elicited from E-2C Naval Flight Officers (NFOs) with varying levels of experience, and was assessed using a methodology consisting of multiple qualitative assessment techniques. Results include: (1) insight into the organizational structure of NFO team coordination knowledge; (2) rich representations of NFO team coordination knowledge that may be used to populate training and performance support tools; (3) experience-related differences in the use of knowledge and skill to support team coordination; and (4) knowledge and skill categories that support team coordination. Each of these results may contribute to the design of performance support tools and training guidelines, strategies, and content that enhances team coordination.


IEEE Intelligent Systems | 2007

Human Total Cost of Ownership: The Penny Foolish Principle at Work

Wayne Zachary; Kelly Neville; Jennifer Fowlkes; Robert R. Hoffman


IEEE Intelligent Systems | 2008

The Procurement Woes Revisited

Kelly Neville; Robert R. Hoffman; Charlotte Linde; William C. Elm; Jennifer Fowlkes


Software Engineering Research and Practice | 2007

The Problem of Designing Complex Systems

Jennifer Fowlkes; Kelly Neville; Robert R. Hoffman; Wayne Zachary

Collaboration


Dive into Jennifer Fowlkes's collaborations.

Top Co-Authors

Robert R. Hoffman
Florida Institute for Human and Machine Cognition

Beth F. Wheeler Atkinson
Naval Air Warfare Center Training Systems Division

David D. Woods
Veterans Health Administration

Emilie M. Roth
Carnegie Mellon University

Melissa M. Walwanis Nelson
Naval Air Warfare Center Training Systems Division