Publication


Featured research published by Emilie M. Roth.


Studies in Computer Science and Artificial Intelligence | 1989

Cognitive Task Analysis: An Approach to Knowledge Acquisition for Intelligent System Design

Emilie M. Roth; David D. Woods

This chapter discusses some of the common pitfalls that arise in building intelligent support systems and describes a pragmatic knowledge acquisition approach for defining and building effective intelligent support systems. The cognitive task analysis provides an umbrella structure of domain semantics that organizes and makes explicit what particular pieces of knowledge mean about problem-solving in the domain. Acquiring and using such a domain semantics is essential (1) to specify what kinds of cognitive support functions are needed, (2) to specify what kinds of computational mechanisms are capable of providing such functions, (3) to clearly delineate machine performance boundaries, and (4) to build less brittle machine problem-solvers, for example, through features that enable the human problem-solver to extend and adapt the capability of the system to handle unanticipated situations. This is in contrast to technology-driven approaches, where knowledge acquisition focuses on describing domain knowledge in terms of the syntax of particular computational mechanisms; in other words, the language of implementation is used as a substitute for a cognitive language of description. The cognitive task analysis approach redefines the knowledge acquisition problem: knowledge acquisition is, first, about deciding what kinds of intelligent systems would make a difference and, second, about what domain-specific knowledge is needed to fuel those systems.
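To make the "umbrella structure of domain semantics" idea concrete, here is a minimal sketch in Python; the chapter contains no code, and all names and example entries below are hypothetical illustrations. The point is that each piece of domain knowledge is recorded together with what it means for problem-solving, so that a proposed support function can be traced back to that meaning rather than to an implementation mechanism.

# Illustrative sketch only; KnowledgeItem and the example entry are
# hypothetical, not from the chapter.
from dataclasses import dataclass, field

@dataclass
class KnowledgeItem:
    """A piece of domain knowledge plus what it means for problem-solving."""
    fact: str                      # the raw domain knowledge
    role_in_problem_solving: str   # the domain semantics: what the knowledge is for
    support_functions: list[str] = field(default_factory=list)  # support it motivates

semantics = [
    KnowledgeItem(
        fact="Steam generator level briefly rises before falling after a trip",
        role_in_problem_solving="Distinguishes a normal transient from a tube rupture",
        support_functions=["trend display", "expected-vs-actual comparison"],
    ),
]

# Trace every proposed support function back to the domain semantics that
# justifies it, instead of starting from a computational mechanism.
for item in semantics:
    for fn in item.support_functions:
        print(f"{fn!r} is justified by: {item.role_in_problem_solving}")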


Proceedings of the Human Factors and Ergonomics Society 45th Annual Meeting | 2001

Using Cognitive Task Analysis (CTA) to Seed Design Concepts for Intelligence Analysts under Data Overload

Emily S. Patterson; David D. Woods; David Tinapple; Emilie M. Roth

This paper describes how a Cognitive Task Analysis (CTA) was used to jumpstart the exploration of useful design aids to combat data overload in intelligence analysis. During a simulated analysis task, we observed how professional intelligence analysts were vulnerable to making inaccurate statements when they were under time pressure and working in a topic area outside their area of expertise. From these observations, we generated design recommendations and criteria for evaluating the usefulness of any effort aimed at reducing data overload. Then, we used CTA insights to trigger the development of modular design concepts, or “design seeds,” that leverage advances in machine processing to address vulnerabilities. Nine design seeds were integrated into a “Visual Narratives” workspace visualization concept. Feedback about the usefulness of the design seeds was obtained during an elicitation session following an animated fly-through, or “Ani-mock,” demonstration.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2002

Scenario Development for Decision Support System Evaluation

Emilie M. Roth; James W. Gualtieri; William C. Elm; Scott S. Potter

This paper introduces a methodology for developing scenarios, representative of the cognitive and collaborative challenges inherent in a domain of practice, for evaluating Decision Support Systems (DSS). Explicit links are made between particular aspects of the DSS and the specific cognitive and collaborative demands they are intended to support. The effectiveness of the DSS in supporting performance can then be systematically probed by creating scenarios that are informed by an understanding of individual and team cognitive processing factors, fundamental relationships within the domain, and known complicating factors that can arise in the domain to challenge cognitive and collaborative performance. The paper presents a set of explicit artifacts for systematically creating such scenarios to provide feedback on the viability of the DSS design concepts (e.g., are the hypothesized positive impacts of the DSS realized?), as well as feedback on additional unanticipated requirements for support.
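A minimal sketch of what such an "explicit link" artifact might look like, assuming one record per scenario probe; the field names and the example entry are hypothetical, not from the paper.

# Hypothetical sketch: each scenario probe is tied to the DSS aspect and the
# cognitive/collaborative demand it is meant to test.
from dataclasses import dataclass

@dataclass
class ScenarioProbe:
    dss_feature: str          # aspect of the Decision Support System under test
    demand: str               # cognitive/collaborative demand it should support
    complicating_factor: str  # known domain complication that stresses the demand
    expected_benefit: str     # hypothesized positive impact to check for

probes = [
    ScenarioProbe(
        dss_feature="alarm prioritization panel",
        demand="detect the primary fault among cascading alarms",
        complicating_factor="a sensor failure masks the initiating event",
        expected_benefit="faster, more accurate fault identification",
    ),
]

# A scenario script is assembled so that every probe's expected benefit can be
# confirmed or disconfirmed during the evaluation session.
for p in probes:
    print(f"Inject '{p.complicating_factor}' and observe whether "
          f"'{p.dss_feature}' supports '{p.demand}'.")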


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1998

A Framework for Integrating Cognitive Task Analysis into the System Development Process

Scott S. Potter; Emilie M. Roth; David D. Woods; William C. Elm

This paper describes a process that orchestrates different types of specific CTA techniques to provide design-relevant CTA results and integrates those results into the software development process. Two fundamental premises underlie the approach. First, CTA is more than the application of any single CTA technique. Instead, developing a meaningful understanding of a field of practice relies on multiple converging techniques in a bootstrapping process. The important issue from a CTA perspective is to evolve a model of the interconnections between the demands of the domain, the strategies and knowledge of practitioners, the cooperative interactions across human and machine agents, and how artifacts shape these strategies and coordinative activities across a series of different specific techniques. Second, since CTA is a means to support the design of computer-based artifacts that enhance human and team performance, CTA must be integrated into the software and system development process. Thus, the vision of CTA as an initial, self-contained technique that is handed off to system designers is reconceived as an incremental process of uncovering the cognitive demands imposed on the operator(s) by the complexities and constraints of the domain.
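A rough sketch of the bootstrapping premise, assuming findings from each technique are folded into one evolving model rather than standing alone; the model fields and example findings are hypothetical, not from the paper.

# Hypothetical sketch: several CTA techniques incrementally update one shared
# model of the field of practice.
model = {"demands": set(), "strategies": set(), "coordination": set()}

def integrate(technique: str, findings: dict[str, set[str]]) -> None:
    """Fold one technique's findings into the evolving model."""
    for key, values in findings.items():
        model[key] |= values
    print(f"after {technique}: {sum(len(v) for v in model.values())} model elements")

# Each technique both draws on and extends the current model.
integrate("field observation", {"demands": {"data overload"}})
integrate("structured interview", {"strategies": {"triage by source reliability"}})
integrate("simulated task", {"coordination": {"handoff between analysts"}})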


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2006

Evaluating the Effectiveness of a Joint Cognitive System: Metrics, Techniques, and Frameworks

Scott S. Potter; David D. Woods; Emilie M. Roth; Jennifer Fowlkes; Robert R. Hoffman

An implication of Cognitive Systems Engineering is that joint cognitive systems (JCS; also known as complex socio-technical systems) need to be evaluated for their effectiveness in performing complex cognitive work. This requires measures that go well beyond “typical” performance metrics, such as the number of subtask goals achieved per person per unit of time, and the corresponding simple baseline comparisons or workload assessment metrics. The JCS perspective implies that the system must be designed and evaluated in light of the shift in the role of the human supervisor, which imposes new types of requirements on the human operator. Previous research in CSE and our own experience have led us to identify a set of generic JCS support requirements that apply to cognitive work by any cognitive agent or set of cognitive agents, including teams of people and machine agents. Metrics will have to reflect such phenomena as “teamwork” or “resilience” of a JCS. This places new burdens on evaluation techniques and frameworks, since metrics should be generated from a principled approach and based on fundamental principles of interest to the designers of the JCS. A further implication of the JCS perspective is that complex cognitive systems need to be evaluated for usability, usefulness, and understandability, each of which goes well beyond raw performance. However, conceptually grounded evaluation frameworks, corresponding operational techniques, and corresponding measures remain limited. Therefore, to advance the state of the field, we have gathered a set of researchers and practitioners to present recent evaluation work and stimulate discussion.
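For concreteness, the "typical" metric the authors cite can be written down directly, while the richer JCS constructs resist such simple operationalization. This is an illustrative sketch with made-up numbers, not code or data from the paper.

# The baseline metric the paper calls insufficient: subtask goals achieved
# per person per unit of time.
def subtask_goal_rate(goals_achieved: int, people: int, hours: float) -> float:
    return goals_achieved / (people * hours)

print(subtask_goal_rate(goals_achieved=12, people=3, hours=2.0))  # 2.0 goals/person/hour

# The paper argues evaluation must also capture constructs like these, for
# which simple counts are not adequate operationalizations:
jcs_constructs = ["teamwork", "resilience", "usability", "usefulness", "understandability"]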


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2001

Analysis with a Purpose: Narrowing the Gap with a Pragmatic Approach

James W. Gualtieri; William C. Elm; Scott S. Potter; Emilie M. Roth

There has been growing interest in using Cognitive Systems Engineering (CSE) techniques to understand work domains and the cognitive demands they impose on practitioners, in order to provide a foundation for the design of decision aids. While CSE techniques like Cognitive Work Analysis (CWA) have proven successful in illuminating the sources of cognitive complexity and explicating the basis of expertise, there is often still a gap between the results of the CWA and the design and development of the resulting decision support system. One way to narrow the gap is to develop an integrated set of artifacts that provide explicit links from (1) the functional goals of the domain, to (2) the cognitive demands that require support, through (3) the mapping of decisions to the display space. This paper presents a brief discussion of a recent example where this approach was taken.
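A minimal sketch of the integrated artifact set, assuming each link is a triple running from functional goal through cognitive demand to a region of display space; the domain examples are hypothetical, not from the paper.

# Hypothetical sketch: explicit links from functional goals, through cognitive
# demands, to regions of the display space.
links = [
    # (functional goal, cognitive demand, display-space mapping)
    ("maintain safe coolant inventory", "track mass balance across leak paths",
     "inventory trend region, main overview display"),
    ("restore power after a fault", "prioritize competing recovery procedures",
     "procedure-status sidebar"),
]

# The gap-narrowing check: every demand identified by the CWA should map to
# some display region; an unmapped demand flags missing design support.
for goal, demand, display in links:
    print(f"{goal} -> {demand} -> {display}")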


Archive | 1999

Aiding the Intelligence Analyst in Situations of Data Overload: A Simulation Study of Computer-Supported Inferential Analysis under Data Overload

Emily S. Patterson; Emilie M. Roth; David D. Woods


Expertise and Technology | 1995

Symbolic AI computer simulations as tools for investigating the dynamics of joint cognitive systems

David D. Woods; Emilie M. Roth


Archive | 2001

Aiding the Intelligence Analyst: From Problem Definition to Design Concept Exploration

Emily S. Patterson; David D. Woods; David Tinapple; Emilie M. Roth; John M. Finley; Gilbert G. Kuperman


Archive | 1990

Device for Handling the Display of Instructions of an Expert System (original title: Vorrichtung zur Behandlung der Anzeige von Anleitungen eines Expertensystems)

William C. Elm; Emilie M. Roth; David D. Woods

Collaboration


Dive into Emilie M. Roth's collaborations.

Top Co-Authors
Robert R. Hoffman

University of West Florida
