Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Leah Reeves is active.

Publication


Featured research published by Leah Reeves.


Communications of the ACM | 2004

Guidelines for multimodal user interface design

Leah Reeves; Jennifer Lai; James A. Larson; Sharon L. Oviatt; T. S. Balaji; Stéphanie Buisine; Penny Collings; Philip R. Cohen; Ben J. Kraal; Jean-Claude Martin; Michael F. McTear; Thiru Vilwamalai Raman; Kay M. Stanney; Hui Su; Qian Ying Wang



International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 2003

Usability engineering of virtual environments (VEs): identifying multiple criteria that drive effective VE system design

Kay M. Stanney; Mansooreh Mollaghasemi; Leah Reeves; Robert Breaux; David Graeber

Designing usable and effective interactive virtual environment (VE) systems is a new challenge for system developers and human factors specialists. In particular, traditional usability principles do not consider characteristics unique to VE systems, such as the design of wayfinding and navigational techniques, object selection and manipulation, as well as integration of visual, auditory and haptic system outputs. VE designers must enhance presence, immersion, and system comfort, while minimizing sickness and deleterious aftereffects. Through the development of a multi-criteria assessment technique, the current effort categorizes and integrates these VE attributes into a systematic approach to designing and evaluating VE usability. Validation exercises suggest this technique, the Multicriteria Assessment of Usability for Virtual Environments (MAUVE) system, provides a structured approach for achieving usability in VE system design and evaluation. Applications for this research include military, entertainment, and other interactive systems that seek to provide an enjoyable and effective user experience.
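
The abstract describes MAUVE only at a high level. As a rough illustration of how such a multi-criteria roll-up might work, the Python sketch below combines per-criterion ratings into a single weighted usability score; the criteria names, weights, and 0-10 rating scale are hypothetical placeholders, not the published MAUVE categories or scoring scheme.

```python
# Illustrative sketch of a multi-criteria VE usability roll-up in the spirit of
# MAUVE. Criteria, weights, and the 0-10 rating scale are hypothetical
# placeholders, not the published MAUVE categories or scoring scheme.

# Hypothetical criteria grouped the way the abstract describes: interaction
# attributes (wayfinding, selection/manipulation, multimodal output) and
# experience attributes (presence, comfort, aftereffects).
criteria_weights = {
    "wayfinding": 0.20,
    "object_manipulation": 0.20,
    "multimodal_output_integration": 0.15,
    "presence": 0.20,
    "system_comfort": 0.15,
    "aftereffects": 0.10,  # higher rating = fewer adverse aftereffects
}

def usability_score(ratings: dict[str, float]) -> float:
    """Weighted sum of 0-10 expert ratings across the criteria above."""
    return sum(criteria_weights[name] * ratings[name] for name in criteria_weights)

if __name__ == "__main__":
    example_ratings = {
        "wayfinding": 7.0,
        "object_manipulation": 6.5,
        "multimodal_output_integration": 8.0,
        "presence": 7.5,
        "system_comfort": 6.0,
        "aftereffects": 9.0,
    }
    print(f"Overall VE usability score: {usability_score(example_ratings):.2f} / 10")
```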


International Journal of Human-Computer Interaction | 2004

A Paradigm Shift in Interactive Computing: Deriving Multimodal Design Principles from Behavioral and Neurological Foundations

Kay M. Stanney; Shatha N. Samman; Leah Reeves; Kelly S. Hale; Wendi L. Buff; Clint A. Bowers; Brian Goldiez; Denise Nicholson; Stephanie J. Lackey

As technology advances, systems are increasingly able to provide more information than a human operator can process accurately. Thus, a challenge for designers is to create interfaces that allow operators to process the optimal amount of data. It is herein proposed that this may be accomplished by creating multimodal display systems that augment or switch modalities to maximize user information processing. Such a system would ultimately be informed by a user's neurophysiological state. As a first step toward that goal, relevant literature is reviewed and a set of preliminary design guidelines for multimodal information systems is suggested.
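
As a loose illustration of the closed-loop idea this abstract motivates, the Python fragment below routes new information to the least-loaded sensory channel and defers it when every channel exceeds an assumed workload ceiling. The channel set, threshold, and the simulated workload estimator are illustrative assumptions, not anything specified in the paper.

```python
# Hedged sketch of closed-loop modality management: send the next message to
# the sensory channel with the most spare capacity, as estimated from a
# (here, simulated) neurophysiological workload signal.

import random

CHANNELS = ("visual", "auditory", "tactile")
OVERLOAD_THRESHOLD = 0.8  # hypothetical normalized workload ceiling

def estimate_channel_workload() -> dict[str, float]:
    """Stand-in for a real-time neurophysiological workload estimator (e.g., EEG-derived)."""
    return {channel: random.random() for channel in CHANNELS}

def choose_display_channel(workload: dict[str, float]) -> str | None:
    """Pick the least-loaded channel that is still under the overload ceiling."""
    channel, load = min(workload.items(), key=lambda item: item[1])
    return channel if load < OVERLOAD_THRESHOLD else None  # None => defer the message

if __name__ == "__main__":
    load = estimate_channel_workload()
    target = choose_display_channel(load)
    print(f"Estimated workload: {load}")
    print(f"Route next alert to: {target or 'defer (all channels overloaded)'}")
```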


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2005

Validation of Predictive Workload Component of the Multimodal Information Design Support (MIDS) System

Kelly S. Hale; Leah Reeves; Par Axelsson; Kay Stanney

Operators in military C4ISR environments are required to rapidly and accurately assess and respond to critical events while monitoring ongoing operations. In order to assist in designing complex display systems to support C4ISR operators, it is critical to understand when and why the information displayed exceeds human capacity. Common metrics for evaluating operator overload are subjective reports, which rely on self-reporting techniques (e.g., NASA-TLX, SART). A new design tool, the Multimodal Information Design Support (MIDS) system, predicts times of operator overload and offers multimodal design guidelines to streamline cognitive processing, thus alleviating operator overload and optimizing situation awareness. This paper empirically validates MIDS' predictive power in determining situations that may cause operator overload by comparing MIDS output to subjective reports of workload and SA during C4ISR operations. Future studies will validate MIDS' design capabilities through redesign and evaluation of performance, workload, and SA on the optimized C4ISR task environment.
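
As a minimal sketch of the validation logic described here (not the actual MIDS analysis), the following Python snippet compares model-flagged overload epochs against thresholded subjective workload ratings and reports their agreement. The sample data, rating threshold, and agreement metric are all hypothetical.

```python
# Illustrative comparison of predicted overload intervals against subjective
# workload reports collected during the same scenario. Data, threshold, and
# the simple epoch-wise agreement metric are hypothetical.

predicted_overload = [False, False, True, True, False, True]   # model flags per epoch
subjective_rating  = [2.0,   3.5,   6.5,  7.0,  3.0,   5.5]    # e.g., NASA-TLX-style ratings
RATING_THRESHOLD = 5.0  # assumed cut-off for "reported high workload"

reported_overload = [rating >= RATING_THRESHOLD for rating in subjective_rating]

agreement = sum(
    predicted == reported
    for predicted, reported in zip(predicted_overload, reported_overload)
) / len(predicted_overload)

print(f"Epoch-wise agreement between predicted and reported overload: {agreement:.0%}")
```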


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2003

Multimodal, Multitask Interaction Design: Challenging Long-Standing Unimodal Design Assumptions

Kelly S. Hale; Shatha N. Samman; Wendi L. Buff; Kay Stanney; Leah Reeves; Clint A. Bowers; Glenn A. Martin

As the technology that supports interactive systems advances, it becomes possible to leverage a multitude of sensory systems. By using multiple sensory processors, substantial gains in the information management capacity of the human-computer integral should be realized, and those with sensory losses can be better accommodated. The question becomes: when multimodal information is presented, how should these multiple sources of information be coordinated, particularly when two or more tasks are performed simultaneously? While current design theories developed primarily for unimodal interaction can be drawn on, additional research is required to fully guide multimodal multitask interaction design. The current study seeks to extend unimodal design theories to multimodal systems and identifies some interesting differences in unimodal vs. multimodal multitask interaction.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2008

New Kids on the Block: Multi-Dimensional Perspectives on Augmented Cognition

Julie M. Drexler; Leah Reeves; Dylan Schmorrow; Dennis McBride; Kay Stanney; Chris Berka; Blair Dickson

This discussion panel was organized to offer HFES members an opportunity to learn more about the burgeoning field of Augmented Cognition and to discover the multi-dimensional aspects of the discipline. The session will feature six invited panelists who were selected to represent a cross-section of the Augmented Cognition International Society community of more than 900 members. Each panelist will present their unique perspective on the AugCog field, providing the audience with information on a variety of research, development, and application areas within the U.S. and abroad. The panel members and their associated AugCog perspectives include: CDR Dylan Schmorrow, government; Denise Nicholson, academia; Dennis McBride, non-profit; Kay Stanney and Chris Berka, industry; and Blair Dickson, industry/international.


International Conference on Human-Computer Interaction | 2005

EEG Indices Distinguish Spatial and Verbal Working Memory Processing: Implications for Real-Time Monitoring in a Closed-Loop Tactical Tomahawk Weapons Simulation

Chris Berka; Daniel J. Levendowski; Gene Davis; Michelle N. Lumicao; Caitlin K. Ramsey; Kay M. Stanney; Leah Reeves; Patrice D. Tremoulet; Susan Harkness Regli


Archive | 2007

Augmenting Cognition in HCI: 21st Century Adaptive System Science and Technology

Amy E. Bolton; Amy A. Kruse; Dylan Schmorrow; Leah Reeves


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1995

An Innovative Approach to Usability Testing: Facilitated Free-Play

Kay M. Stanney; Leah Reeves; David Dryer


Archive | 2008

Modeling and Augmenting Cognition: Supporting Optimal Individual Warfighter Human Performance

Kelly Rossi; Leah Reeves; Karl Van Orden; Peter Muller; Dylan Schmorrow; David A. Kobus

Collaboration


Dive into Leah Reeves's collaborations.

Top Co-Authors

Kay M. Stanney (University of Central Florida)
Chris Berka (University of California)
Kelly S. Hale (University of Central Florida)
Clint A. Bowers (University of Central Florida)
Patrice D. Tremoulet (Lockheed Martin Advanced Technology Laboratories)
Susan Harkness Regli (Lockheed Martin Advanced Technology Laboratories)