Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where A. Zachary Hettinger is active.

Publication


Featured research published by A. Zachary Hettinger.


Journal of the American Medical Informatics Association | 2015

Novel user interface design for medication reconciliation: an evaluation of Twinlist.

Catherine Plaisant; Johnny Wu; A. Zachary Hettinger; Seth M. Powsner; Ben Shneiderman

OBJECTIVE: The primary objective was to evaluate time, number of interface actions, and accuracy on medication reconciliation tasks using a novel user interface (Twinlist, which lays out the medications in five columns based on similarity and uses animation to introduce the grouping - www.cs.umd.edu/hcil/sharp/twinlist) compared to a Control interface (where medications are presented side by side in two columns). A secondary objective was to assess participant agreement with statements regarding clarity and utility and to elicit comparisons.

MATERIALS AND METHODS: A 1 × 2 within-subjects experimental design was used with interface (Twinlist or Control) as an independent variable; time, number of clicks, scrolls, and errors were used as dependent variables. Participants were practicing medical providers with experience performing medication reconciliation but no experience with Twinlist. They reconciled two cases in each interface (in a counterbalanced order), then provided feedback on the design of the interface.

RESULTS: Twenty medical providers participated in the study for a total of 80 trials. The trials using Twinlist were statistically significantly faster (18%), with fewer clicks (40%) and scrolls (60%). Serious errors were noted 12 and 31 times in Twinlist and Control trials, respectively.

DISCUSSION: Trials using Twinlist were faster and more accurate. Subjectively, participants rated Twinlist more favorably than Control. They valued the novel layout of the drugs but indicated that the included animation would be valuable for novices, though not necessarily for advanced users. Additional feedback from participants provides guidance for further development and clinical implementations.

CONCLUSIONS: Cognitive support of medication reconciliation through interface design can significantly improve performance and safety.
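The similarity-based column layout described in the abstract can be illustrated with a minimal sketch (not the authors' published implementation): two medication lists are partitioned into five groups, where "identical" means an exact match, "similar" is assumed here to mean the same drug name with a differing dose, and the remaining entries are unique to one list.

```python
# Minimal sketch of Twinlist-style medication-list partitioning (illustrative
# only, not the published Twinlist implementation). Each medication is a
# (name, dose) tuple.

def partition_meds(intake, hospital):
    """Split two med lists into five Twinlist-style columns:
    unique-to-intake, similar (intake side), identical,
    similar (hospital side), unique-to-hospital."""
    intake_names = {name for name, _ in intake}
    hospital_names = {name for name, _ in hospital}
    identical = sorted(set(intake) & set(hospital))
    similar_left = sorted(m for m in intake
                          if m not in identical and m[0] in hospital_names)
    similar_right = sorted(m for m in hospital
                           if m not in identical and m[0] in intake_names)
    unique_left = sorted(m for m in intake if m[0] not in hospital_names)
    unique_right = sorted(m for m in hospital if m[0] not in intake_names)
    return unique_left, similar_left, identical, similar_right, unique_right

intake = [("aspirin", "81 mg"), ("metformin", "500 mg"), ("lisinopril", "10 mg")]
hospital = [("aspirin", "81 mg"), ("metformin", "1000 mg"), ("heparin", "5000 u")]
cols = partition_meds(intake, hospital)
```

Grouping the lists this way lets a reviewer scan shared and conflicting entries column by column instead of cross-checking two flat lists.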


Journal of Healthcare Risk Management | 2013

An evidence‐based toolkit for the development of effective and sustainable root cause analysis system safety solutions

A. Zachary Hettinger; Rollin J. Fairbanks; Sudeep Hegde; Alexandra S. Rackoff; John Wreathall; Vicki L. Lewis; Ann M. Bisantz; Robert L. Wears

Root cause analysis (RCA) after adverse events in healthcare is a standard practice at many institutions. However, healthcare has failed to see a dramatic improvement in patient safety over the last decade. In order to improve the RCA process, this study used systems safety science, which is based partly on human factors engineering principles and has been applied with success in other high-risk industries like aviation. A multi-institutional dataset of 334 RCA cases and 782 solutions was analyzed using qualitative methods. A team of safety science experts developed a model of 13 RCA solutions categories through an iterative process, using semi-structured interview data from 44 frontline staff members from 7 different hospital-based unit types. These categories were placed in a model and toolkit to help guide RCA teams in developing sustainable and effective solutions to prevent future adverse events. This study was limited by its retrospective review of cases and use of interviews rather than clinical observations. In conclusion, systems safety principles were used to develop guidelines for RCA teams to promote systems-level sustainable and effective solutions for adverse events.


Journal of Cognitive Engineering and Decision Making | 2015

Assessment of Innovative Emergency Department Information Displays in a Clinical Simulation Center

Nicolette M. McGeorge; Sudeep Hegde; Rebecca L. Berg; Theresa K. Guarrera-Schick; David LaVergne; Sabrina Casucci; A. Zachary Hettinger; Lindsey Clark; Li Lin; Rollin J. Fairbanks; Natalie C. Benda; Longsheng Sun; Robert L. Wears; Shawna J. Perry; Ann M. Bisantz

The objective of this work was to assess the functional utility of new display concepts for an emergency department information system created using cognitive systems engineering methods, by comparing them to similar displays currently in use. The display concepts were compared to standard displays in a clinical simulation study during which nurse-physician teams performed simulated emergency department tasks. Questionnaires were used to assess the cognitive support provided by the displays, participants’ level of situation awareness, and participants’ workload during the simulated tasks. Participants rated the new displays significantly higher than the control displays in terms of cognitive support. There was no significant difference in workload scores between the display conditions. There was no main effect of display type on situation awareness, but there was a significant interaction; participants using the new displays showed improved situation awareness from the middle to the end of the session. This study demonstrates that cognitive systems engineering methods can be used to create innovative displays that better support emergency medicine tasks, without increasing workload, compared to more standard displays. These methods provide a means to develop emergency department information systems—and more broadly, health information technology—that better support the cognitive needs of healthcare providers.


Journal of Biomedical Informatics | 2015

Exploring methods for identifying related patient safety events using structured and unstructured data

Allan Fong; A. Zachary Hettinger; Raj M. Ratwani

Most healthcare systems have implemented patient safety event reporting systems to identify safety hazards. Searching the safety event data to find related patient safety reports and identify trends is challenging given the complexity and quantity of these reports. Structured data elements selected by the event reporter may be inaccurate and the free-text narrative descriptions are difficult to analyze. In this paper we present and explore methods for utilizing both the unstructured free-text and structured data elements in safety event reports to identify and rank similar events. We evaluate the results of three different free-text search methods, including a unique topic modeling adaptation, and structured element weights, using a patient fall use case. The various search techniques and weight combinations tended to prioritize different aspects of the event reports, leading to different search and ranking results. These search and prioritization methods have the potential to greatly improve the ability of patient safety officers and other healthcare workers to understand which safety event reports are related.
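The general idea of combining free-text similarity with weighted structured elements can be sketched as follows. This is an illustration only, not the paper's methods (which include a topic-modeling adaptation); the fields, weights, and bag-of-words cosine similarity here are assumptions for the demo.

```python
# Illustrative sketch: rank safety event reports against a query report by
# combining bag-of-words cosine similarity on the free-text narrative with a
# weighted agreement score on structured elements. Not the paper's algorithm.
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def score(query, report, w_text=0.7, w_struct=0.3):
    """Weighted mix of narrative similarity and structured-field agreement.
    The field names and weights are arbitrary placeholders."""
    text_sim = cosine(Counter(query["narrative"].lower().split()),
                      Counter(report["narrative"].lower().split()))
    struct_sim = sum(query[f] == report[f] for f in ("event_type", "location")) / 2
    return w_text * text_sim + w_struct * struct_sim

query = {"narrative": "patient fell near the bed at night",
         "event_type": "fall", "location": "ward"}
reports = [
    {"narrative": "patient fell out of bed", "event_type": "fall", "location": "ward"},
    {"narrative": "wrong medication dose given", "event_type": "medication", "location": "icu"},
]
ranked = sorted(reports, key=lambda r: score(query, r), reverse=True)
```

Shifting the weights toward the structured elements or toward the narrative text changes the ranking, which mirrors the paper's observation that different weight combinations prioritize different aspects of the reports.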


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2014

Identifying Interruption Clusters in the Emergency Department

Allan Fong; Margaret Meadors; Neil Batta; Mike Nitzberg; A. Zachary Hettinger; Raj M. Ratwani

Interruptions can adversely affect human performance, particularly in fast-paced and high-risk environments. Much of the research on interruptions has been laboratory based and the extension of these methods to real-world settings has been challenging and limited. This paper discusses the development and usage of a new tool, TaskTracker, to increase understanding of interruptions in the emergency department. With the data collected from this tool we identified several temporal groupings of interruptions, what we define as interruption clusters. We found significantly more clusters during self-initiated computer tasks. In this setting, we also observed the tendencies of assistants, technicians, students, and nurses to interrupt attending physicians in clusters. A deeper understanding of who engages in interruption clusters and why may provide insights for future systemic strategies that could facilitate better communication patterns.
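The notion of a temporal grouping of interruptions can be sketched with a simple gap-threshold rule: consecutive interruptions closer together than some threshold join the same cluster. The threshold value and the single-linkage rule below are assumptions for illustration, not the clustering criterion used in the paper.

```python
# Hedged sketch: group interruption timestamps (in minutes) into temporal
# clusters whenever the gap to the previous interruption is at most max_gap.

def interruption_clusters(timestamps, max_gap=2.0):
    """Return lists of timestamps; a new cluster starts when the gap to the
    previous interruption exceeds max_gap."""
    clusters = []
    for t in sorted(timestamps):
        if clusters and t - clusters[-1][-1] <= max_gap:
            clusters[-1].append(t)
        else:
            clusters.append([t])
    return clusters

times = [1.0, 1.5, 2.2, 10.0, 10.5, 25.0]
clusters = interruption_clusters(times)
# only groupings of two or more interruptions count as clusters here
multi = [c for c in clusters if len(c) >= 2]
```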


Proceedings of the International Symposium on Human Factors and Ergonomics in Health Care | 2014

Usability evaluation and assessment of a novel emergency department IT system developed using a cognitive systems engineering approach

Lindsey Clark; Theresa K. Guarrera; Nicolette M. McGeorge; A. Zachary Hettinger; Angelica Hernandez; David LaVergne; Natalie C. Benda; Shawna J. Perry; Robert L. Wears; Rollin J. Fairbanks; Ann M. Bisantz

This paper presents the results of a usability evaluation of an electronic Emergency Department information system (EDIS) prototype that was designed using a cognitive systems engineering (CSE) approach. Participants were asked to complete tasks using the EDIS prototype while thinking aloud about their interactions with the displays. Participants also completed subjective assessments of the system that related to 1) cognitive performance support objectives, 2) usability, usefulness, and frequency of system use, and 3) qualitative feedback. Mean scores were calculated for cognitive performance support objectives as well as usability, usefulness, and frequency of use for each display. Mean scores for all cognitive performance support objectives were six or higher. The mean usability score was five or higher, and the mean usefulness score was six or higher. Displays that provided information on individual patient status had the highest scores. Results from this evaluation regarding positive ratings for the cognitive support objectives provide evidence that CSE design methods can be used to understand the goals and objectives of medical work domains.


Journal of Biomedical Informatics | 2017

Cognitive engineering and health informatics: Applications and intersections

A. Zachary Hettinger; Emilie Roth; Ann M. Bisantz

Cognitive engineering is an applied field with roots in both cognitive science and engineering that has been used to support design of information displays, decision support, human-automation interaction, and training in numerous high risk domains ranging from nuclear power plant control to transportation and defense systems. Cognitive engineering provides a set of structured, analytic methods for data collection and analysis that intersect with and complement methods of Cognitive Informatics. These methods support discovery of aspects of the work that make performance challenging, as well as the knowledge, skills, and strategies that experts use to meet those challenges. Importantly, cognitive engineering methods provide novel representations that highlight the inherent complexities of the work domain and traceable links between the results of cognitive analyses and actionable design requirements. This article provides an overview of relevant cognitive engineering methods, and illustrates how they have been applied to the design of health information technology (HIT) systems. Additionally, although cognitive engineering methods have been applied in the design of user-centered informatics systems, methods drawn from informatics are not typically incorporated into a cognitive engineering analysis. This article presents a discussion regarding ways in which data-rich methods can inform cognitive engineering.


Journal of the American Medical Informatics Association | 2016

A framework for evaluating electronic health record vendor user-centered design and usability testing processes.

Raj M. Ratwani; A. Zachary Hettinger; Allison Kosydar; Rollin J. Fairbanks; Michael L. Hodgkins

Objective: Currently, there are few resources for electronic health record (EHR) purchasers and end users to understand the usability processes employed by EHR vendors during product design and development. We developed a framework, based on human factors literature and industry standards, to systematically evaluate the user-centered design processes and usability testing methods used by EHR vendors.

Materials and Methods: We reviewed current usability certification requirements and the human factors literature to develop a 15-point framework for evaluating EHR products. The framework is based on 3 dimensions: user-centered design process, summative testing methodology, and summative testing results. Two vendor usability reports were retrieved from the Office of the National Coordinator’s Certified Health IT Product List and were evaluated using the framework.

Results: One vendor scored low on the framework (5 pts) while the other vendor scored high on the framework (15 pts). The 2 scored vendor reports demonstrate the framework’s ability to discriminate between the variabilities in vendor processes and to determine which vendors are meeting best practices.

Discussion: The framework provides a method to more easily comprehend EHR vendors’ usability processes and serves to highlight where EHR vendors may be falling short in terms of best practices. The framework provides a greater level of transparency for both purchasers and end users of EHRs.

Conclusion: The framework highlights the need for clearer certification requirements and suggests that the authorized certification bodies that examine vendor usability reports may need to be provided with clearer guidance.
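A 15-point, 3-dimension framework of this kind can be sketched as a simple capped scorer. The dimension names below come from the abstract; the assumption that each dimension contributes up to 5 one-point criteria is a hypothetical simplification, not the paper's actual item list.

```python
# Illustrative scorer for a 15-point evaluation framework across three
# dimensions. Per-dimension caps are assumed (5 points each); the real
# criteria are defined in the paper.

CAPS = {
    "user_centered_design_process": 5,
    "summative_testing_methodology": 5,
    "summative_testing_results": 5,
}

def framework_score(report):
    """Sum the points a vendor report earns per dimension, capped at each
    dimension's maximum."""
    return sum(min(report.get(dim, 0), cap) for dim, cap in CAPS.items())

vendor_a = {"user_centered_design_process": 2,
            "summative_testing_methodology": 2,
            "summative_testing_results": 1}
vendor_b = {dim: 5 for dim in CAPS}
```

With these placeholder inputs the two hypothetical vendors land at the low and high ends of the scale, mirroring the 5-point and 15-point reports described in the results.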


Journal of the American Medical Informatics Association | 2016

Identifying visual search patterns in eye gaze data; gaining insights into physician visual workflow

Allan Fong; Daniel J. Hoffman; A. Zachary Hettinger; Rollin J. Fairbanks; Ann M. Bisantz

IMPORTANCE AND OBJECTIVES: As health information technologies become more prevalent in physician workflow, it is increasingly important to understand how physicians are using and interacting with these systems. This includes understanding how physicians search for information presented through health information technology systems. Eye tracking technologies provide a useful technique to understand how physicians visually search for information. However, analyzing eye tracking data can be challenging and is often done by measuring summative metrics, such as total time looking at a specific area and first-order transitions.

METHODS: In this paper, we propose an algorithmic approach to identify different visual search patterns. We demonstrate this approach by identifying common visual search patterns from physicians using a simulated prototype emergency department patient tracking system.

RESULTS AND CONCLUSIONS: We evaluate and compare the visual search pattern results to first-order transition results. We discuss the benefits and limitations of this approach and insights from this initial evaluation.
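The contrast between first-order transitions and longer search patterns can be sketched as follows: first-order transitions only count consecutive pairs of areas of interest (AOIs), while repeated longer subsequences can capture multi-step scanning behavior. The n-gram counting below and the AOI labels are illustrative assumptions, not the paper's algorithm.

```python
# Hedged sketch: summarize an AOI fixation sequence two ways. First-order
# transitions count consecutive AOI pairs; repeated length-n subsequences
# serve as a stand-in for multi-step "visual search patterns".
from collections import Counter

def first_order_transitions(fixations):
    """Count consecutive AOI-to-AOI transitions."""
    return Counter(zip(fixations, fixations[1:]))

def frequent_patterns(fixations, n=3, min_count=2):
    """Length-n AOI subsequences repeated at least min_count times."""
    grams = Counter(tuple(fixations[i:i + n])
                    for i in range(len(fixations) - n + 1))
    return {g: c for g, c in grams.items() if c >= min_count}

# AOIs on a simulated patient-tracking display (hypothetical labels)
gaze = ["bed", "status", "orders", "bed", "status", "orders", "bed", "labs"]
trans = first_order_transitions(gaze)
patterns = frequent_patterns(gaze)
```

Here the repeated bed → status → orders sweep shows up as a three-step pattern that a pairwise transition count alone would not distinguish from isolated pair repetitions.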


Journal of the American Medical Informatics Association | 2016

Barriers to comparing the usability of electronic health records.

Raj M. Ratwani; A. Zachary Hettinger; Rollin J. Fairbanks

Despite the widespread adoption of electronic health records (EHRs), usability of many EHRs continues to be suboptimal, with some vendors failing to meet usability standards, resulting in clinician frustration and patient safety hazards. In an effort to increase EHR vendor competition on usability, recommendations have been made and legislation drafted to develop comparison tools that would allow purchasers to better understand the usability of EHR products prior to purchase. Usability comparison can be based on EHR vendor design and development processes, vendor usability testing as part of the Office of the National Coordinator for Health Information Technology certification program, and usability of implemented products. Barriers exist within the current certified health technology program that prevent effective comparison of usability during each of these stages. We describe the importance of providing purchasers with improved information about EHR usability, barriers to making usability comparisons, and solutions to overcome these barriers.

Collaboration


Dive into A. Zachary Hettinger's collaborations.

Top Co-Authors


Natalie C. Benda

State University of New York System

Shawna J. Perry

Virginia Commonwealth University

Allan Fong

University of Maryland
