Publication


Featured research published by Alissa L. Russ.


BMJ Quality & Safety | 2013

The science of human factors: separating fact from fiction

Alissa L. Russ; Rollin J. Fairbanks; Ben-Tzion Karsh; Laura G. Militello; Jason J. Saleem; Robert L. Wears

Background Interest in human factors has increased across healthcare communities and institutions as the value of human centred design in healthcare becomes increasingly clear. However, as human factors is becoming more prominent, there is growing evidence of confusion about human factors science, both anecdotally and in scientific literature. Some of the misconceptions about human factors may inadvertently create missed opportunities for healthcare improvement. Methods The objective of this article is to describe the scientific discipline of human factors and provide common ground for partnerships between healthcare and human factors communities. Results The primary goal of human factors science is to promote efficiency, safety and effectiveness by improving the design of technologies, processes and work systems. As described in this article, human factors also provides insight on when training is likely (or unlikely) to be effective for improving patient safety. Finally, we outline human factors specialty areas that may be particularly relevant for improving healthcare delivery and provide examples to demonstrate their value. Conclusions The human factors concepts presented in this article may foster interdisciplinary collaborations to yield new, sustainable solutions for healthcare quality and patient safety.


Journal of the American Medical Informatics Association | 2015

Recommendations to Improve the Usability of Drug-Drug Interaction Clinical Decision Support Alerts

Thomas H. Payne; Lisa E. Hines; Raymond C. Chan; Seth Hartman; Joan Kapusnik-Uner; Alissa L. Russ; Bruce W. Chaffee; Christian Hartman; Victoria Tamis; Brian Galbreth; Peter Glassman; Shobha Phansalkar; Heleen van der Sijs; Sheila M. Gephart; Gordon Mann; Howard R. Strasberg; Amy J. Grizzle; Mary Brown; Gilad J. Kuperman; Chris Steiner; Amanda Kathleen Sullins; Hugh H. Ryan; Michael A. Wittie; Daniel C. Malone

OBJECTIVE To establish preferred strategies for presenting drug-drug interaction (DDI) clinical decision support alerts. MATERIALS AND METHODS A DDI Clinical Decision Support Conference Series included a workgroup consisting of 24 clinical, usability, and informatics experts representing academia, health information technology (IT) vendors, healthcare organizations, and the Office of the National Coordinator for Health IT. Workgroup members met via web-based meetings 12 times from January 2013 to February 2014, and two in-person meetings to reach consensus on recommendations to improve decision support for DDIs. We addressed three key questions: (1) what, how, where, and when do we display DDI decision support? (2) should presentation of DDI decision support vary by clinicians? and (3) how should effectiveness of DDI decision support be measured? RESULTS Our recommendations include the consistent use of terminology, visual cues, minimal text, formatting, content, and reporting standards to facilitate usability. All clinicians involved in the medication use process should be able to view DDI alerts and actions by other clinicians. Override rates are common but may not be a good measure of effectiveness. DISCUSSION Seven core elements should be included with DDI decision support. DDI information should be presented to all clinicians. Finally, in their current form, override rates have limited capability to evaluate alert effectiveness. CONCLUSION DDI clinical decision support alerts need major improvements. We provide recommendations for healthcare organizations and IT vendors to improve the clinician interface of DDI alerts, with the aim of reducing alert fatigue and improving patient safety.


Journal of the American Medical Informatics Association | 2014

Applying human factors principles to alert design increases efficiency and reduces prescribing errors in a scenario-based simulation.

Alissa L. Russ; Alan J. Zillich; Brittany L. Melton; Scott A. Russell; Siying Chen; Jeffrey R. Spina; Michael W. Weiner; Elizabette Johnson; Joanne Daggy; M. Sue McManus; Jason M. Hawsey; Anthony Puleo; Bradley N. Doebbeling; Jason J. Saleem

OBJECTIVE To apply human factors engineering principles to improve alert interface design. We hypothesized that incorporating human factors principles into alerts would improve usability, reduce workload for prescribers, and reduce prescribing errors. MATERIALS AND METHODS We performed a scenario-based simulation study using a counterbalanced, crossover design with 20 Veterans Affairs prescribers to compare original versus redesigned alerts. We redesigned drug-allergy, drug-drug interaction, and drug-disease alerts based upon human factors principles. We assessed usability (learnability of redesign, efficiency, satisfaction, and usability errors), perceived workload, and prescribing errors. RESULTS Although prescribers received no training on the design changes, they were able to resolve redesigned alerts more efficiently (median (IQR): 56 (47) s) compared to the original alerts (85 (71) s; p=0.015). In addition, prescribers rated redesigned alerts significantly higher than original alerts across several dimensions of satisfaction. Redesigned alerts led to a modest but significant reduction in workload (p=0.042) and significantly reduced the number of prescribing errors per prescriber (median (range): 2 (1-5) compared to original alerts: 4 (1-7); p=0.024). DISCUSSION Aspects of the redesigned alerts that likely contributed to better prescribing include design modifications that reduced usability-related errors, providing clinical data closer to the point of decision, and displaying alert text in a tabular format. Displaying alert text in a tabular format may help prescribers extract information quickly and thereby increase responsiveness to alerts. CONCLUSIONS This simulation study provides evidence that applying human factors design principles to medication alerts can improve usability and prescribing outcomes.


Journal of the American Medical Informatics Association | 2014

You and me and the computer makes three: variations in exam room use of the electronic health record.

Jason J. Saleem; Mindy E. Flanagan; Alissa L. Russ; Carmit K. McMullen; Leora Elli; Scott A. Russell; Katelyn Bennett; Marianne S. Matthias; Shakaib U. Rehman; Mark D. Schwartz; Richard M. Frankel

Challenges persist on how to effectively integrate the electronic health record (EHR) into patient visits and clinical workflow, while maintaining patient-centered care. Our goal was to identify variations in, barriers to, and facilitators of the use of the US Department of Veterans Affairs (VA) EHR in ambulatory care workflow in order better to understand how to integrate the EHR into clinical work. We observed and interviewed 20 ambulatory care providers across three geographically distinct VA medical centers. Analysis revealed several variations in, associated barriers to, and facilitators of EHR use corresponding to different units of analysis: computer interface, team coordination/workflow, and organizational. We discuss our findings in the context of different units of analysis and connect variations in EHR use to various barriers and facilitators. Findings from this study may help inform the design of the next generation of EHRs for the VA and other healthcare systems.


Journal of the American Medical Informatics Association | 2013

Paper- and computer-based workarounds to electronic health record use at three benchmark institutions

Mindy E. Flanagan; Jason J. Saleem; Laura G. Militello; Alissa L. Russ; Bradley N. Doebbeling

BACKGROUND Healthcare professionals develop workarounds rather than using electronic health record (EHR) systems. Understanding the reasons for workarounds is important to facilitate user-centered design and alignment between work context and available health information technology tools. OBJECTIVE To examine both paper- and computer-based workarounds to the use of EHR systems in three benchmark institutions. METHODS Qualitative data were collected in 11 primary care outpatient clinics across three healthcare institutions. Data collection methods included direct observation and opportunistic questions. In total, 120 clinic staff and providers and 118 patients were observed. All data were analyzed using previously developed workaround categories and examined for potential new categories. Additionally, workarounds were coded as either paper- or computer-based. RESULTS Findings corresponded to 10 of 11 workaround categories identified in previous research. All 10 of these categories applied to paper-based workarounds; five categories also applied to computer-based workarounds. One new category, no correct path (eg, a desired option did not exist in the computer interface, precipitating a workaround), was identified for computer-based workarounds. The most consistent reasons for workarounds across the three institutions were efficiency, memory, and awareness. CONCLUSIONS Consistent workarounds across institutions suggest common challenges in outpatient clinical settings and failures to accommodate these challenges in EHR design. An examination of workarounds provides insight into how providers adapt to limiting EHR systems. Part of the design process for computer interfaces should include user-centered methods particular to providers and healthcare settings to ensure uptake and usability.


Health Informatics Journal | 2010

Electronic health information in use: Characteristics that support employee workflow and patient care

Alissa L. Russ; Jason J. Saleem; Connie Justice; Heather Woodward-Hagg; Peter Woodbridge; Bradley N. Doebbeling

The aim of this investigation was to assess helpful and challenging aspects of electronic health information with respect to clinical workflow and identify a set of characteristics that support patient care processes. We conducted 20 semi-structured interviews at a Veterans Affairs Medical Center, with a fully implemented electronic health record (EHR), and elicited positive and negative examples of how information technology (IT) affects the work of healthcare employees. Responses naturally shed light on information characteristics that aid work processes. We performed a secondary analysis on interview data and inductively identified characteristics of electronic information that support healthcare workflow. Participants provided 199 examples of how electronic information affects workflow. Seventeen characteristics emerged along with four primary domains: trustworthy and reliable; ubiquitous; effectively displayed; and adaptable to work demands. Each characteristic may be used to help evaluate health information technology pre- and post-implementation. Results provide several strategies to improve EHR design and implementation to better support healthcare workflow.


The Joint Commission Journal on Quality and Patient Safety | 2012

Design and Implementation of a Hospital-Based Usability Laboratory: Insights from a Department of Veterans Affairs Laboratory for Health Information Technology

Alissa L. Russ; Michael W. Weiner; Scott A. Russell; Darrell A. Baker; W. Jeffrey Fahner; Jason J. Saleem

BACKGROUND Although the potential benefits of more usable health information technologies (HIT) are substantial (reduced HIT support costs, increased work efficiency, and improved patient safety), human factors methods to improve usability are rarely employed. The US Department of Veterans Affairs (VA) has emerged as an early leader in establishing usability laboratories to inform the design of HIT, including its electronic health record. Experience with a usability laboratory at a VA Medical Center provides insights on how to design, implement, and leverage usability laboratories in the health care setting. IMPLEMENTATION The VA Health Services Research and Development Service Human-Computer Interaction & Simulation Laboratory emerged as one of the first VA usability laboratories and was intended to provide research-based findings about HIT designs. This laboratory supports rapid prototyping, formal usability testing, and analysis tools to assess existing technologies, alternative designs, and potential future technologies. RESULTS OF IMPLEMENTATION Although the laboratory has maintained a research focus, it has become increasingly integrated with VA operations, both within the medical center and on a national VA level. With this resource, data-driven recommendations have been provided for the design of HIT applications before and after implementation. CONCLUSION The demand for usability testing of HIT is increasing, and information on how to develop usability laboratories for the health care setting is often needed. This article may assist other health care organizations that want to invest in usability resources to improve HIT. The establishment and utilization of usability laboratories in the health care setting may improve HIT designs and promote safe, high-quality care for patients.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2011

Paper Persistence and Computer-based Workarounds with the Electronic Health Record in Primary Care

Jason J. Saleem; Mindy E. Flanagan; Laura G. Militello; Nicole B. Arbuckle; Alissa L. Russ; A. Lucile Burgo-Black; Bradley N. Doebbeling

With the United States national goal and incentive program to transition from paper to electronic health records (EHRs), healthcare organizations are increasingly implementing EHRs and other related health information technology (IT). However, in institutions which have long adopted these computerized systems, such as the Veterans Health Administration, healthcare workers continue to rely on paper to complete their work. Furthermore, insufficient EHR design also results in computer-based workarounds. Using direct observation with opportunistic interviewing, we investigated the use of paper- and computer-based workarounds to the EHR with a multi-site study of 54 healthcare workers, including primary care providers, nurses, and other healthcare staff. Our analysis revealed several paper- and computer-based workarounds to the VA’s EHR. These workarounds, including clinician-designed information tools, provide evidence for how to enhance the design of the EHR to better support the needs of clinicians.


Psychiatric Services | 2016

Comparative Effectiveness of a Burnout Reduction Intervention for Behavioral Health Providers

Angela L. Rollins; Marina Kukla; Gary A. Morse; Louanne W. Davis; Michael P. Leiter; Maria Monroe-DeVita; Mindy E. Flanagan; Alissa L. Russ; Sara Wasmuth; Johanne Eliacin; Linda A. Collins; Michelle P. Salyers

OBJECTIVES Prior research found preliminary effectiveness for Burnout Reduction: Enhanced Awareness, Tools, Handouts, and Education (BREATHE), a daylong workshop for reducing burnout among behavioral health providers. Using a longer follow-up compared with prior research, this study compared the effectiveness of BREATHE and a control condition. METHODS Behavioral health providers (N=145) from three U.S. Department of Veterans Affairs facilities and two social service agencies were randomly assigned to BREATHE or person-centered treatment planning. Burnout and other outcomes were compared across groups over time. RESULTS Analyses yielded no significant differences between groups. However, BREATHE participants showed small but statistically significant improvements in cynicism (six weeks) and in emotional exhaustion and positive expectations for clients (six months). Participants in the control condition showed no significant changes over time. CONCLUSIONS Although it did not demonstrate comparative effectiveness versus a control condition, BREATHE could be strengthened and targeted toward both distressed providers and their organizations.


Annals of Pharmacotherapy | 2015

Design and Evaluation of an Electronic Override Mechanism for Medication Alerts to Facilitate Communication Between Prescribers and Pharmacists

Alissa L. Russ; Siying Chen; Brittany L. Melton; Jason J. Saleem; Michael W. Weiner; Jeffrey R. Spina; Joanne K. Daggy; Alan J. Zillich

Background: Computerized medication alerts can often be bypassed by entering an override rationale, but prescribers’ override reasons are frequently ambiguous to pharmacists who review orders. Objective: To develop and evaluate a new override mechanism for adverse reaction and drug-drug interaction alerts. We hypothesized that the new mechanism would improve usability for prescribers and increase the clinical appropriateness of override reasons. Methods: A counterbalanced, crossover study was conducted with 20 prescribers in a simulated prescribing environment. We modified the override mechanism timing, navigation, and text entry. Instead of free-text entry, the new mechanism presented prescribers with a predefined set of override reasons. We assessed usability (learnability, perceived efficiency, and usability errors) and used a priori criteria to evaluate the clinical appropriateness of override reasons entered. Results: Prescribers rated the new mechanism as more efficient (Wilcoxon signed-rank test, P = 0.032). When first using the new design, 5 prescribers had difficulty finding the new mechanism, and 3 interpreted the navigation to mean that the alert could not be overridden. The number of appropriate override reasons significantly increased with the new mechanism compared with the original mechanism (median change of 3.0; interquartile range = 3.0; P < 0.0001). Conclusions: When prescribers were given a menu-based choice for override reasons, clinical appropriateness of these reasons significantly improved. Further enhancements are necessary, but this study is an important first step toward a more standardized menu of override choices. Findings may be used to improve communication through e-prescribing systems between prescribers and pharmacists.

Collaboration


Dive into Alissa L. Russ's collaborations.

Top Co-Authors

Jason J. Saleem

Veterans Health Administration

Scott A. Russell

Veterans Health Administration

Himalaya Patel

Veterans Health Administration
