
Publication


Featured research published by Scott A. Russell.


Journal of the American Medical Informatics Association | 2014

Applying human factors principles to alert design increases efficiency and reduces prescribing errors in a scenario-based simulation.

Alissa L. Russ; Alan J. Zillich; Brittany L. Melton; Scott A. Russell; Siying Chen; Jeffrey R. Spina; Michael W. Weiner; Elizabette Johnson; Joanne Daggy; M. Sue McManus; Jason M. Hawsey; Anthony Puleo; Bradley N. Doebbeling; Jason J. Saleem

OBJECTIVE To apply human factors engineering principles to improve alert interface design. We hypothesized that incorporating human factors principles into alerts would improve usability, reduce workload for prescribers, and reduce prescribing errors. MATERIALS AND METHODS We performed a scenario-based simulation study using a counterbalanced, crossover design with 20 Veterans Affairs prescribers to compare original versus redesigned alerts. We redesigned drug-allergy, drug-drug interaction, and drug-disease alerts based upon human factors principles. We assessed usability (learnability of redesign, efficiency, satisfaction, and usability errors), perceived workload, and prescribing errors. RESULTS Although prescribers received no training on the design changes, prescribers were able to resolve redesigned alerts more efficiently (median (IQR): 56 (47) s) compared to the original alerts (85 (71) s; p=0.015). In addition, prescribers rated redesigned alerts significantly higher than original alerts across several dimensions of satisfaction. Redesigned alerts led to a modest but significant reduction in workload (p=0.042) and significantly reduced the number of prescribing errors per prescriber (median (range): 2 (1-5) compared to original alerts: 4 (1-7); p=0.024). DISCUSSION Aspects of the redesigned alerts that likely contributed to better prescribing include design modifications that reduced usability-related errors, providing clinical data closer to the point of decision, and displaying alert text in a tabular format. Displaying alert text in a tabular format may help prescribers extract information quickly and thereby increase responsiveness to alerts. CONCLUSIONS This simulation study provides evidence that applying human factors design principles to medication alerts can improve usability and prescribing outcomes.


Journal of the American Medical Informatics Association | 2014

You and me and the computer makes three: variations in exam room use of the electronic health record.

Jason J. Saleem; Mindy E. Flanagan; Alissa L. Russ; Carmit K. McMullen; Leora Elli; Scott A. Russell; Katelyn Bennett; Marianne S. Matthias; Shakaib U. Rehman; Mark D. Schwartz; Richard M. Frankel

Challenges persist on how to effectively integrate the electronic health record (EHR) into patient visits and clinical workflow, while maintaining patient-centered care. Our goal was to identify variations in, barriers to, and facilitators of the use of the US Department of Veterans Affairs (VA) EHR in ambulatory care workflow in order better to understand how to integrate the EHR into clinical work. We observed and interviewed 20 ambulatory care providers across three geographically distinct VA medical centers. Analysis revealed several variations in, associated barriers to, and facilitators of EHR use corresponding to different units of analysis: computer interface, team coordination/workflow, and organizational. We discuss our findings in the context of different units of analysis and connect variations in EHR use to various barriers and facilitators. Findings from this study may help inform the design of the next generation of EHRs for the VA and other healthcare systems.


Journal of Rehabilitation Research and Development | 2009

Comparison of two approaches to screen for dysphagia among acute ischemic stroke patients: Nursing admission screening tool versus National Institutes of Health Stroke Scale

Dawn M. Bravata; Virginia Daggett; Heather Woodward-Hagg; Teresa M. Damush; Laurie Plue; Scott A. Russell; George Allen; Linda S. Williams; Jaroslaw Harezlak; Neale R. Chumbler

This study assessed the positive and negative predictive values and the sensitivity and specificity of a nursing dysphagia screening tool and the National Institutes of Health Stroke Scale (NIHSS) for the identification of dysphagia in veterans hospitalized with ischemic stroke. A secondary objective was to evaluate the speech-language pathology consult rate before and after implementation of the nursing admission dysphagia screening tool. This retrospective cohort study evaluated veterans admitted to one Department of Veterans Affairs medical center with ischemic stroke during the 6 months both before and after the implementation of a nursing dysphagia screening tool, which was part of the admission nursing template. Stroke severity was measured with the use of the retrospective NIHSS. Dysphagia diagnosis was based on speech-language pathology evaluations. Dysphagia was present in 38 of 101 patients (38%) with ischemic stroke. The nursing dysphagia screening tool had a positive predictive value of 50% and a negative predictive value of 68%, with a sensitivity of 29% and specificity of 84%. The use of the NIHSS to identify dysphagia risk had a positive predictive value of 60% and a negative predictive value of 84%. The NIHSS had better test characteristics in predicting dysphagia than the nursing dysphagia screening tool. Future research should evaluate the use of the NIHSS as a screening tool for dysphagia.


The Joint Commission Journal on Quality and Patient Safety | 2012

Design and Implementation of a Hospital-Based Usability Laboratory: Insights from a Department of Veterans Affairs Laboratory for Health Information Technology

Alissa L. Russ; Michael W. Weiner; Scott A. Russell; Darrell A. Baker; W. Jeffrey Fahner; Jason J. Saleem

BACKGROUND Although the potential benefits of more usable health information technologies (HIT) are substantial (reduced HIT support costs, increased work efficiency, and improved patient safety), human factors methods to improve usability are rarely employed. The US Department of Veterans Affairs (VA) has emerged as an early leader in establishing usability laboratories to inform the design of HIT, including its electronic health record. Experience with a usability laboratory at a VA Medical Center provides insights on how to design, implement, and leverage usability laboratories in the health care setting. IMPLEMENTATION The VA Health Services Research and Development Service Human-Computer Interaction & Simulation Laboratory emerged as one of the first VA usability laboratories and was intended to provide research-based findings about HIT designs. This laboratory supports rapid prototyping, formal usability testing, and analysis tools to assess existing technologies, alternative designs, and potential future technologies. RESULTS OF IMPLEMENTATION Although the laboratory has maintained a research focus, it has become increasingly integrated with VA operations, both within the medical center and on a national VA level. With this resource, data-driven recommendations have been provided for the design of HIT applications before and after implementation. CONCLUSION The demand for usability testing of HIT is increasing, and information on how to develop usability laboratories for the health care setting is often needed. This article may assist other health care organizations that want to invest in usability resources to improve HIT. The establishment and utilization of usability laboratories in the health care setting may improve HIT designs and promote safe, high-quality care for patients.


Health Informatics Journal | 2014

Accessibility, usability, and usefulness of a Web-based clinical decision support tool to enhance provider–patient communication around Self-management TO Prevent (STOP) Stroke

Jane A. Anderson; Kyler M. Godwin; Jason J. Saleem; Scott A. Russell; Joshua J. Robinson; Barbara Kimmel

This article reports redesign strategies identified to create a Web-based user-interface for the Self-management TO Prevent (STOP) Stroke Tool. Members of a Stroke Quality Improvement Network (N = 12) viewed a visualization video of a proposed prototype and provided feedback on implementation barriers/facilitators. Stroke-care providers (N = 10) tested the Web-based prototype in think-aloud sessions of simulated clinic visits. Participants’ dialogues were coded into themes. Access to comprehensive information and the automated features/systematized processes were the primary accessibility and usability facilitator themes. The need for training, time to complete the tool, and computer-centric care were identified as possible usability barriers. Patient accountability, reminders for best practice, goal-focused care, and communication/counseling themes indicate that the STOP Stroke Tool supports the paradigm of patient-centered care. The STOP Stroke Tool was found to prompt clinicians on secondary stroke-prevention clinical-practice guidelines, facilitate comprehensive documentation of evidence-based care, and support clinicians in providing patient-centered care through the shared decision-making process that occurred while using the action-planning/goal-setting feature of the tool.


54th Human Factors and Ergonomics Society Annual Meeting (HFES 2010) | 2010

A Novel Tool to Track and Analyze Qualitative Usability Data: Lessons Learned from the VA's Personal Health Record

Scott A. Russell; Jason J. Saleem; David A. Haggstrom; Alissa L. Russ; Neale R. Chumbler

We developed a User-Testing Database to process a greater amount of user data from multiple sources, at a much finer level of granularity, and to support more sophisticated analysis, including specific queries of usability data, than a typical “manual” usability analysis allows. In this paper, we demonstrate our User-Testing Database as applied to a usability assessment of the Veterans Affairs (VA) My HealtheVet personal health record. The usability test included 24 Veterans who completed a series of scenarios in the usability lab at a Midwest Veterans Affairs Medical Center (VAMC). The User-Testing Database facilitated reduction of data gathered from video review, facilitator notes, and debrief notes into 1160 observations that were sorted into conceptual bins and summarized for the designers of the personal health record. From creation of the database to completion of the reports took four months and did not require extensive knowledge of qualitative analysis techniques. We argue that a User-Testing Database can allow other usability studies to increase the number of participants and the granularity of the data without prohibitively increasing the amount of time and experience required to process the data gathered.


Journal of Nursing Care Quality | 2015

Health care systems redesign project to improve dysphagia screening.

Virginia S. Daggett; Heather Woodward-Hagg; Teresa M. Damush; Laurie Plue; Scott A. Russell; George Allen; Linda S. Williams; Neale R. Chumbler; Dawn M. Bravata

The purpose of this project was to improve dysphagia-screening processes in a tertiary Veterans Affairs Medical Center. The dysphagia-screening tool was redesigned on the basis of frontline clinician feedback, clinical guidelines, user satisfaction, and multidisciplinary expertise. The revised tool triggered a speech-language consult for positive screens and demonstrated higher scores in user satisfaction and task efficiency. Systems redesign processes were effective for redesigning the tool and implementing practice changes with clinicians involved in dysphagia screening.


Journal of the American Medical Informatics Association | 2011

Lessons learned from usability testing of the VA's personal health record.

David A. Haggstrom; Jason J. Saleem; Alissa L. Russ; Josette Jones; Scott A. Russell; Neale R. Chumbler


The American Journal of Medicine | 2015

Reducing Prescribing Errors Through Creatinine Clearance Alert Redesign

Brittany L. Melton; Alan J. Zillich; Scott A. Russell; Michael W. Weiner; M. Sue McManus; Jeffrey R. Spina; Alissa L. Russ


AMIA | 2013

Development of Standardized Patient Scenarios for Usability Testing of Medication Alerts.

Brittany L. Melton; Jeffrey R. Spina; Alan J. Zillich; Jason J. Saleem; Michael W. Weiner; Scott A. Russell; Siying Chen; Alissa L. Russ
