David K. Vawdrey
University of Utah
Publications
Featured research published by David K. Vawdrey.
Journal of the American Medical Informatics Association | 2007
David K. Vawdrey; Reed M. Gardner; R. Scott Evans; James F. Orme; Terry P. Clemmer; Loren W. Greenway; Frank A. Drews
OBJECTIVE: To evaluate the data quality of ventilator settings recorded by respiratory therapists using a computer charting application and to assess the impact of incorrect data on computerized ventilator management protocols.
DESIGN: An analysis of 29,054 charting events gathered over 12 months from 678 ventilated patients (1,736 ventilator days) in four intensive care units at a tertiary care hospital.
MEASUREMENTS: Ten ventilator settings were examined, including fraction of inspired oxygen (FiO2), positive end-expiratory pressure (PEEP), tidal volume, respiratory rate, peak inspiratory flow, and pressure support. Respiratory therapists entered values for each setting approximately every two hours using a computer charting application. Manually entered values were compared with data acquired automatically from ventilators using an implementation of the ISO/IEEE 11073 Medical Information Bus (MIB). Data quality was assessed by measuring the percentage of time that the two sources matched. Charting delay, defined as the interval between data observation and data entry, also was measured.
RESULTS: The percentage of time that settings matched ranged from 99.0% (PEEP) to 75.9% (low tidal volume alarm setting). The average charting delay for each charting event was 6.1 minutes, including an average of 1.8 minutes spent entering data in the charting application. In 559 (3.9%) of 14,263 suggestions generated by computerized ventilator management protocols, one or more manually charted setting values did not match the MIB data.
CONCLUSION: Even at institutions where manual charting of ventilator settings is performed well, automatic data collection can eliminate delays, improve charting efficiency, and reduce errors caused by incorrect data.
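The study's two data-quality measures (the match rate between charted and device-reported values, and the charting delay between observation and entry) reduce to simple aggregation over charting events. A minimal Python sketch, using a hypothetical record layout and toy values rather than the study's actual data, might look like:

```python
from datetime import datetime

# Hypothetical charting records: (setting, charted_value, device_value,
# observation_time, entry_time). Field names and values are illustrative.
records = [
    ("PEEP", 5.0, 5.0,
     datetime(2007, 1, 1, 8, 0), datetime(2007, 1, 1, 8, 4)),
    ("tidal_volume", 450, 500,
     datetime(2007, 1, 1, 10, 0), datetime(2007, 1, 1, 10, 8)),
    ("FiO2", 0.4, 0.4,
     datetime(2007, 1, 1, 12, 0), datetime(2007, 1, 1, 12, 6)),
]

# Share of charting events where the manual entry matched the
# automatically acquired (MIB) value.
matches = sum(1 for _, charted, device, _, _ in records if charted == device)
match_rate = 100.0 * matches / len(records)

# Average charting delay: minutes between observation and data entry.
delays = [(entry - obs).total_seconds() / 60
          for _, _, _, obs, entry in records]
avg_delay_min = sum(delays) / len(delays)

print(f"match rate: {match_rate:.1f}%")       # 66.7% on this toy data
print(f"avg delay: {avg_delay_min:.1f} min")  # 6.0 min on this toy data
```

The paper computes match rates per setting (hence the 99.0% PEEP vs. 75.9% alarm-setting spread); the same loop grouped by setting name would reproduce that breakdown.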
Academic Medicine | 2016
Lena Mamykina; David K. Vawdrey; George Hripcsak
Purpose: To understand how much time residents spend using computers compared with other activities, and what residents use computers for.
Method: This time and motion study was conducted in June and July 2010 at NewYork-Presbyterian/Columbia University Medical Center with seven residents (first-, second-, and third-year) on the general medicine service. An experienced observer shadowed residents during a single day shift, captured all their activities using an iPad application, and took field notes. The activities were captured using a validated taxonomy of clinical activities, expanded to describe computer-based activities with a greater level of detail.
Results: Residents spent 364.5 minutes (50.6%) of their shift time using computers, compared with 67.8 minutes (9.4%) interacting with patients. In addition, they spent 292.3 minutes (40.6%) talking with others in person, 186.0 minutes (25.8%) handling paper notes, 79.7 minutes (11.1%) in rounds, 80.0 minutes (11.1%) walking or waiting, and 54.0 minutes (7.5%) talking on the phone. Residents spent 685 minutes (59.6%) multitasking. Computer-based documentation activities amounted to 189.9 minutes (52.1%) of all computer-based activities time, with 128.7 minutes (35.3%) spent writing notes and 27.3 minutes (7.5%) reading notes composed by others.
Conclusions: The study showed that residents spent considerably more time interacting with computers (over 50% of their shift time) than in direct contact with patients (less than 10% of their shift time). Some of this may be due to an increasing reliance on computing systems for access to patient data, further exacerbated by inefficiencies in the design of the electronic health record.
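Each reported share is an activity's observed minutes divided by the shift length; because activities overlap (multitasking), the shares can sum to more than 100%. A minimal sketch, assuming a 12-hour (720-minute) shift, which is consistent with the reported figures, could be:

```python
# Observed minutes per activity, taken from the abstract above.
SHIFT_MINUTES = 720  # assumed 12-hour shift; not stated explicitly

activities = {
    "computer use": 364.5,
    "direct patient interaction": 67.8,
    "talking with others in person": 292.3,
    "handling paper notes": 186.0,
}

# Share of the shift spent on each activity. Categories overlap,
# so these percentages need not sum to 100.
shares = {name: 100.0 * minutes / SHIFT_MINUTES
          for name, minutes in activities.items()}

for name, pct in shares.items():
    print(f"{name}: {pct:.1f}% of shift")
```

With a 720-minute denominator, 364.5 minutes works out to 50.6% and 67.8 minutes to 9.4%, matching the percentages reported in the abstract.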
Journal of the American Medical Informatics Association | 2005
R. Scott Evans; Kyle V. Johnson; Vrena B. Flint; Tupper Kinder; Charles R. Lyon; William L. Hawley; David K. Vawdrey; George E. Thomsen
Mechanical ventilators are designed to generate alarms when patients become disconnected or experience other critical ventilator events. However, these alarms can blend in with the other familiar sounds of the intensive care unit. Ventilator alarms that go unnoticed for extended periods of time often result in permanent patient harm or death. We developed a system to monitor critical ventilator events through our existing hospital network. Whenever an event is identified, the new system takes control of every computer in the patient's intensive care unit and generates an enhanced audio and visual alert indicating that there is a critical ventilator event and identifying the room number. Once the alert is acknowledged or the event is corrected, all the computers are restored to their pre-alert status and/or application. This paper describes the development and implementation of this system and reports the initial results, user acceptance, and the resulting improvements in information availability and patient safety.
American Medical Informatics Association Annual Symposium | 2013
David K. Vawdrey; Daniel M. Stein; Matthew R. Fred; Susan Bostwick; Peter D. Stetson
American Medical Informatics Association Annual Symposium | 2010
Daniel M. Stein; David K. Vawdrey; Peter D. Stetson; Suzanne Bakken
American Medical Informatics Association Annual Symposium | 2009
Adam B. Wilcox; David K. Vawdrey; Yueh-Hsia Chen; Bruce Forman; George Hripcsak
American Medical Informatics Association Annual Symposium | 2008
David K. Vawdrey
American Medical Informatics Association Annual Symposium | 2015
Fernanda Polubriaginof; Nicholas P. Tatonetti; David K. Vawdrey
American Medical Informatics Association Annual Symposium | 2014
Jennifer E. Prey; S. Restaino; David K. Vawdrey
AMIA Joint Summits on Translational Science Proceedings | 2014
David K. Vawdrey; Chunhua Weng; David Herion; James J. Cimino