Publication


Featured research published by Thomas C. Bailey.


International Conference on Embedded Networked Sensor Systems | 2010

Reliable clinical monitoring using wireless sensor networks: experiences in a step-down hospital unit

Octav Chipara; Chenyang Lu; Thomas C. Bailey; Gruia-Catalin Roman

This paper presents the design, deployment, and empirical study of a wireless clinical monitoring system that collects pulse and oxygen saturation readings from patients. The primary contribution of this paper is an in-depth clinical trial that assesses the feasibility of wireless sensor networks for patient monitoring in general hospital units. We present a detailed analysis of the system reliability from a long-term hospital deployment over seven months involving 41 patients in a step-down cardiology unit. The network achieved high reliability (median 99.68%, range 95.21%–100%). The overall reliability of the system was dominated by the sensing reliability of the pulse oximeters (median 80.85%, range 0.46%–97.69%). Sensing failures usually occurred in short bursts, although longer periods were also present due to sensor disconnections. We show that the sensing reliability could be significantly improved through oversampling and by implementing a disconnection alarm system that incurs minimal intervention cost. A retrospective data analysis indicated that the system provided sufficient temporal resolution to support the detection of clinical deterioration in three patients who suffered significant clinical events, including transfer to Intensive Care Units. These results indicate the feasibility and promise of using wireless sensor networks for continuous patient monitoring and clinical deterioration detection in general hospital units.
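The reported gains from oversampling can be sanity-checked with a toy model. The sketch below assumes independent read attempts within a reporting interval, which is optimistic given that the paper observed bursty failures; it illustrates the idea rather than the authors' implementation.

```python
# Toy model of oversampling: probability that at least one of k
# independent read attempts in a reporting interval yields a valid
# sample. Independence is an assumption; the paper notes failures
# actually occur in bursts, so real gains would be smaller.

def effective_reliability(p_single: float, reads_per_interval: int) -> float:
    return 1.0 - (1.0 - p_single) ** reads_per_interval

p = 0.8085  # median single-read sensing reliability reported above
for k in (1, 2, 3):
    print(f"{k} read(s)/interval -> {effective_reliability(p, k):.2%}")
# 1 read(s)/interval -> 80.85%
# 2 read(s)/interval -> 96.33%
# 3 read(s)/interval -> 99.30%
```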


Clinical Infectious Diseases | 2006

Effectiveness of Education and an Antibiotic-Control Program in a Tertiary Care Hospital in Thailand

Anucha Apisarnthanarak; Somwang Danchaivijitr; Thana Khawcharoenporn; Julajak Limsrivilai; Boonyasit Warachan; Thomas C. Bailey; Victoria J. Fraser

BACKGROUND We conducted a study to evaluate the impact of education and an antibiotic-control program on antibiotic-prescribing practices, antibiotic consumption, antimicrobial resistance, and cost of antibiotics in a tertiary care hospital in Thailand. METHODS A study of the year before and the year after the intervention was performed. Inpatient antibiotic prescriptions were prospectively observed. Demographic characteristics, hospital unit, indication for antibiotic prescription, appropriateness of antibiotic use, reasons for inappropriate antibiotic use, antibiotic consumption (i.e., the rate of antibiotic use), bacterial resistance, and antibiotic cost data were collected. Interventions included education, introduction of an antibiogram, use of antibiotic prescription forms, and prescribing controls. RESULTS After the intervention, there was a 24% reduction in the rate of antibiotic prescription (640 vs. 400 prescriptions/1000 admissions; P<.001). The incidence of inappropriate antibiotic use was significantly reduced (42% vs. 20%; P<.001). A sustained reduction in antibiotic use was observed (R²=0.692; P<.001). Rates of use of third-generation cephalosporins (31 vs. 18 defined daily doses [DDDs]/1000 patient-days; P<.001) and glycopeptides (3.2 vs. 2.4 DDDs/1000 patient-days; P=.002) were significantly reduced. Rates of use of cefazolin (3.5 vs. 8.2 DDDs/1000 patient-days; P<.001) and fluoroquinolones (0.68 vs. 1.15 DDDs/1000 patient-days; P<.001) increased. There were no significant changes for other antibiotic classes. Significant reductions in the incidence of infections due to methicillin-resistant Staphylococcus aureus (48% vs. 33.5%; P<.001), extended-spectrum beta-lactamase-producing Escherichia coli (33% vs. 21%; P<.001), extended-spectrum beta-lactamase-producing Klebsiella pneumoniae (30% vs. 20%; P<.001), and third-generation cephalosporin-resistant Acinetobacter baumannii (27% vs. 19%; P<.001) were also observed. Total cost savings were USD 32,231 during the study period. CONCLUSIONS Education and an antibiotic-control program constituted an effective and cost-saving strategy to optimize antibiotic use in a tertiary care center in Thailand.
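The consumption figures above are expressed in WHO defined daily doses (DDDs) per 1,000 patient-days, the standard normalization for comparing antibiotic use across periods. A minimal sketch of that calculation, using a hypothetical drug, DDD value, and dispensing volume:

```python
# DDDs per 1,000 patient-days: total quantity dispensed, divided by
# the WHO defined daily dose for the drug, normalized to patient-days.
# The inputs below are hypothetical; real analyses take DDD values
# from the WHO ATC/DDD index.

def ddds_per_1000_patient_days(total_grams: float,
                               who_ddd_grams: float,
                               patient_days: float) -> float:
    return (total_grams / who_ddd_grams) / patient_days * 1000.0

# Hypothetical: 900 g of a drug with a 2 g DDD over 25,000 patient-days.
print(ddds_per_1000_patient_days(900.0, 2.0, 25_000))  # 18.0
```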


American Journal of Transplantation | 2003

The Association of Cytomegalovirus Sero‐Pairing with Outcomes and Costs Following Cadaveric Renal Transplantation Prior to the Introduction of Oral Ganciclovir CMV Prophylaxis

Mark A. Schnitzler; Jeffrey A. Lowell; Karen L. Hardinger; Thomas C. Bailey; Daniel C. Brennan

Cytomegalovirus (CMV) is an important cause of morbidity, mortality and cost in cadaveric renal transplantation. This study was designed to document the clinical and economic outcomes associated with donor and recipient CMV sero-pairing. Data were drawn from the United States Renal Data System (USRDS) on 17,001 cadaveric renal transplant recipients transplanted between 1995 and 1997 with recorded donor and recipient CMV sero-status. In multivariate analysis, CMV-seropositive recipients were associated with a significantly higher incidence of delayed graft function, a lower incidence of graft loss, and lower costs than CMV-seronegative recipients. CMV-seropositive compared to seronegative donors were associated with a significantly higher incidence of CMV disease, graft loss, and higher costs when transplanted into CMV-seronegative recipients. However, CMV-seronegative donors into seropositive recipients had no significant association with outcome beyond a higher incidence of CMV disease compared to CMV-seronegative donor and recipient pairs. The outcomes associated with CMV-seropositive donors and seronegative recipients call for tailored management strategies, which may include avoidance of such mismatching, antiviral therapy, immunization, or modified immunosuppression.


American Journal of Kidney Diseases | 1997

The effects of cytomegalovirus serology on graft and recipient survival in cadaveric renal transplantation: Implications for organ allocation.

Mark A. Schnitzler; Robert S. Woodward; Daniel C. Brennan; Edward L. Spitznagel; William Claiborne Dunagan; Thomas C. Bailey

The potential benefits from allocating donated cadaveric kidneys based on donor and recipient cytomegalovirus (CMV) serology remain controversial. We estimated graft survival and recipient survival using bivariate Kaplan-Meier models and multivariate Cox proportional hazards models for 24,543 first cadaveric renal transplantations performed in the United States between 1989, coinciding with the introduction of ganciclovir, and 1994. The effects of donor and recipient CMV serology were estimated, and the implications of these estimates for CMV-based allocation of cadaveric kidneys were considered. From Kaplan-Meier estimates, the 3-year impact of CMV-seropositive donor kidneys was a 3.6% reduction in graft survival and a 2.4% reduction in recipient survival for CMV-seronegative recipients, and a 3.9% reduction in graft survival and a 3.0% reduction in recipient survival for CMV-seropositive recipients. Multivariate Cox analysis demonstrated an adverse impact of donor CMV seropositivity regardless of recipient CMV status. D-/R- CMV serologic pairs had the best 3-year outcomes, with 73.4% graft survival and 87.7% recipient survival. D+/R+ CMV serologic pairs were found to have the worst 3-year outcomes, with 68.4% graft survival and 83.1% recipient survival, and were significantly worse than D+/R- pairs in terms of recipient survival. The maximum estimated impact of a program allocating donor kidneys to maximize the number of D-/R- CMV serologic pairs, assuming no impact on HLA mismatches, was a 0.1% reduction in aggregate 3-year graft survival and a 0.2% reduction in aggregate recipient survival. An alternative program allocating donor kidneys to minimize the number of D+/R+ pairs had no estimated effect on either graft or recipient survival. We conclude that during the ganciclovir era, CMV continues to have an important impact on first cadaveric renal transplantation. However, even under ideal conditions, CMV-based kidney allocation to either maximize the number of D-/R- pairs or minimize the number of D+/R+ pairs is likely to provide little benefit to the population of cadaveric renal transplant recipients.


Journal of Hospital Medicine | 2013

A trial of a real-time alert for clinical deterioration in patients hospitalized on general medical wards.

Thomas C. Bailey; Yixin Chen; Yi Mao; Chenyang Lu; Gregory Hackmann; Scott T. Micek; Kevin M. Heard; Kelly Faulkner; Marin H. Kollef

BACKGROUND With limited numbers of intensive care unit (ICU) beds available, increasing patient acuity is expected to contribute to episodes of inpatient deterioration on general wards. OBJECTIVE To prospectively validate a predictive algorithm for clinical deterioration in general-medical ward patients, and to conduct a trial of real-time alerts based on this algorithm. DESIGN Randomized, controlled crossover study. SETTING/PATIENTS Academic center with patients hospitalized on 8 general wards between July 2007 and December 2011. INTERVENTIONS Real-time alerts were generated by an algorithm designed to predict the need for ICU transfer using electronically available data. The alerts were sent by text page to the nurse manager on intervention wards. MEASUREMENTS Intensive care unit transfer, hospital mortality, and hospital length of stay. RESULTS Patients meeting the alert threshold were at nearly 5.3-fold greater risk of ICU transfer (95% confidence interval [CI]: 4.6-6.0) than those not satisfying the alert threshold (358 of 2353 [15.2%] vs 512 of 17678 [2.9%]). Patients with alerts were at 8.9-fold greater risk of death (95% CI: 7.4-10.7) than those without alerts (244 of 2353 [10.4%] vs 206 of 17678 [1.2%]). Among patients identified by the early warning system, there were no differences in the proportion of patients who were transferred to the ICU or who died in the intervention group as compared with the control group. CONCLUSIONS Real-time alerts were highly specific for clinical deterioration resulting in ICU transfer and death, and were associated with longer hospital length of stay. However, an intervention notifying a nurse of the risk did not result in improvement in these outcomes.
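The risk ratios quoted above follow directly from the raw counts in brackets. A quick check, assuming the standard log-RR Wald interval (the abstract does not state which CI method was used):

```python
# Reproduce the reported relative risks and 95% CIs from the counts
# in the abstract. The Wald interval on log(RR) is an assumption
# about the authors' method; it happens to match the reported values.
from math import exp, log, sqrt

def relative_risk(a: int, n1: int, c: int, n2: int, z: float = 1.96):
    rr = (a / n1) / (c / n2)
    se = sqrt(1/a - 1/n1 + 1/c - 1/n2)  # SE of log(RR)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

print(relative_risk(358, 2353, 512, 17678))  # ICU transfer: ~(5.3, 4.6, 6.0)
print(relative_risk(244, 2353, 206, 17678))  # death:        ~(8.9, 7.4, 10.7)
```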


The Joint Commission Journal on Quality and Patient Safety | 2009

Clinical validation of the AHRQ postoperative venous thromboembolism patient safety indicator.

Katherine E. Henderson; Angela Recktenwald; Richard M. Reichley; Thomas C. Bailey; Brian Waterman; Rebecca L. Diekemper; P. Storey; Belinda Ireland; Wm. Claiborne Dunagan

BACKGROUND The Agency for Healthcare Research and Quality (AHRQ) patient safety indicators (PSIs) screen for potentially preventable complications in hospitalized patients using hospital administrative data. The PSI for postoperative venous thromboembolism (VTE) relies on International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes for deep vein thrombosis (DVT) or pulmonary embolism (PE) in secondary diagnoses fields. In a clinical validation study of the PSI for postoperative VTE, natural language processing (NLP), supplemented by pharmacy and billing data, was used to identify VTE events missed by medical records coders. METHODS In a retrospective review of postsurgical discharges, charts were processed using the AHRQ PSI software. Cases were identified as possible false negatives by flagging charts for possible VTEs using pharmacy and billing data to identify all patients who were therapeutically anticoagulated or had placement of an inferior vena caval filter. All charts were reviewed by a physician blinded to screening results. Physician interpretation was considered the gold standard for VTE classification. RESULTS The AHRQ PSI had a positive predictive value (PPV) of .545 (95% confidence interval [CI], .453-.634) and a negative predictive value (NPV) of .997 (95% CI, .995-.999). Sensitivity was .87 and specificity was .98. Secondary coding review suggested that all 9 false-negative results were miscoded; if they had been properly coded, the sensitivity would increase to 1.00. Most false-positive cases resulted from superficial venous clots identified by the PSI due to coding ambiguity. DISCUSSION The VTE PSI performed well as a screening tool but generated a significant number of false-positive cases, a problem that could be substantially reduced with improved coding methods.
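The abstract reports the summary metrics but not the underlying 2x2 table. The sketch below shows the standard definitions; the counts are hypothetical values chosen to be consistent with the reported figures (only FN = 9 is stated):

```python
# Screening-metric definitions. The counts are NOT from the paper:
# they are hypothetical values consistent with the reported PPV .545,
# NPV .997, sensitivity .87, specificity .98, and 9 false negatives.

def screen_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "PPV":  tp / (tp + fp),   # true alerts among all alerts
        "NPV":  tn / (tn + fn),   # true negatives among all negatives
        "sens": tp / (tp + fn),   # events caught by the screen
        "spec": tn / (tn + fp),   # non-events correctly passed over
    }

print(screen_metrics(tp=60, fp=50, fn=9, tn=2450))
# {'PPV': 0.545..., 'NPV': 0.996..., 'sens': 0.869..., 'spec': 0.98}
```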


Infection Control and Hospital Epidemiology | 1994

Screening of physicians for tuberculosis.

Victoria J. Fraser; Charles Kilo; Thomas C. Bailey; Gerald Medoff; Wm. Claiborne Dunagan

OBJECTIVE To determine the prevalence of tuberculous infection among a sample of physicians at Barnes Hospital and to determine the frequency of tuberculin skin testing and the adequacy of follow-up for physicians with positive tuberculin skin tests. DESIGN Convenience sample. SETTING 1,000-bed, university-affiliated tertiary care hospital. SUBJECTS Physicians attending departmental conferences were screened for tuberculosis. Prior history of tuberculosis, antituberculous therapy, BCG vaccination, and previous tuberculin skin test results were obtained with a standardized questionnaire. Tuberculin skin tests were performed on those who were previously skin-test negative. OUTCOME MEASURE Tuberculosis infection, prophylactic therapy. RESULTS Eighty-six (24.5%) of 351 physicians in the study were skin test positive by history or currently performed skin test. Of 61 who reported a previously reactive skin test, 40 (66%) had been eligible for isoniazid prophylaxis, but only 15 (37.5%) of 40 had completed at least six months of therapy. Of 290 physicians reporting a previously negative skin test, 25 conversions (8.6%) were identified. Previously undiagnosed, asymptomatic pulmonary tuberculosis was identified in one physician. CONCLUSIONS Infection with Mycobacterium tuberculosis is common among physicians. Physicians were screened irregularly for tuberculosis, and the use of prophylactic therapy was inconsistent. Aggressive tuberculosis screening programs for healthcare workers should be instituted (Infect Control Hosp Epidemiol 1994;15:95-100).


Journal of the American Medical Informatics Association | 2009

Computerized Surveillance for Adverse Drug Events in a Pediatric Hospital

Peter M. Kilbridge; Laura A. Noirot; Richard M. Reichley; Kathleen M. Berchelmann; Cortney Schneider; Kevin M. Heard; Miranda Nelson; Thomas C. Bailey

There are limited data on adverse drug event rates in pediatrics. The authors describe the implementation and evaluation of an automated surveillance system modified to detect adverse drug events (ADEs) in pediatric patients. The authors constructed an automated surveillance system to screen admissions to a large pediatric hospital. Potential ADEs identified by the system were reviewed by medication safety pharmacists and a physician and scored for causality and severity. Over the 6-month study period, 6,889 study children were admitted to the hospital for a total of 40,250 patient-days. The ADE surveillance system generated 1,226 alerts, which yielded 160 true ADEs. This represents a rate of 2.3 ADEs per 100 admissions, or 4 per 1,000 patient-days. Medications most frequently implicated were diuretics, antibiotics, immunosuppressants, narcotics, and anticonvulsants. The composite positive predictive value of the ADE surveillance system was 13%. Automated surveillance can be an effective method for detecting ADEs in hospitalized children.


Infection Control and Hospital Epidemiology | 2008

Automated Surveillance for Central Line-Associated Bloodstream Infection in Intensive Care Units

Keith F. Woeltje; Anne M. Butler; Ashleigh J. Goris; Nhial T. Tutlam; Joshua A. Doherty; M. Brandon Westover; Vicky Ferris; Thomas C. Bailey

OBJECTIVE To develop and evaluate computer algorithms with high negative predictive values that augment traditional surveillance for central line-associated bloodstream infection (CLABSI). SETTING Barnes-Jewish Hospital, a 1,250-bed tertiary care academic hospital in Saint Louis, Missouri. METHODS We evaluated all adult patients in intensive care units who had blood samples collected during the period from July 1, 2005, to June 30, 2006, that were positive for a recognized pathogen on culture. Each isolate recovered from culture was evaluated using the definitions for nosocomial CLABSI provided by the National Healthcare Safety Network of the Centers for Disease Control and Prevention. Using manual surveillance by infection prevention specialists as the gold standard, we assessed the ability of various combinations of dichotomous rules to determine whether an isolate was associated with a CLABSI. Sensitivity, specificity, and predictive values were calculated. RESULTS Infection prevention specialists identified 67 cases of CLABSI associated with 771 isolates recovered from blood samples. The algorithms excluded approximately 40%–62% of the isolates from consideration as possible causes of CLABSI. The simplest algorithm, with 2 dichotomous rules (ie, the collection of blood samples more than 48 hours after admission and the presence of a central venous catheter within 48 hours before collection of blood samples), had the highest negative predictive value (99.4%) and the lowest specificity (44.2%) for CLABSI. Augmentation of this algorithm with rules for common skin contaminants confirmed by another positive blood culture result yielded a negative predictive value of 99.2% and a specificity of 68.0%. CONCLUSIONS An automated approach to surveillance for CLABSI that is characterized by a high negative predictive value can accurately identify and exclude positive culture results not representing CLABSI from further manual surveillance.
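The simplest two-rule screen is easy to sketch. Below is a minimal illustration with a hypothetical data model; only the 48-hour thresholds come from the abstract, and the real system's record format and full rule set are not specified here:

```python
# Sketch of the two-rule CLABSI screen described above. Isolates that
# fail either rule are excluded from manual review; the rules are
# tuned for a high negative predictive value, not for precision.
# The BloodIsolate fields are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class BloodIsolate:
    admitted_at: datetime       # hospital admission time
    drawn_at: datetime          # blood sample collection time
    cvc_within_prior_48h: bool  # central venous catheter present
                                # within 48 h before collection

def needs_manual_review(iso: BloodIsolate) -> bool:
    drawn_after_48h = iso.drawn_at - iso.admitted_at > timedelta(hours=48)
    return drawn_after_48h and iso.cvc_within_prior_48h
```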


Surgery | 1997

Impact of cytomegalovirus serology on graft survival in living related kidney transplantation: Implications for donor selection

Mark A. Schnitzler; Robert S. Woodward; Daniel C. Brennan; Edward L. Spitznagel; William Claiborne Dunagan; Thomas C. Bailey

BACKGROUND The impact of cytomegalovirus in living related kidney transplantation remains controversial. This study considers the implications of donor and recipient cytomegalovirus (CMV) serology for the selection of living related donor. METHODS Graft survival was estimated by using the bivariate Kaplan-Meier method and multivariate Cox proportional hazards analysis for 7659 living related first transplantations performed in the United States between 1989 and 1994. The effects of donor CMV serology were estimated with respect to recipient CMV serology and compared with human leukocyte antigen (HLA) matching, transplantation, donor, and recipient characteristics. The implications of these estimates for the selection of living related donors were considered. RESULTS From Kaplan-Meier estimates, donor CMV-seropositive kidneys were associated with significantly reduced graft survival for CMV-seronegative recipients (p = 0.0002) but not CMV-seropositive recipients (p = 0.1623). These findings were verified by use of Cox proportional hazards analysis accounting for covariate factors. The impact of donor CMV-seropositive kidneys on CMV-seronegative recipients was similar to one HLA-DR match, greater than one HLA-B match, and significantly greater than one HLA-A match (p = 0.0331). CONCLUSIONS Results identify donor CMV serology as an important determinant of transplantation outcome for living related first kidney transplant recipients who are themselves CMV seronegative. Consideration should be given to donor and recipient CMV serology when selecting an appropriate donor for living related kidney transplantation.

Collaboration


Dive into Thomas C. Bailey's collaboration.

Top Co-Authors

Laura A. Noirot, Washington University in St. Louis
Wm. Claiborne Dunagan, Washington University in St. Louis
Victoria J. Fraser, Washington University in St. Louis
Chenyang Lu, Washington University in St. Louis
Yixin Chen, Washington University in St. Louis