Matthew F. Niedner
University of Michigan
Publication
Featured research published by Matthew F. Niedner.
Pediatrics | 2011
Marlene R. Miller; Matthew F. Niedner; W. Charles Huskins; Elizabeth Colantuoni; Gayane Yenokyan; Michele Moss; Tom B. Rice; Debra Ridling; Deborah Campbell; Richard J. Brilli
OBJECTIVES: To evaluate the long-term impact of pediatric central line care practices in reducing PICU central line–associated bloodstream infection (CLA-BSI) rates and to evaluate the added impact of chlorhexidine scrub and chlorhexidine-impregnated sponges. METHODS: A 3-year, multi-institutional, interrupted time-series design (October 2006 to September 2009), with historical control data, was used. A nested, 18-month, nonrandomized, factorial design was used to evaluate 2 additional interventions. Twenty-nine PICUs were included. Two central line care bundles (insertion and maintenance bundles) and 2 additional interventions (chlorhexidine scrub and chlorhexidine-impregnated sponges) were used. CLA-BSI rates (January 2004 to September 2009), insertion and maintenance bundle compliance rates (October 2006 to September 2009), and chlorhexidine scrub and chlorhexidine-impregnated sponge compliance rates (January 2008 to June 2009) were assessed. RESULTS: The average aggregate baseline PICU CLA-BSI rate decreased 56% over 36 months from 5.2 CLA-BSIs per 1000 line-days (95% confidence interval [CI]: 4.4–6.2 CLA-BSIs per 1000 line-days) to 2.3 CLA-BSIs per 1000 line-days (95% CI: 1.9–2.9 CLA-BSIs per 1000 line-days) (rate ratio: 0.44 [95% CI: 0.37–0.53]; P < .0001). No statistically significant differences in CLA-BSI rate decreases between PICUs using or not using either of the 2 additional interventions were found. CONCLUSIONS: Focused attention on consistent adherence to the use of pediatrics-specific central line insertion and maintenance bundles produced sustained, continually decreasing PICU CLA-BSI rates. Additional use of either chlorhexidine for central line entry scrub or chlorhexidine-impregnated sponges did not produce any statistically significant additional reduction in PICU CLA-BSI rates.
Science Translational Medicine | 2015
Robert J. Morrison; Scott J. Hollister; Matthew F. Niedner; Maryam Ghadimi Mahani; Albert H. Park; Deepak Mehta; Richard G. Ohye; Glenn E. Green
Patient-specific, image-based design coupled with 3D biomaterial printing produced personalized implants for treatment of collapsed airways in patients with tracheobronchomalacia. Printing in 4D: Personalized implants The 3D printing revolution is in full swing, with frequent reports of printed kidneys and jaws, dolls and cars, food, and body armor. The new challenge is to make 3D materials evolve in the fourth dimension: time. Such “4D” materials could change in response to temperature, light, or even stress, making them adaptable and enduring. In pediatric medicine, 4D implants become particularly relevant; as the patient grows, so, too, should the material. Morrison et al. used 3D printing technology with a safe, bioresorbable polymer blend to create splints for three pediatric patients with tracheobronchomalacia (TBM)—a condition of excessive collapse of the airways during normal breathing. Currently available fixed-size implants can migrate and require frequent resizing. Thus, the authors used imaging and computational models to design the splints for each TBM patient’s individual geometries, structuring the implants to accommodate airway growth and prevent external compression over a period of time, before being resorbed by the body. In all three patients (one with two airways splinted), the 4D devices were implanted without issue. All four implants were stable and functional after 1 month, and one implant has remained in place, keeping the airway open for over 3 years. This pilot trial demonstrates that the fourth dimension is a reality for 3D-printed materials, and with continued human studies, 4D biomaterials promise to change the way we envision the next generation of regenerative medicine. Three-dimensional (3D) printing offers the potential for rapid customization of medical devices. 
The advent of 3D-printable biomaterials has created the potential for device control in the fourth dimension: 3D-printed objects that exhibit a designed shape change under tissue growth and resorption conditions over time. Tracheobronchomalacia (TBM) is a condition of excessive collapse of the airways during respiration that can lead to life-threatening cardiopulmonary arrests. We demonstrate the successful application of 3D printing technology to produce a personalized medical device for treatment of TBM, designed to accommodate airway growth while preventing external compression over a predetermined time period before bioresorption. We implanted patient-specific 3D-printed external airway splints in three infants with severe TBM. At the time of publication, these infants no longer exhibited life-threatening airway disease and had demonstrated resolution of both pulmonary and extrapulmonary complications of their TBM. Long-term data show continued growth of the primary airways. This process has broad application for medical manufacturing of patient-specific 3D-printed devices that adjust to tissue growth through designed mechanical and degradation behaviors over time.
BMJ Quality & Safety | 2016
D Goodman; G Ogrinc; L Davies; Gr Baker; Jane Barnsteiner; Tc Foster; K Gali; J Hilden; Leora I. Horwitz; Heather C. Kaplan; Jerome A. Leis; Jc Matulis; Susan Michie; R Miltner; J Neily; William A. Nelson; Matthew F. Niedner; B Oliver; Lori Rutman; Richard Thomson; Johan Thor
Since its publication in 2008, SQUIRE (Standards for Quality Improvement Reporting Excellence) has contributed to the completeness and transparency of reporting of quality improvement work, providing guidance to authors and reviewers of reports on healthcare improvement work. In the interim, enormous growth has occurred in understanding factors that influence the success, and failure, of healthcare improvement efforts. Progress has been particularly strong in three areas: the understanding of the theoretical basis for improvement work; the impact of contextual factors on outcomes; and the development of methodologies for studying improvement work. Consequently, there is now a need to revise the original publication guidelines. To reflect the breadth of knowledge and experience in the field, we solicited input from a wide variety of authors, editors and improvement professionals during the guideline revision process. This Explanation and Elaboration document (E&E) is a companion to the revised SQUIRE guidelines, SQUIRE 2.0. The product of collaboration by an international and interprofessional group of authors, this document provides examples from the published literature, and an explanation of how each reflects the intent of a specific item in SQUIRE. The purpose of the guidelines is to assist authors in writing clearly, precisely and completely about systematic efforts to improve the quality, safety and value of healthcare services. Authors can explore the SQUIRE statement, this E&E and related documents in detail at http://www.squire-statement.org.
BMJ Quality & Safety | 2011
Katherine M Abstoss; Brenda E. Shaw; Tonie Owens; Julie Juno; Elaine Commiskey; Matthew F. Niedner
Objective This study analyses patterns in reporting rates of medication errors, rates of medication errors with harm, and responses to the Safety Attitudes Questionnaire (SAQ), all in the context of four cultural and three system-level interventions for medication safety in an intensive care unit. Methods Over a period of 2.5 years (May 2007 to November 2009), seven overlapping interventions to improve medication safety and reporting were implemented: a poster tracking ‘days since last medication error resulting in harm’, a continuous slideshow showing performance metrics in the staff lounge, multiple didactic curricula, unit-wide emails summarising medication errors, computerised physician order entry, introduction of unit-based pharmacy technicians for medication delivery, and patient safety report form streamlining. The reporting rate of medication errors and errors with harm were analysed over time using statistical process control. SAQ responses were collected annually. Results Subsequent to the interventions, the reporting rate of medication errors increased 25%, from an average of 3.16 to 3.95 per 10 000 doses dispensed (p<0.09), while the rate of medication errors resulting in harm decreased 71%, from an average of 0.56 to 0.16 per 10 000 doses dispensed (p<0.01). The SAQ showed improvement in all 13 survey items related to medication safety, five of which were significant (p<0.05). Conclusion Actively developing a transparent and positive safety culture at the unit level can improve medication safety. System-level mechanisms to promote medication safety are likely important factors that enable safety culture to translate into better outcomes, but may be independently ineffective in the face of poor safety culture.
Infection Control and Hospital Epidemiology | 2011
Matthew F. Niedner; W. Charles Huskins; Elizabeth Colantuoni; John Muschelli; J. Mitchell Harris; Tom B. Rice; Richard J. Brilli; Marlene R. Miller
OBJECTIVE Describe central line-associated bloodstream infection (CLA-BSI) epidemiology in pediatric intensive care units (PICUs). DESIGN Descriptive study (29 PICUs); cohort study (18 PICUs). SETTING PICUs in a national improvement collaborative. PATIENTS/PARTICIPANTS Patients admitted October 2006 to December 2007 with 1 or more central lines. METHODS CLA-BSIs were prospectively identified using the National Healthcare Safety Network definition and then readjudicated using the revised 2008 definition. Risk factors for CLA-BSI were examined using age-adjusted, time-varying Cox proportional hazards models. RESULTS In the descriptive study, the CLA-BSI incidence was 3.1/1,000 central line-days; readjudication with the revised definition resulted in a 17% decrease. In the cohort study, the readjudicated incidence was 2.0/1,000 central line-days. Ninety-nine percent of patients were CLA-BSI-free through day 7, after which the daily risk of CLA-BSI doubled to 0.27% per day. Compared with patients with respiratory diagnoses (most prevalent category), CLA-BSI risk was higher in patients with gastrointestinal diagnoses (hazard ratio [HR], 2.7 [95% confidence interval {CI}, 1.43-5.16]; P < .002) and oncologic diagnoses (HR, 2.6 [CI, 1.06-6.45]; P = .037). Among all patients, including those with more than 1 central line, CLA-BSI risk was lower among patients with a central line inserted in the jugular vein (HR, 0.43 [CI, 0.30-0.95]; P < .03). CONCLUSIONS The 2008 CLA-BSI definition change decreased the measured incidence. The daily CLA-BSI risk was very low in patients during the first 7 days of catheterization but doubled thereafter. The risk of CLA-BSI was lower in patients with lines inserted in the jugular vein and higher in patients with gastrointestinal and oncologic diagnoses. These patients are target populations for additional study and intervention.
Congenital Heart Disease | 2010
Matthew F. Niedner; Jennifer Foley; Robert H. Riffenburgh; David P. Bichell; Bradley M. Peterson; Alexander Rodarte
OBJECTIVE B-type natriuretic peptide (BNP) has diagnostic, prognostic, and therapeutic roles in adults with heart failure. BNP levels in children undergoing surgical repair of congenital heart disease (CHD) were characterized broadly, and distinguishable subgroup patterns delineated. DESIGN Prospective, blinded, observational case series. SETTING Academic, tertiary care, free-standing pediatric hospital. PATIENTS Children with CHD; controls without cardiopulmonary disease. INTERVENTIONS None. MEASUREMENTS Preoperative cardiac medications/doses, CHD lesion types, perioperative BNP levels, intraoperative variables (lengths of surgery, bypass, cross-clamp), postoperative outcomes (lengths of ventilation, hospitalization, open chest; averages of inotropic support, central venous pressure, perfusion, urine output; death, low cardiac output syndrome [LCOS], cardiac arrest; readmission; and discharge medications). RESULTS Median BNP levels for 102 neonatal and non-neonatal controls were 27 and 7 pg/mL, respectively. Serial BNP measures from 105 patients undergoing CHD repair demonstrated a median postoperative peak at 12 hours. The median and interquartile postoperative 24-hour average BNP levels for neonates were 1506 (782-3784) pg/mL vs. 286 (169-578) pg/mL for non-neonates (P < 0.001). Postoperative BNP correlated with inotropic requirement, durations of open chest, ventilation, intensive care unit stay, and hospitalization (r = 0.33-0.65, all P < 0.001). Compared with biventricular CHD, Fontan palliations demonstrated lower postoperative BNP (median 150 vs. 306 pg/mL, P < 0.001), a 3-fold higher incidence of LCOS (P < 0.01), and longer length of hospitalization (median 6.0 vs. 4.5 days, P = 0.01). CONCLUSIONS Perioperative BNP correlates to severity of illness and lengths of therapy in the CHD population, overall. Substantial variation in BNP across time as well as within and between CHD lesions limits its practical utility as an isolated point-of-care measure.
BNP commonly peaks 6-12 hours postoperatively, but the timing and magnitude of BNP elevation demonstrates notable age-dependency, peaking earlier and rising an order of magnitude higher in neonates. In spite of higher clinical acuity, non-neonatal univentricular CHD paradoxically demonstrates lower BNP levels compared with biventricular physiologies.
Pediatric Blood & Cancer | 2013
Sung W. Choi; Lawrence Chang; David A. Hanauer; Jacqueline Shaffer-Hartman; Daniel H. Teitelbaum; Ian Lewis; Alex Blackwood; Nur Akcasu; Janell Steel; Joy Christensen; Matthew F. Niedner
Pediatric hematology–oncology (PHO) patients are at significant risk for developing central line‐associated bloodstream infections (CLA‐BSIs) due to their prolonged dependence on such catheters. Effective strategies to eliminate these preventable infections are urgently needed. In this study, we investigated the implementation of bundled central line maintenance practices and their effect on hospital‐acquired CLA‐BSIs.
BMJ Quality & Safety | 2011
Ali A Cheema; Annette M Scott; Karen J Shambaugh; Jacqueline Shaffer-Hartman; Ronald E. Dechert; Susan M. Hieber; John Gosbee; Matthew F. Niedner
Objective To describe the washout effect after stopping a prevention checklist for ventilator-associated pneumonia (VAP). Methods VAP rates were prospectively monitored for special cause variation over 42 months in a paediatric intensive care unit. A VAP prevention bundle was implemented, consisting of head of bed elevation, oral care, suctioning device management, ventilator tubing care, and standard infection control precautions. Key practices of the bundle were implemented with a checklist and subsequently incorporated into the nursing and respiratory care bedside flow sheets to achieve long-term sustainability. Compliance with the VAP bundle was monitored throughout. The timeline for the project was retrospectively categorised into the benchmark phase, the checklist phase (implementation), the checklist washout phase, and the flowsheet phase (cues in the flowsheet). Results During the checklist phase (12 months), VAP bundle compliance rose from <50% to >75% and the VAP rate fell from 4.2 to 0.7 infections per 1000 ventilator days (p<0.059). Unsolicited qualitative feedback from frontline staff described overburdensome documentation requirements, form fatigue, and checklist burnout. During the checklist washout phase (4 months), VAP rates rose to 4.8 infections per 1000 ventilator days (p<0.042). In the flowsheet phase, the VAP rate dropped to 0.8 infections per 1000 ventilator days (p<0.047). Conclusions Salient cues to drive provider behaviour towards best practice are helpful to sustain process improvement, and cessation of such cues should be approached warily. Initial education, year-long habit formation, and effective early implementation demonstrated no appreciable effect on the VAP rate during the checklist washout period.
Pediatric Critical Care Medicine | 2013
Theresa Mottes; Tonie Owens; Matthew F. Niedner; Julie Juno; Thomas P. Shanley; Michael Heung
Purpose: To describe our experience with transitions in both nursing model and educational training program for delivery of continuous renal replacement therapy (CRRT). There have been very few comparisons between different care and educational models, and the optimal approach remains uncertain. In particular, we evaluated our experience with introducing a simulation-based educational model. Design: Prospective quality control observational study. Setting: The ICU of a tertiary care pediatric referral center. Patients: All patients undergoing CRRT from July 2007 through July 2010 were included. Measurements and Main Results: We monitored CRRT filter life during a transition from a collaborative to critical care nursing model, and subsequently during a transition from a didactic education program to simulation-based training. During the study period, 80 patients underwent CRRT with use of 343 filters. Process control charts demonstrated a significant increase in filter life and a decrease in unplanned filter changes. Both of these signals emerged at the same time and corresponded with the introduction of the simulation-based education program. Further statistical analysis showed that filter life improved from 42.5 hours (18.2–66.4 hr) during the didactic education program to 59.4 hours (22.2–76.4 hr) during the simulation-based education program (p = 0.008). This relationship persisted when excluding nonpreventable premature filter discontinuations and in a multivariate model that accounted for other potential influences on filter life. Conclusions: We report on the impact of transitioning between different educational programs for CRRT, specifically with the introduction of a simulation-based approach. We observed a significant and sustained improvement in the delivery of CRRT as demonstrated by a marked increase in filter lifespan.
Pediatrics | 2015
Anastasia K. Ketko; Craig M. Martin; Michelle Nemshak; Matthew F. Niedner; Rebecca Vartanian
BACKGROUND: After the implementation of narrowed oxygen saturation alarms, alarm frequency increased in the C.S. Mott Children's Hospital NICU, which could have a negative impact on patient safety. The Joint Commission on the Accreditation of Healthcare Organizations issued a Sentinel Event Alert for hospitals in 2013 to improve alarm safety, resulting in a 2014 National Patient Safety Goal requiring institutional policies and procedures to be in place to manage alarms. METHODS: A multidisciplinary improvement team developed an alarm management bundle applying strategies to decrease alarm frequency, which included evaluating existing strategies and developing patient care–based and systems-based interventions. The total number of delivered and detected saturation alarms and high saturation alarms and the total time spent within a targeted saturation range were quantitatively tracked. Nursing morale was assessed qualitatively. RESULTS: SpO2 alarms per monitored patient-day increased from 78 to 105 after the narrowing of alarm limits. Modification of the high saturation alarm algorithm substantially decreased the delivery and escalation of high pulse oxygen saturation (SpO2) alarms. During a pilot period, using histogram technology to individually customize alarm limits resulted in increased time spent within the targeted saturation range and fewer alarms per day. Qualitatively, nurses reported improved satisfaction when not assigned >1 infant with frequent alarms, as identified by an alarm frequency tool. CONCLUSIONS: Alarm fatigue may detrimentally affect patient care and safety. Alarm management strategies should coincide with oxygen management within a NICU, especially in single-patient-bed units.