Todd A. Miano
University of Pennsylvania
Publications
Featured research published by Todd A. Miano.
Anesthesia & Analgesia | 2016
Michael Fabbro; Jacob T. Gutsche; Todd A. Miano; John G.T. Augoustides; Prakash A. Patel
BACKGROUND: The inflated costs and documented deleterious effects of excess perioperative transfusion have led to the investigation of targeted coagulation factor replacement strategies. One particular coagulation factor of interest is factor I (fibrinogen). Hypofibrinogenemia is typically tested for using time-consuming standard laboratory assays. The thrombelastography (TEG)-based functional fibrinogen level (FLEV) provides an assessment of whole blood clot under platelet inhibition to report calculated fibrinogen levels in significantly less time. If FLEV values obtained on cardiopulmonary bypass (CPB) during rewarming are similar to values obtained immediately after the discontinuation of CPB, then rewarming values could be used for preemptive ordering of appropriate blood product therapy. METHODS: Fifty-one cardiac surgery patients were enrolled into this prospective nonrandomized study to compare rewarming fibrinogen values with postbypass values using TEG FLEV assays. Baseline, rewarming, and postbypass fibrinogen values were recorded for all patients using both standard laboratory assay (Clauss method) and FLEV. Mixed-effects regression models were used to examine the change in TEG FLEV values over time. Bland-Altman analysis was used to examine bias and the limits of agreement (LOA) between the standard laboratory assay and FLEVs. RESULTS: Forty-nine patients were included in the analysis. The mean FLEV value during rewarming was 333.9 mg/dL compared with 332.8 mg/dL after protamine, corresponding to an estimated difference of −1.1 mg/dL (95% confidence interval [CI], −25.8 to 23.6; P = 0.917). Rewarming values were available on average 47 minutes before postprotamine values. Bland-Altman analysis showed poor agreement between FLEV and standard assays: mean difference at baseline was 92.5 mg/dL (95% CI, 71.1 to 114.9), with a lower LOA of −56.5 mg/dL (95% CI, −94.4 to −18.6) and upper LOA of 242.4 mg/dL (95% CI, 204.5 to 280.3). The difference between assays increased after CPB and persisted after protamine administration. CONCLUSIONS: Our results revealed negligible change in FLEV values from the rewarming to postbypass periods, with a CI that does not include clinically meaningful differences. These findings suggest that rewarming samples could be utilized for ordering fibrinogen-specific therapies before discontinuation of CPB. Mean FLEV values were consistently higher than the reference standard at each time point. Moreover, bias was highly heterogeneous among samples, implying a large range of potential differences between assays for any 1 patient.
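The Bland-Altman agreement analysis used above follows a standard recipe: the bias is the mean of the paired differences, and the 95% limits of agreement are bias ± 1.96 SD of those differences. A minimal Python sketch, using made-up fibrinogen pairs rather than the study's data:

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired assays."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b                  # paired differences, e.g., FLEV - Clauss
    bias = diff.mean()            # mean difference = bias
    sd = diff.std(ddof=1)         # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical fibrinogen pairs in mg/dL (illustration only, not study data)
flev   = [410, 385, 402, 371, 399]
clauss = [318, 305, 296, 284, 310]
bias, lo, hi = bland_altman(flev, clauss)
print(f"bias = {bias:.1f} mg/dL; LOA = ({lo:.1f}, {hi:.1f})")
```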
Anesthesia & Analgesia | 2017
Jacob T. Gutsche; Mark E. Mikkelsen; Fenton H. McCarthy; Todd A. Miano; William J. Vernick; Harish Ramakrishna; Prakash A. Patel; Yianni Augoustides; Wilson Y. Szeto; Nimesh D. Desai; Meghan B. Lane-Fall; Matthew L. Williams
When clinicians consider extracorporeal life support (ECLS) for acute respiratory distress syndrome (ARDS) patients with hemodynamic instability, both veno-arterial (VA) and veno-venous (VV) ECLS are therapeutic possibilities. We analyzed 17 patients with ARDS on inotropic or vasopressor support requiring ECLS for refractory hypoxemia. After VV ECLS was implemented, pressor requirements (based on norepinephrine equivalents) were significantly lower in all patients (P = .0001 for the overall comparison across time points). None of the 17 patients required conversion from VV ECLS to VA ECLS (95% confidence interval, 0%-20.0%). In this sample of 17 patients with substantial baseline vasopressor support and hypoxemic respiratory failure, initiation of VV ECLS was associated with reduced pressor requirements. Such a strategy may help avoid the complications of VA ECLS in patients with both respiratory and hemodynamic failure.
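The "norepinephrine equivalents" metric referenced above collapses a multi-pressor regimen onto a single dose scale. Conversion schemes vary between papers and the abstract does not state which one was used, so the factors below are assumptions for illustration only:

```python
# Assumed conversion factors (one common scheme; schemes differ between
# studies and the abstract does not specify which was used)
NE_EQUIV_FACTORS = {
    "norepinephrine": 1.0,   # reference drug, mcg/kg/min
    "epinephrine":    1.0,   # mcg/kg/min
    "dopamine":       0.01,  # mcg/kg/min, i.e., dose / 100
    "phenylephrine":  0.1,   # mcg/kg/min, i.e., dose / 10
}

def norepi_equivalents(doses: dict[str, float]) -> float:
    """Collapse a multi-pressor regimen into a single
    norepinephrine-equivalent dose in mcg/kg/min."""
    return sum(NE_EQUIV_FACTORS[drug] * dose for drug, dose in doses.items())

# Hypothetical patient on two pressors
print(norepi_equivalents({"norepinephrine": 0.12, "dopamine": 5.0}))  # 0.17
```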
Clinical Journal of The American Society of Nephrology | 2018
Todd A. Miano; Ebbing Lautenbach; F. Perry Wilson; Wensheng Guo; Yuliya Borovskiy; Sean Hennessy
BACKGROUND AND OBJECTIVES Despite colistin's long-reported association with nephrotoxicity, the attributable risk and timing of toxicity onset are still unknown. Whether substantial toxicity occurs during the initial 72 hours of exposure has important implications for early treatment decisions. The objective of this study was to compare colistin-exposed patients with a matched control group given other broad-spectrum antibiotics. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS We conducted a retrospective cohort study in patients treated for multidrug-resistant Pseudomonas, Klebsiella, or Acinetobacter spp. Colistin-exposed patients were matched to unexposed controls using propensity scores. AKI was defined according to the Kidney Disease Improving Global Outcomes creatinine criteria. Incidence rate ratios and risk differences of AKI in the matched cohort were estimated with the generalized estimating equation Poisson regression model. Risk factors for AKI were tested for effect modification in the matched cohort. RESULTS The study included 150 propensity-matched pairs with similar types of infection, similar delays to effective treatment, and similar baseline characteristics. Incidence of AKI was 77 of 150 (51%) in the colistin group versus 33 of 150 (22%) in matched controls (risk difference, 29%; 95% confidence interval, 19 to 39), corresponding to a number needed to harm of 3.5. Early toxicity was apparent, because AKI risk was higher in colistin-exposed patients at 72 hours of exposure (incidence rate ratio, 1.9; 95% confidence interval, 1.1 to 3.5). In both groups, hospital mortality in patients who experienced AKI was lower if kidney function returned to baseline during hospitalization. The effect of colistin exposure on AKI risk varied inversely according to baseline hemoglobin concentration. CONCLUSIONS Colistin is associated with substantial excess AKI that is apparent within the first 72 hours of treatment. Colistin's toxicity varied according to baseline hemoglobin concentration. PODCAST This article contains a podcast at https://www.asn-online.org/media/podcast/CJASN/2018_03_15_CJASNPodcast_18_4_M.mp3.
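Propensity-score matching of the kind described here is commonly implemented as a logistic-regression score followed by greedy 1:1 nearest-neighbor matching on the logit with a caliper. A minimal sketch on simulated covariates (scikit-learn assumed; this is not the authors' code):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated cohort: three baseline covariates; treated = colistin exposure
n = 600
X = rng.normal(size=(n, 3))
treated = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1]))))

# 1. Propensity score from a logistic regression on baseline covariates
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
logit = np.log(ps / (1 - ps))

# 2. Greedy 1:1 nearest-neighbor matching on the logit, with a caliper
caliper = 0.2 * logit.std()
control_idx = np.flatnonzero(treated == 0)
pairs, used = [], set()
for i in np.flatnonzero(treated == 1):
    candidates = [c for c in control_idx if c not in used]
    if not candidates:
        break
    best = min(candidates, key=lambda c: abs(logit[c] - logit[i]))
    if abs(logit[best] - logit[i]) <= caliper:
        pairs.append((i, best))
        used.add(best)

print(f"matched {len(pairs)} treated/control pairs")
```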
JAMA Ophthalmology | 2017
Katherine E. Uyhazi; Todd A. Miano; Wei Pan; Brian L. VanderBeek
Importance Novel oral anticoagulation and antiplatelet therapies have become a mainstay of treatment for thromboembolic disease. However, the safety profile of these medications has not been completely characterized. Objective To determine the risk of developing intraocular hemorrhages with novel oral antithrombotic therapy compared with that of traditional antithrombotic agents. Design, Setting, and Participants In this retrospective cohort study, a large national insurance claims database was used to generate 2 parallel analyses. All patients with incident use of dabigatran etexilate or rivaroxaban between January 1, 2010, and September 30, 2015, were compared with patients with incident use of warfarin sodium. Similarly, patients with new use of prasugrel hydrochloride were compared with those with new use of clopidogrel bisulfate. Both analyses required the patient to be in the insurance plan for at least 24 months prior to initiation of therapy and excluded patients with any previous diagnosis of intraocular hemorrhages or any prescription for the comparator medications. Furthermore, the antiplatelet analysis required a diagnosis of acute coronary syndrome or a myocardial infarction within 60 days of initiation of pharmacologic therapy. The anticoagulant analysis excluded patients with end-stage renal disease, renal transplant, or heart valve disease. Main Outcomes and Measures Incident intraocular hemorrhages at 90 and 365 days. Multivariate Cox proportional hazards regression models were used to compare the hazard ratio (HR) of developing an intraocular hemorrhage in individuals taking novel agents compared with those taking traditional medications. Results A total of 146 137 patients taking warfarin (76 714 women and 69 423 men; mean [SD] age, 69.8 [11.8] years) were compared with 64 291 patients taking dabigatran or rivaroxaban (31 576 women and 32 715 men; mean [SD] age, 67.6 [11.7] years). Cox proportional hazards regression revealed a decreased hazard for developing an intraocular hemorrhage with dabigatran or rivaroxaban at 365 days (HR, 0.75; 95% CI, 0.58-0.97; P = .03), but not at 90 days (HR, 0.73; 95% CI, 0.22-2.63; P = .13). A total of 103 796 patients taking clopidogrel (37 578 women and 66 218 men; mean [SD] age, 68.0 [11.3] years) were compared with 8386 patients taking prasugrel (1988 women and 6380 men; mean [SD] age, 61.0 [9.6] years) and no increased hazard for developing an intraocular hemorrhage with prasugrel was seen at 90 days (HR, 0.75; 95% CI, 0.29-1.92; P = .55) or 365 days (HR, 1.19; 95% CI, 0.69-2.04; P = .53). Conclusions and Relevance These results suggest a decreased risk of intraocular hemorrhage associated with novel direct thrombin inhibitors and direct factor Xa inhibitors, but no difference for P2Y12 inhibitors compared with traditional vitamin K anticoagulation and antiplatelet therapy, respectively.
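The new-user cohort comparison above hinges on a Cox proportional hazards model of time to first intraocular hemorrhage. A minimal sketch using the lifelines library on simulated data; the column names, rates, and covariate set are invented, not the study's:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 2000

# Simulated new-user cohort; 1 = dabigatran/rivaroxaban, 0 = warfarin
df = pd.DataFrame({
    "novel_agent": rng.binomial(1, 0.3, n),
    "age": rng.normal(68, 12, n),
    "time": rng.exponential(300, n).clip(1, 365),  # follow-up days, capped at 365
    "event": rng.binomial(1, 0.02, n),             # intraocular hemorrhage flag
})

# Hazard ratio for the novel agent, adjusted for age
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```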
Dimensions of Critical Care Nursing | 2017
Juliane Jablonski; Jaime Robenolt Gray; Todd A. Miano; Gretchen Redline; Heather Teufel; Tara Collins; Jose Pascual-Lopez; Martha Sylvia; Niels D. Martin
Background: Societal guidelines exist for the management of pain, agitation, and delirium (PAD) in critically ill patients. This contemporary practice aims for a more awake and interactive patient. Institutions are challenged to translate the interrelated multivariable concepts of PAD into daily clinical practice and to demonstrate improvement in quality outcomes. This study examined whether an interdisciplinary, goal-directed approach could improve outcomes in high-acuity surgical critical care during the early stages of implementation. Methods: This study was a prospective preintervention and postintervention design. A formal PAD clinical practice guideline targeting standardized assessment and “light” levels of sedation was instituted. All mechanically ventilated patients admitted to a 24-bed surgical intensive care unit (ICU) at an academic medical center during a 6-month period were included (3 months before and 3 months after implementation). Sedation and agitation were measured using the Richmond Agitation Sedation Scale (RASS), pain using a Behavioral or Numeric Pain Scale (NPS/BPS), and delirium using the Confusion Assessment Method for the Intensive Care Unit. Total ventilator days with exposure to continuous opioid or sedative infusions and total ICU days on which the patient received a physical activity session out of bed were recorded. Results: There were 106 patients (54 preintervention and 52 postintervention). Mean percentage of RASS scores between 0 and −1 increased from 38% to 50% postintervention (P < .02). Mean percentage of NPS/BPS scores within the goal range (<5 for BPS and <3 for NPS) remained stable, 86% to 83% (P = .16). Use of continuous opioid infusions for mechanically ventilated patients decreased: the mean percentage of total ventilator days with a continuous opioid infusion was 65% before implementation versus 47% after implementation (P < .01). Mean percentage of ICU days with physical activity sessions increased from 24% to 41% (P < .001). Overall mean ventilator-free days and ICU length of stay were 5.4 vs 4.5 days (P = .29) and 11.75 vs 9.5 days (P = .20), respectively. Conclusion: Measurable patient outcomes are achievable in the early stages of PAD guideline initiatives and can inform future systems-level organizational change. Pain, agitation, and delirium assessment tools form the foundation for clinical implementation and evaluation. High-acuity surgical critical care patients can achieve more time at goal RASS, decreased ventilator days, and less exposure to continuous opioid infusions, all while maintaining stable analgesia.
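The pre/post comparisons above contrast proportions of patient-days meeting a goal (e.g., RASS 0 to −1). A sketch of a pooled two-proportion test with statsmodels; the abstract reports mean percentages per patient, so this is only an approximation, and the denominators below are hypothetical, chosen to mirror the reported 38% and 50%:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical patient-day counts; denominators are not in the abstract
days_at_goal = [190, 260]   # pre-intervention, post-intervention
total_days   = [500, 520]

z, p = proportions_ztest(days_at_goal, total_days)
print(f"pre = {days_at_goal[0] / total_days[0]:.0%}, "
      f"post = {days_at_goal[1] / total_days[1]:.0%}, p = {p:.3f}")
```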
Critical Care Medicine | 2017
Meghan B. Lane-Fall; Todd A. Miano; Jaya Aysola; John G.T. Augoustides
Objectives: Diversity in the physician workforce is essential to providing culturally effective care. In critical care, despite the high stakes and frequency with which cultural concerns arise, it is unknown whether physician diversity reflects that of critically ill patients. We sought to characterize demographic trends in critical care fellows, who represent the emerging intensivist workforce. Design: We used published data to create logistic regression models comparing annual trends in the representation of women and racial/ethnic groups across critical care fellowship types. Setting: United States Accreditation Council for Graduate Medical Education-approved residency and fellowship training programs. Subjects: Residents and fellows employed by Accreditation Council for Graduate Medical Education-accredited training programs from 2004 to 2014. Interventions: None. Measurements and Main Results: From 2004 to 2014, the number of critical care fellows increased annually, up 54.1% from 1,606 in 2004–2005 to 2,475 in 2013–2014. The proportion of female critical care fellows increased from 29.5% (2004–2005) to 38.3% (2013–2014) (p < 0.001). The absolute number of black fellows increased each year, but the proportion did not change significantly (5.1% in 2004–2005 vs 3.9% in 2013–2014; p = 0.92). Hispanic fellows increased in number from 124 (7.7%) in 2004–2005 to 216 (8.4%) in 2013–2014 (p = 0.015). The number of American Indian/Alaskan Native/Native Hawaiian/Pacific Islander fellows decreased from 15 (1.0%) to seven (0.3%) (p < 0.001). When compared with population estimates, female critical care fellows and those from racial/ethnic minorities were underrepresented in all years. Conclusions: The demographics of the emerging critical care physician workforce reflect underrepresentation of women and racial/ethnic minorities. Trends highlight increases in women and Hispanics and stable or decreasing representation of non-Hispanic underrepresented minority critical care fellows. Further research is needed to elucidate the reasons underlying persistent underrepresentation of racial and ethnic minorities in critical care fellowship programs.
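The annual-trend analysis described above can be framed as a binomial regression of representation on calendar year. A sketch with statsmodels; only the endpoint counts (1,606 and 2,475) and endpoint percentages (29.5% and 38.3%) come from the abstract, and the intervening yearly values are interpolated guesses, not the paper's data:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical yearly totals interpolated between the abstract's endpoints
years   = np.arange(2004, 2014)
fellows = np.linspace(1606, 2475, 10).round().astype(int)
female  = (fellows * np.linspace(0.295, 0.383, 10)).round().astype(int)

# Binomial GLM: log-odds of a fellow being female as a function of year
X = sm.add_constant(years - years.min())
fit = sm.GLM(np.column_stack([female, fellows - female]), X,
             family=sm.families.Binomial()).fit()
print(fit.params[1])   # positive slope = increasing representation per year
```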
The Journal of Clinical Pharmacology | 2018
Yoonsun Mo; Michael C. Thomas; Todd A. Miano; Leo I. Stemp; Julia T. Bonacum; Kathleen Hutchins; George E. Karras
Modafinil, a nonamphetamine cognition-enhancing agent, holds the potential to improve recovery from cognitive impairment after intensive care unit (ICU) admission. To date, however, there is a paucity of data on modafinil use in the ICU setting. The purpose of this study was to explore the role of modafinil in improving cognition in ICU patients. This retrospective cohort study evaluated a total of 60 ICU patients on any ventilatory support who were started on modafinil during their ICU stay from January 1, 2010, to March 19, 2016. The requirements of opioids and sedatives, as well as the lowest and average scores on the Glasgow Coma Scale (GCS) and Riker Sedation-Agitation Scale (SAS), were recorded in 6-hour periods during the 48 hours before and after the start of modafinil therapy. The average daily modafinil dose was 170 mg, given for a median duration of 9 days. Modafinil administration was associated with a small, nonsignificant increase in GCS of 0.34 points after controlling for age, baseline severity of illness, and changes in sedation and analgesia over time (95% CI, −0.34 to 0.73 points; P = .0743). No major modafinil-associated adverse effects were observed. Modafinil administration did not significantly improve cognitive function in ICU patients within 48 hours of initiation. However, given the lack of robust evidence, the impact of modafinil on overall patient outcomes in the ICU remains unclear and warrants further investigation.
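A before/after design with scores recorded in 6-hour periods around drug initiation is naturally modeled as a mixed-effects regression with a level shift at the start of therapy. A simulated sketch with statsmodels, as a plausible structure for such data rather than the authors' exact model specification:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Simulated GCS scores in 6-hour periods, 48 h before/after modafinil start
# (values are illustrative and not constrained to the 3-15 GCS range)
rows = []
for pid in range(60):
    base = rng.normal(10, 2)               # patient-specific baseline GCS
    for t in range(-8, 8):                 # sixteen 6-hour periods
        rows.append({"pid": pid, "time": t, "post": int(t >= 0),
                     "gcs": base + 0.3 * (t >= 0) + rng.normal(0, 1)})
df = pd.DataFrame(rows)

# Random intercept per patient; 'post' estimates the level change at initiation
fit = smf.mixedlm("gcs ~ time + post", df, groups="pid").fit()
print(fit.params["post"])
```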
Chest | 2018
Todd A. Miano; Adam Cuker; Jason D. Christie; Niels D. Martin; Brian P. Smith; Amy T. Makley; Wensheng Guo; Sean Hennessy
Background Enoxaparin 30 mg twice daily and dalteparin 5,000 units once daily are two common low-molecular-weight heparin (LMWH) thromboprophylaxis regimens used in the trauma population. Pharmacodynamic studies suggest that enoxaparin provides more potent anticoagulation than does dalteparin. Methods In 2009, our institution switched its formulary LMWH from enoxaparin to dalteparin, followed by a switch back to enoxaparin in 2013. Using a difference-in-differences design, we contrasted the change in the VTE rate accompanying the LMWH switch with the change in a control group of trauma patients given unfractionated heparin (UFH) during the same period. Results The study included 5,880 patients: enoxaparin period (enoxaparin, n = 2,371; UFH, n = 1,539) vs the dalteparin period (dalteparin, n = 1,046; UFH, n = 924). The VTE rate was unchanged in the LMWH group: 3.3/1,000 days in the enoxaparin period vs 3.8/1,000 days in the dalteparin period (rate ratio [RR], 1.16; 95% CI, 0.74-1.81). The rate was also unchanged in the UFH control subjects: 5.7/1,000 days in the enoxaparin period vs 5.2/1,000 days in the dalteparin period (RR, 0.92; 95% CI, 0.61-1.38). After confounding adjustment, the ratio of the change in VTE rate between the LMWH and UFH groups was near unity: RR, 1.06; 95% CI, 0.71-2.00. A secondary analysis excluding patients with delayed or interrupted prophylaxis (or both) shifted this estimate toward favoring enoxaparin, although not significantly: RR, 2.39; 95% CI, 0.80-7.09. Conclusions Our results suggest that dalteparin has effectiveness similar to that of enoxaparin in real-world trauma patients. Future research should investigate how the timing and consistency of prophylaxis affect LMWH effectiveness.
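The difference-in-differences contrast above can be reproduced in crude form from the abstract's event rates with a Poisson model that includes a group-by-period interaction and a person-time offset. A sketch using statsmodels; the counts are scaled from the reported rates per 1,000 days, so the result is the unadjusted ratio of rate ratios, not the paper's adjusted RR of 1.06:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Event counts scaled from the reported rates (assumed 10,000 days per cell)
df = pd.DataFrame({
    "lmwh":   [1, 1, 0, 0],      # 1 = LMWH group, 0 = UFH controls
    "dalte":  [0, 1, 0, 1],      # 1 = dalteparin period, 0 = enoxaparin period
    "events": [33, 38, 57, 52],  # VTE counts (3.3, 3.8, 5.7, 5.2 per 1,000 days)
    "days":   [10_000] * 4,
})

# Poisson GLM with a log person-time offset; the interaction term is the
# difference-in-differences estimate (a ratio of rate ratios)
fit = smf.glm("events ~ lmwh * dalte", data=df, offset=np.log(df["days"]),
              family=sm.families.Poisson()).fit()
print(np.exp(fit.params["lmwh:dalte"]))   # crude ratio of rate ratios, ~1.26
```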
Journal of Cardiothoracic and Vascular Anesthesia | 2017
Jacob T. Gutsche; Todd A. Miano; William J. Vernick; Jesse M. Raiten; C. Bermudez; Prashanth Vallabhajosyula; Karianna Milewski; Wilson Y. Szeto; Meghan B. Lane-Fall; Matthew L. Williams; Prakash A. Patel; Mark E. Mikkelsen; Cornel Chiu; Harish Ramakrishna; Jeremy Cannon; John G.T. Augoustides
OBJECTIVE To determine whether a mobile extracorporeal membrane oxygenation (ECMO) program reduces mortality during and after transport of patients requiring ECMO for acute respiratory distress syndrome. DESIGN Retrospective chart review. SETTING University-affiliated tertiary care hospitals. PARTICIPANTS Seventy-seven patients. INTERVENTIONS Introduction of a mobile ECMO program designed to facilitate the implementation of ECMO at outside hospitals for patients too unstable to be transported before cannulation. MEASUREMENTS AND MAIN RESULTS The 28-day in-hospital mortality was significantly lower in the post-mobile group (12/51 [23.5%] vs 12/24 [50%]; adjusted risk difference, 28.6%; 95% CI, 4.7-52.5; p = 0.011). CONCLUSIONS These findings suggest that patients with severe acute respiratory failure who require transport to a referral center for extracorporeal life support may benefit from the availability of a mobile extracorporeal life support team.
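The unadjusted analogue of the adjusted risk difference reported above is simple arithmetic: the difference in mortality proportions with a Wald interval. A sketch using the abstract's counts (the reported 28.6% estimate is covariate-adjusted, so this crude version differs slightly):

```python
import math

def risk_difference(e1, n1, e0, n0):
    """Unadjusted risk difference with a 95% Wald confidence interval."""
    p1, p0 = e1 / n1, e0 / n0
    rd = p1 - p0
    se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    return rd, rd - 1.96 * se, rd + 1.96 * se

# 28-day mortality from the abstract: 12/24 pre-mobile vs 12/51 post-mobile
rd, lo, hi = risk_difference(12, 24, 12, 51)
print(f"unadjusted RD = {rd:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```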
Journal of Antimicrobial Chemotherapy | 2015
Alex Ganetsky; Todd A. Miano; Mitchell E. Hughes; Robert H. Vonderheide; David L. Porter; Ran Reshef
OBJECTIVES Emerging data suggest that the combination of tacrolimus and the CCR5 antagonist maraviroc, both cytochrome P450-3A4 substrates, may be effective in preventing graft-versus-host disease in patients undergoing allogeneic HSCT. This study evaluated whether a pharmacokinetic interaction exists between these agents. METHODS The study included 36 allogeneic HSCT recipients who received maraviroc + tacrolimus and 43 recipients of tacrolimus alone. We used a difference-in-differences analysis to examine the change in the concentration/dose ratios of tacrolimus after the discontinuation of maraviroc. In addition, we analysed the concentrations and dose requirements of tacrolimus in the two groups. RESULTS There was no significant difference in tacrolimus concentration/dose ratios in patients receiving maraviroc + tacrolimus compared with tacrolimus alone. Upon discontinuation of maraviroc, the change in concentration/dose ratio was small and not significant relative to the control group, and the effect estimate was further attenuated after adjustment for confounders [-0.35 (ng/mL)/(mg/day); P = 0.46]. In addition, the change in mean tacrolimus dose after discontinuation of maraviroc was similar between the groups (0.12 mg/day; P = 0.56), as was the change in mean tacrolimus concentration (0.02 ng/mL; P = 0.97). CONCLUSIONS Our findings do not support a significant inhibitory effect of maraviroc on the metabolism of tacrolimus. These data demonstrate that this drug combination is safe and imply that the protective effect of maraviroc against graft-versus-host disease was not mediated through an increase in tacrolimus concentrations. These findings are important for the design of clinical trials that evaluate maraviroc in combination with cytochrome P450-3A4 substrates.