Todd C. Crawford
Johns Hopkins University School of Medicine
Featured research published by Todd C. Crawford.
Journal of Controlled Release | 2017
Fan Zhang; J. Trent Magruder; Yi An Lin; Todd C. Crawford; Joshua C. Grimm; Christopher M. Sciortino; Mary Ann Wilson; Mary E. Blue; Sujatha Kannan; Michael V. Johnston; William A. Baumgartner; Rangaramanujam M. Kannan
Hypothermic circulatory arrest (HCA) provides neuroprotection during cardiac surgery but entails an ischemic period that can lead to excitotoxicity, neuroinflammation, and subsequent neurologic injury. Hydroxyl polyamidoamine (PAMAM) dendrimers target activated microglia and damaged neurons in the injured brain and deliver therapeutics in small and large animal models. We investigated the effect of dendrimer size on brain uptake and explored the pharmacokinetics in a clinically relevant canine model of HCA-induced brain injury. Generation 6 (G6, ~6.7 nm) dendrimers showed extended blood circulation times and increased accumulation in the injured brain compared to generation 4 dendrimers (G4, ~4.3 nm), which were undetectable in the brain by 48 h after final administration. High levels of G6 dendrimers were found in the cerebrospinal fluid (CSF) of injured animals, with a CSF/serum ratio of ~20% at peak, a ratio higher than that of many neurologic pharmacotherapies already in clinical use. Brain penetration (measured by CSF/serum drug level) of G6 dendrimers correlated with the severity of neuroinflammation observed. G6 dendrimers also showed a decreased renal clearance rate and slightly increased liver and spleen uptake compared to G4 dendrimers. These results, in a large animal model, may offer insights into the potential clinical translation of dendrimers.
The Annals of Thoracic Surgery | 2015
Joshua C. Grimm; Vicente Valero; Arman Kilic; Todd C. Crawford; John V. Conte; Christian A. Merlo; Pali D. Shah; Ashish S. Shah
BACKGROUND The aim of this study was to determine which factors predict poor postoperative performance and to evaluate the impact of these variables on 1-year mortality. METHODS The United Network for Organ Sharing database was queried for adult patients undergoing lung transplantation (LTx) from 2007 to 2011. Patients were divided into 3 groups based on their preoperative Karnofsky Performance Status (KPS) score. Regression analysis was conducted to determine which factors predicted poor postoperative performance. Cox modeling was utilized to identify which of these factors was associated with an increased risk of mortality after LTx. RESULTS Of the 7,832 patients included in this study, 30.1% required complete assistance, 57.7% required partial assistance, and 12.3% needed no assistance preoperatively. Postoperative KPS was assessed at a mean of 2.6 ± 1.5 years after transplant. A number of factors, including primary graft failure, redo and single LTx, and intensive care unit status prior to LTx, independently predicted poor performance, whereas a body mass index of 18.5 kg/m2 or greater and some degree of preoperative functional independence were protective. Age greater than 60 years, donor tobacco use, intensive care unit status, extracorporeal membrane oxygenation support, and mechanical ventilation prior to LTx were associated with an increased risk of 1-year mortality, while preoperative functional independence and a body mass index of 18.5 to 30 kg/m2 were protective. CONCLUSIONS This is the largest known study to examine disability in LTx and its relationship to mortality. Preoperative performance status significantly impacts post-LTx mortality. Patient optimization may improve outcomes and should inform decisions regarding graft selection and allocation.
The Journal of Thoracic and Cardiovascular Surgery | 2017
J. Trent Magruder; Todd C. Crawford; Herbert Lynn Harness; Joshua C. Grimm; Alejandro Suarez-Pierre; Chad Wierschke; Jim Biewer; Charles W. Hogue; Glenn R. Whitman; Ashish S. Shah; Viachaslau Barodka
Background: We sought to determine whether a pilot goal‐directed perfusion initiative could reduce the incidence of acute kidney injury after cardiac surgery. Methods: On the basis of the available literature, we identified goals to achieve during cardiopulmonary bypass (including maintenance of oxygen delivery >300 mL O2/min/m2 and reduction in vasopressor use) that were combined into a goal‐directed perfusion initiative and implemented as a quality improvement measure in patients undergoing cardiac surgery at Johns Hopkins during 2015. Goal‐directed perfusion initiative patients were matched to controls who underwent cardiac surgery between 2010 and 2015 using propensity scoring across 15 variables. The primary and secondary outcomes were the incidence of acute kidney injury and the mean increase in serum creatinine within the first 72 hours after cardiac surgery. Results: We used the goal‐directed perfusion initiative in 88 patients and matched these to 88 control patients who were similar across all variables, including mean age (61 years in controls vs 64 years in goal‐directed perfusion initiative patients, P = .12) and preoperative glomerular filtration rate (90 vs 83 mL/min, P = .34). Controls received more phenylephrine on cardiopulmonary bypass (mean 2.1 vs 1.4 mg, P < .001) and had lower nadir oxygen delivery (mean 241 vs 301 mL O2/min/m2, P < .001). Acute kidney injury incidence was 23.9% in controls and 9.1% in goal‐directed perfusion initiative patients (P = .008); incidences of acute kidney injury stage 1, 2, and 3 were 19.3%, 3.4%, and 1.1% in controls, and 5.7%, 3.4%, and 0% in goal‐directed perfusion initiative patients, respectively. Control patients exhibited a larger median percent increase in creatinine from baseline (27% vs 10%, P < .001). Conclusions: The goal‐directed perfusion initiative was associated with reduced acute kidney injury incidence after cardiac surgery in this pilot study.
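The oxygen-delivery goal in the abstract above (maintaining DO2 > 300 mL O2/min/m2 on bypass) follows from pump flow and arterial oxygen content. A minimal sketch of that arithmetic, using the standard arterial oxygen content formula (the function names and the hemoglobin/flow values are illustrative assumptions, not data from the study, and the hemoglobin-binding constant 1.36 varies slightly by convention):

```python
def arterial_o2_content(hb_g_dl: float, sao2: float, pao2_mmhg: float = 100.0) -> float:
    """Arterial O2 content (CaO2) in mL O2/dL: hemoglobin-bound plus dissolved O2."""
    return 1.36 * hb_g_dl * sao2 + 0.003 * pao2_mmhg

def indexed_o2_delivery(flow_index_l_min_m2: float, cao2_ml_dl: float) -> float:
    """Indexed O2 delivery (DO2i) in mL O2/min/m^2; the factor 10 converts dL to L."""
    return flow_index_l_min_m2 * cao2_ml_dl * 10.0

# Illustrative hemodiluted patient on bypass: Hb 8 g/dL, SaO2 100%, flow index 2.4 L/min/m^2
cao2 = arterial_o2_content(8.0, 1.0)   # ~11.2 mL O2/dL
do2i = indexed_o2_delivery(2.4, cao2)  # ~268 mL O2/min/m^2, below the 300 goal
print(f"DO2i = {do2i:.0f} mL O2/min/m^2")
```

At these assumed values the goal is missed, which is why such protocols raise pump flow or transfuse to raise hemoglobin.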
The Journal of Thoracic and Cardiovascular Surgery | 2017
Nishant D. Patel; Todd C. Crawford; J. Trent Magruder; Diane E. Alejo; Narutoshi Hibino; James H. Black; Harry C. Dietz; Luca A. Vricella; Duke E. Cameron
Objectives: Early experience with Loeys‐Dietz syndrome (LDS) suggested an aggressive aortopathy with high risk of aneurysm dissection and rupture at young ages and at smaller aortic diameters than in other connective tissue disorders. We reviewed our experience with LDS to re‐examine our indications and outcomes of surgical management. Methods: We reviewed all patients with a diagnosis of LDS who underwent cardiovascular surgery at our institution. The primary endpoint was mortality, and secondary endpoints included postoperative complications and need for reintervention. Results: Seventy‐nine operated patients with LDS were identified. Mean age at first operation was 25 years, 39 (49%) were female, and 38 (48%) were children (age <18 years). Six (8%) patients presented with acute dissection. Five (6%) patients had a bicuspid aortic valve, and all presented with an ascending aortic aneurysm with a mean root diameter of 3.5 cm. Twenty (25%) patients had a previous sternotomy. Sixty‐five (82%) patients underwent aortic root replacement, of whom 52 underwent a valve‐sparing operation and 4 had concomitant arch replacement. Mean aortic root diameter in this group was 4.2 cm. Nine (11%) patients underwent aortic arch replacement, 2 (3%) had isolated ascending aorta replacement, and 3 (4%) underwent open thoracoabdominal repair. There were 2 (3%) operative deaths and 8 late deaths. Nineteen patients underwent subsequent operations for late aneurysm and/or dissection. Mean follow‐up was 6 years (range 0‐24 years). Kaplan‐Meier survival was 88% at 10 years. Conclusions: Growing experience with LDS has confirmed early impressions of its aggressive nature and proclivity toward aortic catastrophe. Surgical outcomes are favorable, but reintervention rates are high. Meticulous follow‐up with cardiovascular surveillance imaging remains important for management, particularly as clinical LDS subtypes are characterized and more tailored treatment is developed.
The Annals of Thoracic Surgery | 2017
J. Trent Magruder; Elena Blasco-Colmenares; Todd C. Crawford; Diane Alejo; John V. Conte; Rawn Salenger; Clifford E. Fonner; Christopher C. Kwon; Jennifer Bobbitt; James M. Brown; Mark G. Nelson; Keith A. Horvath; Glenn R. Whitman
BACKGROUND Variation in red blood cell (RBC) transfusion practices exists at cardiac surgery centers across the nation. We tested the hypothesis that significant variation in RBC transfusion practices between centers in our state's cardiac surgery quality collaborative remains even after risk adjustment. METHODS Using a multiinstitutional statewide database created by the Maryland Cardiac Surgery Quality Initiative (MCSQI), we included patient-level data from 8,141 patients undergoing isolated coronary artery bypass (CAB) or aortic valve replacement at 1 of 10 centers. Risk-adjusted multivariable logistic regression models were constructed to predict the need for any intraoperative RBC transfusion, as well as for any postoperative RBC transfusion, with anonymized center number included as a factor variable. RESULTS Unadjusted intraoperative RBC transfusion probabilities at the 10 centers ranged from 13% to 60%; postoperative RBC transfusion probabilities ranged from 16% to 41%. After risk adjustment with demographic, comorbidity, and operative data, significant intercenter variability was documented (intraoperative probability range, 4%-59%; postoperative probability range, 13%-39%). When stratifying patients by preoperative hematocrit quartiles, significant variability in intraoperative transfusion probability was seen among all quartiles (lowest quartile: mean hematocrit value, 30.5% ± 4.1%; probability range, 17%-89%; highest quartile: mean hematocrit value, 44.8% ± 2.5%; probability range, 1%-35%). CONCLUSIONS Significant variation in intercenter RBC transfusion practices exists for both intraoperative and postoperative transfusions, even after risk adjustment, among our state's centers. Variability in intraoperative RBC transfusion persisted across quartiles of preoperative hematocrit values.
American Journal of Transplantation | 2017
Jonathan Trent Magruder; Todd C. Crawford; Joshua C. Grimm; Bo S. Kim; Ashish S. Shah; Errol L. Bush; Robert S.D. Higgins; Christian A. Merlo
Risk factors for non–skin cancer de novo malignancy (DNM) after lung transplantation have yet to be identified. We queried the United Network for Organ Sharing database for all adult lung transplant patients between 1989 and 2012. Standardized incidence ratios (SIRs) were computed by comparing the data to Surveillance, Epidemiology, and End Results Program data after excluding skin squamous/basal cell carcinomas. We identified 18 093 adult lung transplant patients; median follow‐up time was 1086 days (interquartile range 436–2070). DNMs occurred in 1306 patients, with incidences of 1.4%, 4.6%, and 7.9% at 1, 3, and 5 years, respectively. The overall cancer incidence was elevated compared with that of the general US population (SIR 3.26, 95% confidence interval [CI]: 2.95–3.60). The most common cancer types were lung cancer (26.2% of all malignancies, SIR 6.49, 95% CI: 5.04–8.45) and lymphoproliferative disease (20.0%, SIR 14.14, 95% CI: 9.45–22.04). Predictors of DNM following lung transplantation were age (hazard ratio [HR] 1.03, 95% CI: 1.02–1.05, p < 0.001), male gender (HR 1.20, 95% CI: 1.02–1.42, p = 0.03), disease etiology (not cystic fibrosis, idiopathic pulmonary fibrosis or interstitial lung disease, HR 0.59, 95% CI 0.37–0.97, p = 0.04) and single‐lung transplantation (HR 1.64, 95% CI: 1.34–2.01, p < 0.001). Significant interactions between donor or recipient smoking and single‐lung transplantation were noted. On multivariable survival analysis, DNMs were associated with an increased risk of mortality (HR 1.44, 95% CI: 1.10–1.88, p = 0.009).
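The standardized incidence ratios reported above are ratios of observed to expected cancer counts. A minimal sketch of an SIR with an approximate 95% confidence interval, using Byar's Poisson approximation (the expected count of 400 is an illustrative assumption, and the paper's own intervals may come from a different method, so the limits below need not match its reported CI):

```python
import math

def sir_with_ci(observed: int, expected: float, z: float = 1.96):
    """Standardized incidence ratio (observed/expected) with Byar's
    approximation to the exact Poisson confidence limits on the observed count."""
    sir = observed / expected
    lo = observed * (1 - 1 / (9 * observed) - z / (3 * math.sqrt(observed))) ** 3 / expected
    hi = (observed + 1) * (1 - 1 / (9 * (observed + 1)) + z / (3 * math.sqrt(observed + 1))) ** 3 / expected
    return sir, lo, hi

# Illustrative: 1306 observed de novo malignancies vs a hypothetical ~400 expected
sir, lo, hi = sir_with_ci(1306, 400.0)
print(f"SIR {sir:.2f} (approx. 95% CI {lo:.2f}-{hi:.2f})")
```

With large observed counts like this, the interval is tight around the point estimate; the approximation matters more for rare cancer subtypes with few observed events.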
Vascular Medicine | 2016
Todd C. Crawford; Robert J. Beaulieu; Bryan A. Ehlert; Elizabeth V Ratchford; James H. Black
Aortic dissection remains a challenging clinical scenario, especially when complicated by peripheral malperfusion. Improvements in medical imaging have furthered understanding of the pathophysiology of malperfusion events in association with aortic dissection, including the elucidation of different mechanisms of branch vessel obstruction. Despite these advances, malperfusion syndrome remains a deadly entity with significant mortality. This review presents the latest knowledge regarding the pathogenesis of aortic dissection complicated by malperfusion syndrome, and discusses the diagnostic and therapeutic guidelines for management of this vicious entity.
Seminars in Thoracic and Cardiovascular Surgery | 2016
Todd C. Crawford; Jonathan Trent Magruder; Joshua C. Grimm; Christopher M. Sciortino; John V. Conte; Bo S. Kim; Robert S.D. Higgins; Duke E. Cameron; Marc Sussman; Glenn J. Whitman
Shorter intubation periods after cardiac surgery are associated with decreased morbidity and mortality. Although the Society of Thoracic Surgeons uses a 6-hour benchmark for early extubation, the time threshold above which complications increase is unknown. Using an institutional Society of Thoracic Surgeons database, we identified 3007 adult patients who underwent 1 of 7 index cardiac operations from 2010-2014. Patients were stratified by time to extubation after surgery: 0-6, 6-9, 9-12, and 12-18 hours. Aggregate outcomes were compared among time-to-extubation cohorts. Primary outcomes included operative mortality and a composite of major postoperative complications; secondary outcomes included prolonged postoperative hospital length of stay (PLOS) (>14 days) and reintubation. Multivariable logistic regression analysis was used to control for case mix. In results, the percentage of patients extubated in each time cohort was 36.4% for hours 0-6, 25.6% for hours 6-9, 12.5% for hours 9-12, and 10.5% for hours 12-18. Patients extubated in hours 12-18 vs <12 experienced a significantly higher risk of operative mortality (odds ratio = 2.7, 95% CI: 1.0-7.5, P = 0.05) and the composite complication outcome (odds ratio = 3.6, 95% CI: 2.2-6.1, P < 0.01); however, no significant differences were observed in those extubated in hours 6-9 vs 0-6 or in hours 9-12 vs 0-9. An identical trend was observed for our secondary outcomes of PLOS and reintubation. In conclusion, our results indicate that the risks of operative mortality, major morbidity, and PLOS do not significantly increase until the time interval to extubation exceeds 12 hours. Cardiac surgery programs should be evaluated on their ability to extubate patients within this time interval.
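The odds ratios above come from multivariable logistic regression with case-mix adjustment. For intuition on what a single unadjusted odds ratio represents, a minimal sketch from a 2x2 table with a Woolf (log-based) 95% confidence interval; the cell counts below are hypothetical, chosen only for illustration, and do not reproduce the study's adjusted estimates:

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with Woolf's log-based CI.
    a/b: events/non-events in the exposed group; c/d: in the reference group."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical: 12 deaths/188 survivors with extubation at 12-18 h
# vs 30 deaths/970 survivors with extubation at <12 h
or_, lo, hi = odds_ratio_ci(12, 188, 30, 970)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A lower confidence limit hovering near 1.0, as in the study's mortality estimate (95% CI 1.0-7.5), is what produces a borderline P value of about 0.05.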
The Annals of Thoracic Surgery | 2018
Jonathan C. Hong; Manoj K. Saraswat; Trevor A. Ellison; J. Trent Magruder; Todd C. Crawford; Julia M. Gardner; William V. Padula; Glenn J. Whitman
BACKGROUND Cardiac surgery patients colonized with Staphylococcus aureus have a greater risk of surgical site infection (SSI). The purpose of this study was to evaluate the cost-effectiveness of decolonization strategies to prevent SSIs. METHODS We compared three decolonization strategies: universal decolonization (UD), in which all subjects are treated; targeted decolonization (TD), in which only S aureus carriers are treated; and no decolonization (ND). Decolonization included mupirocin, chlorhexidine, and vancomycin. We implemented a decision tree comparing the costs and quality-adjusted life-years (QALYs) of these strategies on SSI over a 1-year period for subjects undergoing coronary artery bypass graft surgery from a US health sector perspective. Deterministic and probabilistic sensitivity analyses were conducted to address the uncertainty in the variables. RESULTS Universal decolonization was the dominant strategy because it resulted in reduced costs at near-equal QALYs compared with TD and ND. Compared with ND, UD decreased costs by $462 and increased QALYs by 0.002 per subject, whereas TD decreased costs by
Current Treatment Options in Cardiovascular Medicine | 2017
Jonathan C. Hong; Todd C. Crawford; Harikrishna Tandri; Kaushik Mandal