Catrina Cropano
Harvard University
Publications
Featured research published by Catrina Cropano.
Journal of Parenteral and Enteral Nutrition | 2016
D. Dante Yeh; Eva Fuentes; Sadeq A. Quraishi; Catrina Cropano; Haytham M.A. Kaafarani; Jarone Lee; David R. King; Marc DeMoya; Peter J. Fagenholz; Kathryn L. Butler; Yuchiao Chang; George C. Velmahos
BACKGROUND Macronutrient deficit in the surgical intensive care unit (ICU) is associated with worse in-hospital outcomes. We hypothesized that increased caloric and protein deficit is also associated with a lower likelihood of discharge to home vs transfer to a rehabilitation or skilled nursing facility. MATERIALS AND METHODS Adult surgical ICU patients receiving >72 hours of enteral nutrition (EN) between March 2012 and May 2014 were included. Patients with absolute contraindications to EN, <72-hour ICU stay, moribund state, EN prior to surgical ICU admission, or previous ICU admission within the same hospital stay were excluded. Subjects were dichotomized by cumulative caloric (<6000 vs ≥6000 kcal) and protein deficit (<300 vs ≥300 g). Baseline characteristics and outcomes were compared using Wilcoxon rank and χ² tests. To test the association of macronutrient deficit with discharge destination (home vs other), we performed a logistic regression analysis, controlling for plausible confounders. RESULTS In total, 213 individuals were included. Nineteen percent in the low-caloric deficit group were discharged home compared with 6% in the high-caloric deficit group (P = .02). Age, body mass index (BMI), Acute Physiology and Chronic Health Evaluation II (APACHE II), and initiation of EN were not significantly different between groups. On logistic regression, adjusting for BMI and APACHE II score, the high-caloric and protein-deficit groups were less likely to be discharged home (odds ratio [OR], 0.28; 95% confidence interval [CI], 0.08-0.96; P = .04 and OR, 0.29; 95% CI, 0.0-0.89, P = .03, respectively). CONCLUSIONS In surgical ICU patients, inadequate macronutrient delivery is associated with lower rates of discharge to home. Improved nutrition delivery may lead to better clinical outcomes after critical illness.
Journal of Trauma-injury Infection and Critical Care | 2014
Gwendolyn M. van der Wilden; Yuchiao Chang; Catrina Cropano; Melanie Subramanian; Inger B. Schipper; D. Dante Yeh; David R. King; Marc de Moya; Peter J. Fagenholz; George C. Velmahos
BACKGROUND Of the patients with a Clostridium difficile infection, 2% to 8% will progress to fulminant C. difficile colitis (fCDC), which carries high morbidity and mortality. No system exists to rapidly identify patients at risk for developing fCDC and possibly in need of surgical intervention. Our aim was to design a simple and accurate risk scoring system (RSS) for daily clinical practice. METHODS We prospectively enrolled all patients diagnosed with a C. difficile infection and compared patients with and without fCDC. An expert panel, combined with data derived from previous studies, identified four risk factors, and a multivariable logistic regression model was performed to determine their effect in predicting fCDC. The RSS was created based on the predictive power of each factor, and calibration, discrimination, and test characteristics were subsequently determined. In addition, the RSS was compared with a previously proposed severity scoring system. RESULTS A total of 746 patients diagnosed with C. difficile infection were enrolled between November 2010 and October 2012. Based on the log (odds ratio) of each risk factor, age greater than 70 years was assigned 2 points, white blood cell count equal to or greater than 20,000/μL or equal to or less than 2,000/μL was assigned 1 point, cardiorespiratory failure was assigned 7 points, and diffuse abdominal tenderness on physical examination was assigned 6 points. With the use of this system, the discriminatory value of the RSS (c statistic) was 0.98 (95% confidence interval, 0.96–1). The Hosmer-Lemeshow goodness-of-fit test showed a p value of 0.78, and the Brier score was 0.019. A value of 6 points was determined to be the threshold for reliably dividing low-risk (<6) from high-risk (≥6) patients. CONCLUSION The RSS is a valid and reliable tool to identify at the bedside patients who are at risk for developing fCDC. External validation is needed before widespread implementation.
LEVEL OF EVIDENCE Prognostic study, level II.
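The point assignments reported in this abstract translate directly into a bedside computation. A minimal sketch in Python, assuming white blood cell count is supplied in cells/μL and the two clinical findings as booleans (the function and parameter names are illustrative, not from the study):

```python
def fcdc_risk_score(age, wbc_per_uL, cardiorespiratory_failure, diffuse_abdominal_tenderness):
    """Risk scoring system (RSS) for fulminant C. difficile colitis,
    using the point weights reported in the abstract above.
    Returns (score, risk category); scores >= 6 are high risk."""
    score = 0
    if age > 70:                                        # age > 70 years: 2 points
        score += 2
    if wbc_per_uL >= 20_000 or wbc_per_uL <= 2_000:     # WBC >= 20,000/uL or <= 2,000/uL: 1 point
        score += 1
    if cardiorespiratory_failure:                       # cardiorespiratory failure: 7 points
        score += 7
    if diffuse_abdominal_tenderness:                    # diffuse abdominal tenderness: 6 points
        score += 6
    return score, ("high risk" if score >= 6 else "low risk")
```

For example, a 75-year-old with a WBC of 25,000/μL but neither clinical finding scores 3 points (low risk), while diffuse abdominal tenderness alone already scores 6 points (high risk).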
Journal of Trauma-injury Infection and Critical Care | 2015
Haytham M.A. Kaafarani; Jarone Lee; Catrina Cropano; Yuchiao Chang; Toby Raybould; Eric Klein; Alice Gervasini; Laurie Petrovick; Chris DePesa; Carlos A. Camargo; George C. Velmahos; Peter T. Masiakos
BACKGROUND Graduated driving licensing (GDL) programs phase in driving privileges for teenagers. We aimed to evaluate the effect of the 2007 GDL law on the incidence of total motor vehicle crashes (tMVCs) and fatal motor vehicle crashes (fMVCs) among teenagers in Massachusetts. METHODS The Fatality Analysis Reporting System, the Missouri Census Data Center, and the Massachusetts Department of Transportation databases were all used to create and compare the incidence of tMVCs and fMVCs before (2002–2006) and after (2007–2011) the law enactment. The following three driver age groups were studied: 16 years to 17 years (evaluating the law effect), 18 years to 20 years (evaluating the sustainability of the effect), and 25 years to 29 years (control group). As a sensitivity analysis, we compared the incidence rates per population and per licenses issued. RESULTS tMVCs decreased following the law for all three age groups (16–17 years, from 7.6 to 4.8 per 1,000 people, p < 0.0001; 18–20 years, from 8.5 to 6.4 per 1,000 people, p < 0.0001; 25–29 years, from 6.2 to 5.2 per 1,000 people, p < 0.0001), but the percentage decrease in tMVC rates was less in the control group (37%, 25%, and 15%, respectively; p < 0.0001 for both comparisons with the control group). The rates of fMVC also decreased in the age groups of 16 years to 17 years (from 14.0 to 8.6 per 100,000 people, p = 0.0006), 18 years to 20 years (from 21.2 to 13.7 per 100,000 people, p < 0.0001), and 25 years to 29 years (from 14.4 to 11.0 per 100,000 people, p < 0.0001). All of these results were confirmed in the sensitivity analyses. CONCLUSION The 2007 Massachusetts GDL was associated with a decreased incidence of teenager tMVCs and fMVCs, and the effect was sustainable. This study provides further support to develop, implement, enforce, and maintain GDL programs aimed at preventing MVCs and their related mortality in the young novice driver population. LEVEL OF EVIDENCE Epidemiologic/prognostic study, level III.
Journal of Trauma-injury Infection and Critical Care | 2013
Ali Y. Mejaddam; Catrina Cropano; Sanjeeva P. Kalva; T. Gregory Walker; Ayesha M. Imam; George C. Velmahos; Marc de Moya; David R. King
BACKGROUND Therapeutic angioembolization is a relatively new “rescue treatment” modality for gastrointestinal hemorrhage (GIH) for unstable patients who fail primary treatment approaches; however, the effectiveness of this treatment and the incidence of ischemic necrosis following embolization for acute GIH are poorly described. The purpose of this study was to evaluate the effectiveness and safety of “rescue” transcatheter superselective angioembolization (SSAE) for the treatment of hemodynamically unstable patients with GIH. METHODS A 10-year retrospective review of all hemodynamically unstable patients (systolic blood pressure < 90 mm Hg and ongoing transfusion requirement) who underwent “rescue” SSAE for GIH after failed endoscopic management was performed. All patients with evidence of active contrast extravasation were included. Data were collected on demographics, comorbidities, clinical presentation, and type of intravascular angioembolic agent used. Outcomes included technical success (cessation of extravasation), clinical success (no rebleeding requiring intervention within 30 days), and incidence of ischemic complications. RESULTS Ninety-eight patients underwent SSAE for GIH during the study period; 47 were excluded owing to lack of active contrast extravasation. Of the remaining 51 patients, 22 (43%) presented with a lower GIH and 29 (57%) with upper GIH. The majority underwent embolization with a permanent agent (71%), while the remaining patients received either a temporary agent (16%) or a combination (14%). The overall technical and clinical success rates were 98% and 71%, respectively. Of the 14 patients with technical success but clinical failure (rebleeding within 30 days) and the 1 patient with technical failure, 4 were managed successfully with reembolization, while 2 underwent successful endoscopic therapy, and 9 had surgical resections. Only one patient had an ischemic complication (small bowel necrosis) requiring resection. 
CONCLUSION SSAE, with reembolization if necessary, is an effective rescue treatment modality for hemodynamically unstable patients with active GIH. Of the patients, 20% will fail SSAE and require additional intervention. Ischemic complications are extremely rare. LEVEL OF EVIDENCE Therapeutic study, level IV.
Nutrition in Clinical Practice | 2017
D. Dante Yeh; Catrina Cropano; Sadeq A. Quraishi; Eva Fuentes; Haytham M.A. Kaafarani; Jarone Lee; Yuchiao Chang; George C. Velmahos
Background: Macronutrient deficiency in critical illness is associated with worse outcomes. We hypothesized that an aggressive enteral nutrition (EN) protocol would result in higher macronutrient delivery and fewer late infections. Methods: We enrolled adult surgical intensive care unit (ICU) patients receiving >72 hours of EN from July 2012 to June 2014. Our intervention consisted of increasing protein prescription (2.0–2.5 vs 1.5–2.0 g/kg/d) and compensatory feeds for EN interruption. We compared the intervention group with historical controls. To test the association of the aggressive EN protocol with the risk of late infections (defined as occurring >96 hours after ICU admission), we performed a Poisson regression analysis, while controlling for age, sex, body mass index (BMI), Acute Physiology and Chronic Health Evaluation II (APACHE II) score, and exposure to gastrointestinal surgery. Results: The study cohort comprised 213 patients, who were divided into the intervention group (n = 119) and the historical control group (n = 94). There was no difference in age, sex, BMI, admission category, or Injury Severity Score between the groups. Mean APACHE II score was higher in the intervention group (17 ± 8 vs 14 ± 6, P = .002). The intervention group received more calories (19 ± 5 vs 17 ± 6 kcal/kg/d, P = .005) and protein (1.2 ± 0.4 vs 0.8 ± 0.3 g/kg/d, P < .001), had a higher percentage of prescribed calories (77% vs 68%, P < .001) and protein (93% vs 64%, P < .001), and accumulated a lower overall protein deficit (123 ± 282 vs 297 ± 233 g, P < .001). On logistic regression, the intervention group had fewer late infections (adjusted odds ratio, 0.34; 95% confidence interval, 0.14–0.83). Conclusions: In surgical ICU patients, implementation of an aggressive EN protocol resulted in greater macronutrient delivery and fewer late infections.
Journal of Trauma-injury Infection and Critical Care | 2015
D. Dante Yeh; Catrina Cropano; Peter J. Fagenholz; David R. King; Yuchiao Chang; Eric Klein; Marc DeMoya; Haytham M.A. Kaafarani; George C. Velmahos
BACKGROUND Gangrenous cholecystitis (GC) is difficult to diagnose preoperatively in the patient with suspected acute cholecystitis. We sought to characterize preoperative risk factors and postoperative complications. METHODS Pathology reports of all patients undergoing cholecystectomy for suspected acute cholecystitis from June 2010 to January 2014 and admitted through the emergency department were examined. Patients with GC were compared with those with acute/chronic cholecystitis (AC/CC). Data collected included demographics, preoperative signs and symptoms, radiologic studies, operative details, and clinical outcomes. RESULTS Thirty-eight cases of GC were identified and compared with 171 cases of AC/CC. Compared with AC/CC, GC patients were more likely to be older (57 years vs. 41 years, p < 0.001), of male sex (63% vs. 31%, p < 0.001), hypertensive (47% vs. 22%, p = 0.002), hyperlipidemic (29% vs. 14%, p = 0.026), and diabetic (24% vs. 8%, p = 0.006). GC patients were more likely to have a fever (29% vs. 12%, p = 0.007) and less likely to have nausea/vomiting (61% vs. 80%, p = 0.019) or an impacted gallstone on ultrasound (US) (8% vs. 26%, p = 0.017). Otherwise, there was no significant difference in clinical or US findings. Among GC patients, US findings were absent (8%, n = 3) or minimal (42%, n = 16). Median time from emergency department registration to US (3.3 hours vs. 2.8 hours, p = 0.28) was similar, but US to operation was longer (41.2 hours vs. 18.4 hours, p < 0.001), conversion to open cholecystectomy was more common (37% vs. 10%, p < 0.001), and hospital stay was longer (median, 4 days vs. 2 days, p < 0.0001). Delay in surgical consultation occurred in 16% of GC patients compared with 1% of AC/CC patients (p < 0.001). CONCLUSION Demographic features may be predictive of GC. Absent or minimal US signs occur in 50%, and delay in surgical consultation is common. Postoperative morbidity is greater for patients with GC compared with those with AC/CC.
LEVEL OF EVIDENCE Epidemiologic study, level III; therapeutic study, level IV.
Journal of Emergencies, Trauma, and Shock | 2015
D. Dante Yeh; Gwendolyn M. van der Wilden; Catrina Cropano; Yuchiao Chang; David R. King; Marc de Moya; Peter J. Fagenholz; Haytham M.A. Kaafarani; Jarone Lee; George C. Velmahos
Background: Excessive crystalloid administration is common and associated with negative outcomes in critically ill trauma patients. Continuous furosemide infusion (CFI) to remove excessive fluid has not been previously described in this population. We hypothesized that a goal-directed CFI is more effective for fluid removal than intermittent bolus injection (IBI) diuresis without excess incidence of hypokalemia or renal failure. Materials and Methods: CFI cases were prospectively enrolled between November 2011 and August 2012, and matched to historic IBI controls by age, gender, Injury Severity Score (ISS), and net fluid balance (NFB) at diuresis initiation. Paired and unpaired analyses were performed to compare groups. The primary endpoints were net fluid balance, potassium and creatinine levels. Secondary endpoints included intensive care unit (ICU) and hospital length of stay (LOS), ventilator-free days (VFD), and mortality. Results: 55 patients were included, with 19 cases and 36 matched controls. Mean age was 54 years, mean ISS was 32.7, and mean initial NFB was +7.7 L. After one day of diuresis with CFI vs. IBI, net 24 h fluid balance was negative (−0.55 L vs. +0.43 L, P = 0.026) only for the CFI group, and there was no difference in potassium and creatinine levels. Cumulative furosemide dose (59.4 mg vs. 25.4 mg, P < 0.001) and urine output (4.2 L vs. 2.8 L, P < 0.001) were also significantly increased with CFI vs. IBI. There were no statistically significant differences in ICU LOS, hospital LOS, VFD, or mortality. Conclusions: Compared to IBI, goal-directed diuresis by CFI is more successful in achieving net negative fluid balance in patients with fluid overload with no detrimental side effects on renal function or patient outcome.
Nutrition in Clinical Practice | 2016
Suzan Dijkink; Eva Fuentes; Sadeq A. Quraishi; Catrina Cropano; Haytham M.A. Kaafarani; Jarone Lee; David R. King; Marc DeMoya; Peter J. Fagenholz; Kathryn L. Butler; George C. Velmahos; D. Dante Yeh
BACKGROUND Calorie/protein deficit in the surgical intensive care unit (SICU) is associated with worse clinical outcomes. It is customary to initiate enteral nutrition (EN) at a low rate and increase to goal (RAMP-UP). Increasing evidence suggests that RAMP-UP may contribute to iatrogenic malnutrition. We sought to determine what proportion of total SICU calorie/protein deficit is attributable to RAMP-UP. MATERIALS AND METHODS This is a retrospective study of a prospectively collected registry of adult patients (N = 109) receiving at least 72 hours of EN in the SICU according to the RAMP-UP protocol (July 2012-June 2014). Subjects receiving only trophic feeds or with interrupted EN during RAMP-UP were excluded. Deficits were defined as the amount of prescribed calories/protein minus the actual amount received. RAMP-UP deficit was defined as the deficit between EN initiation and arrival at goal rate. Data included demographics, nutritional prescription/delivery, and outcomes. RESULTS EN was started at a median of 34.0 hours (interquartile range [IQR], 16.5-53.5) after ICU admission, with a mean duration of 8.7 ± 4.3 days. The median total caloric deficit was 2185 kcal (249-4730), with 900 kcal (551-1562) attributable to RAMP-UP (41%). The protein deficit was 98.5 g (27.5-250.4), with 51.9 g (20.6-83.3) caused by RAMP-UP (53%). CONCLUSIONS In SICU patients initiating EN, the RAMP-UP period accounted for 41% and 53% of the overall caloric and protein deficits, respectively. Starting EN immediately at goal rate may eliminate a significant proportion of macronutrient deficit in the SICU.
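The abstract's deficit definitions (prescribed minus received, with the RAMP-UP portion being the deficit accrued before delivery first reaches the goal rate) reduce to a simple accumulation. A minimal sketch under those definitions; the hourly-rate inputs and names are illustrative and not taken from the study:

```python
def caloric_deficits(goal_kcal_per_h, delivered_kcal_per_h):
    """Split the cumulative caloric deficit (prescribed minus received)
    into the portion accrued during RAMP-UP (before the delivered rate
    first reaches goal) and the overall total.
    delivered_kcal_per_h: sequence of hourly delivered rates.
    Returns (total_deficit, ramp_up_deficit) in kcal."""
    total_deficit = 0.0
    ramp_up_deficit = 0.0
    reached_goal = False
    for delivered in delivered_kcal_per_h:
        if delivered >= goal_kcal_per_h:
            reached_goal = True  # RAMP-UP ends the first hour goal rate is met
        hourly_gap = max(goal_kcal_per_h - delivered, 0.0)
        total_deficit += hourly_gap
        if not reached_goal:
            ramp_up_deficit += hourly_gap
    return total_deficit, ramp_up_deficit
```

For a goal of 80 kcal/h with hourly delivery of 20, 40, 80, then 60 kcal/h, the total deficit is 120 kcal, of which 100 kcal accrued during RAMP-UP; the post-goal shortfall does not count toward the RAMP-UP portion.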
American Journal of Emergency Medicine | 2018
D. Dante Yeh; Yuchiao Chang; Maryam Bita Tabrizi; Liyang Yu; Catrina Cropano; Peter J. Fagenholz; David R. King; Haytham M.A. Kaafarani; Marc de Moya; George C. Velmahos
Objective: We sought to develop a practical Bedside Score for the diagnosis of cholecystitis and test its accuracy against the Tokyo Guidelines (TG13). Methods: We conducted a retrospective study of 438 patients evaluated for right upper quadrant (RUQ) pain in an urban, academic Emergency Department (ED). Symptoms, physical signs, ultrasound signs, and labs were scoring system candidates. A random split‐sample approach was used to develop and validate a new clinical score. Multivariable regression analysis using development data was conducted to identify predictors of cholecystitis. Cutoff values were chosen to ensure positive/negative predictive values (PPV, NPV) of at least 0.95. The score was externally validated in 80 patients at a different hospital undergoing RUQ pain evaluation. Results: 230 patients (53%) had cholecystitis. Five variables predicted cholecystitis and were included in the score: gallstones, gallbladder thickening, clinical or ultrasonographic Murphy's sign, RUQ tenderness, and post‐prandial symptoms. A clinical prediction score was developed. When dichotomized at 4, overall accuracy for acute cholecystitis was 90% for the development cohort, and 82% and 86% for the internal and external validation cohorts; TG13 accuracy was 62%–79%. Conclusions: A clinical prediction score for cholecystitis demonstrates accuracy equivalent to TG13. Use of this score may streamline work‐up by decreasing the need for comprehensive ultrasound evaluation and CRP measurement and may shorten ED length of stay.
Journal of Special Operations Medicine | 2013
Ali Y. Mejaddam; Gwendolyn M. van der Wilden; Yuchiao Chang; Catrina Cropano; Antonis Sideris; John O. Hwabejire; George C. Velmahos; H.B. Alam; Marc de Moya; David R. King