Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Stephen D. Surgenor is active.

Publication


Featured research published by Stephen D. Surgenor.


Pediatric Anesthesia | 2000

Emergence agitation in paediatric patients after sevoflurane anaesthesia and no surgery: a comparison with halothane

Joseph P. Cravero; Stephen D. Surgenor; Kate Whalen

This study was designed to compare the emergence characteristics of sevoflurane with halothane anaesthesia in paediatric patients having no surgical intervention. We randomized 32 ASA I or II paediatric outpatients scheduled for magnetic resonance imaging scans to receive either halothane or sevoflurane anaesthesia. The primary outcome measure was the percentage of patients with emergence agitation, as defined by two different criteria. Time to discharge from the postanaesthesia care unit (PACU) and the secondary recovery unit (SRU) were compared. Sevoflurane patients had a greater incidence of emergence delirium when a high threshold for agitation was defined (33% vs. 0%, P = 0.010) and a lower threshold for agitation was applied (80% vs. 12%, P < 0.0001). Discharge times from the PACU and the SRU were not different. We conclude that there is an increased incidence of emergence agitation with sevoflurane anaesthesia compared to halothane independent of any painful stimulus.


Anesthesia & Analgesia | 2009

The association of perioperative red blood cell transfusions and decreased long-term survival after cardiac surgery.

Stephen D. Surgenor; Robert S. Kramer; Elaine M. Olmstead; Cathy S. Ross; Frank W. Sellke; Donald S. Likosky; Charles A. S. Marrin; Robert E. Helm; Bruce J. Leavitt; Jeremy R. Morton; David C. Charlesworth; Robert A. Clough; Felix Hernandez; Carmine Frumiento; Arnold Benak

BACKGROUND: Exposure to red blood cell (RBC) transfusions has been associated with increased mortality after cardiac surgery. We examined long-term survival for cardiac surgical patients who received one or two RBC units during the index hospitalization. METHODS: Nine thousand seventy-nine consecutive patients undergoing coronary artery bypass graft, valve, or coronary artery bypass graft/valve surgery at eight centers in northern New England during 2001-2004 were examined after exclusions. A probabilistic match between the regional registry and the Social Security Administration’s Death Master File determined mortality through June 30, 2006. Cox proportional hazards and propensity methods were used to calculate adjusted hazard ratios. RESULTS: Thirty-six percent of patients (n = 3254) were exposed to one or two RBC units. Forty-three percent of RBCs were given intraoperatively, 56% postoperatively, and 1% preoperatively. Transfused patients were more likely to be anemic, older, smaller, and female, and to have more comorbid illness. Survival was significantly decreased for all patients exposed to 1 or 2 U of RBCs during hospitalization for cardiac surgery compared with those who received none (P < 0.001). After adjustment for patient and disease characteristics, patients exposed to 1 or 2 U of RBCs had a 16% higher long-term mortality risk (adjusted hazard ratio = 1.16, 95% CI: 1.01-1.34, P = 0.035). CONCLUSIONS: Exposure to 1 or 2 U of RBCs was associated with a 16% increase in the long-term mortality hazard after cardiac surgery.


Circulation | 2006

Intraoperative Red Blood Cell Transfusion During Coronary Artery Bypass Graft Surgery Increases the Risk of Postoperative Low-Output Heart Failure

Stephen D. Surgenor; Gordon R. DeFoe; Mary P. Fillinger; Donald S. Likosky; Robert C. Groom; Cantwell Clark; Robert E. Helm; Robert S. Kramer; Bruce J. Leavitt; John D. Klemperer; Charles F Krumholz; Benjamin M. Westbrook; Dean J. Galatis; Carmine Frumiento; Cathy S. Ross; Elaine M. Olmstead; Gerald T. O'Connor

Background— Hemodilutional anemia during cardiopulmonary bypass (CPB) is associated with increased mortality during coronary artery bypass graft (CABG) surgery. The impact of intraoperative red blood cell (RBC) transfusion to treat anemia during surgery is less well understood. We examined the relationship between anemia during CPB, RBC transfusion, and risk of low-output heart failure (LOF). Methods and Results— Data were collected on 8004 isolated CABG patients in northern New England between 1996 and 2004. Patients were excluded if they experienced postoperative bleeding or received ≥3 units of transfused RBCs. LOF was defined as the need for an intraoperative or postoperative intra-aortic balloon pump, return to CPB, or ≥2 inotropes at 48 hours. A lower nadir hematocrit was associated with an increased risk of developing LOF (adjusted odds ratio, 0.90; 95% CI, 0.82 to 0.92; P=0.016), and that risk was further increased when patients received an RBC transfusion. When adjusted for nadir hematocrit, exposure to RBC transfusion was a significant, independent predictor of LOF (adjusted odds ratio, 1.27; 95% CI, 1.00 to 1.61; P=0.047). Conclusions— In this study, exposure to both hemodilutional anemia and RBC transfusion during surgery was associated with an increased risk of LOF after CABG. The risk of LOF was greater among patients exposed to intraoperative RBCs than among those exposed to anemia alone.


Anesthesia & Analgesia | 2011

Hand contamination of anesthesia providers is an important risk factor for intraoperative bacterial transmission.

Randy W. Loftus; Matthew K. Muffly; Jeremiah R. Brown; Michael L. Beach; Matthew D. Koff; Howard L. Corwin; Stephen D. Surgenor; Kathryn B. Kirkland; Mark P. Yeager

BACKGROUND: We have recently shown that intraoperative bacterial transmission to patient IV stopcock sets is associated with increased patient mortality. In this study, we hypothesized that bacterial contamination of anesthesia provider hands before patient contact is a risk factor for direct intraoperative bacterial transmission. METHODS: Dartmouth–Hitchcock Medical Center is a tertiary care and level 1 trauma center with 400 inpatient beds and 28 operating suites. The first and second operative cases in each of 92 operating rooms were randomly selected for analysis. Ten pairs of cases were excluded because of sampling-protocol breaks or lost samples, leaving 82 paired samples for analysis. We identified cases of intraoperative bacterial transmission to the patient IV stopcock set and the anesthesia environment (adjustable pressure-limiting valve and agent dial) in each operating room pair by using a previously validated protocol. We then used biotype analysis to compare these transmitted organisms to organisms isolated from the hands of anesthesia providers before the start of each case. Provider-origin transmission was defined as potential pathogens isolated in the patient stopcock set or environment that had an identical biotype to the same organism isolated from the hands of providers. We also assessed the efficacy of the current intraoperative cleaning protocol by evaluating potential pathogens identified at the start of case 2. Poor intraoperative cleaning was defined as 1 or more potential pathogens found in the anesthesia environment at the start of case 2 that were not there at the beginning of case 1. We collected clinical and epidemiological data on all cases to identify risk factors for contamination. RESULTS: One hundred sixty-four cases (82 case pairs) were studied. We identified intraoperative bacterial transmission to the IV stopcock set in 11.5% (19/164) of cases, 47% (9/19) of which were of provider origin.
We identified intraoperative bacterial transmission to the anesthesia environment in 89% (146/164) of cases, 12% (17/146) of which were of provider origin. The number of rooms that an attending anesthesiologist supervised simultaneously, the age of the patient, and patient discharge from the operating room to an intensive care unit were independent predictors of bacterial transmission events not directly linked to providers. CONCLUSION: The contaminated hands of anesthesia providers serve as a significant source of patient environmental and stopcock set contamination in the operating room. Additional sources of intraoperative bacterial transmission, including postoperative environmental cleaning practices, should be further studied.


Circulation | 2006

Long-Term Survival of Patients With Chronic Obstructive Pulmonary Disease Undergoing Coronary Artery Bypass Surgery

Bruce J. Leavitt; Cathy S. Ross; Brian Spence; Stephen D. Surgenor; Elaine M. Olmstead; Robert A. Clough; David C. Charlesworth; Robert S. Kramer; Gerald T. O’Connor

Background— Chronic obstructive pulmonary disease (COPD) is associated with increased in-hospital mortality in patients undergoing coronary artery bypass surgery (CABG). Long-term survival is less well understood. The present study examined the effect of COPD on survival after CABG. Methods and Results— We conducted a prospective study of 33 137 consecutive isolated CABG patients between 1992 and 2001 in northern New England. Records were linked to the National Death Index for long-term mortality data. Cox proportional hazards regression was used to calculate hazard ratios (HRs). Patients were stratified into four groups: no comorbidities (none), COPD alone, COPD plus comorbidities, and other comorbidities with no COPD. There were 131 434 person-years of follow-up and 5344 deaths. The overall incidence rate (deaths per 100 person-years) was 4.1. By group, rates were 2.1 (none), 4.0 (COPD alone), 5.5 (other), and 9.4 (COPD plus comorbidities; log-rank P<0.001). After adjustment, survival with COPD alone was worse compared with none (HR, 1.8; 95% CI, 1.6 to 2.1; P<0.001). Patients with other comorbidities compared with none had even worse survival (HR, 2.2; 95% CI, 2.1 to 2.4; P<0.001). Patients with COPD plus other comorbidities compared with none had the worst long-term survival (HR, 3.6; 95% CI, 3.3 to 3.9; P<0.001). Conclusions— Patients with only COPD had significantly reduced long-term survival compared with patients with no comorbidities. Patients with COPD and ≥1 other comorbidity had the worst survival of all groups.
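As a quick arithmetic check, the overall incidence rate quoted above follows directly from the abstract's totals. A minimal sketch (the per-group person-year denominators are not reported, so only the overall rate can be reproduced):

```python
# Overall incidence rate from the COPD survival study abstract:
# 5344 deaths over 131 434 person-years of follow-up.
deaths = 5344
person_years = 131_434

rate_per_100_py = deaths / person_years * 100  # deaths per 100 person-years
print(f"{rate_per_100_py:.1f} deaths per 100 person-years")  # 4.1
```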


Journal of Clinical Monitoring and Computing | 1999

A graphical object display improves anesthesiologists' performance on a simulated diagnostic task.

George T. Blike; Stephen D. Surgenor; Kate Whalen

Objective. This study tests the hypothesis that a graphical object display (a data display consisting of meaningful shapes) will affect the ability of anesthesiologists to perform a diagnostic task rapidly and correctly. The diagnostic tasks studied were recognition and differentiation of five etiologies of shock – anaphylaxis, bradycardia, myocardial ischemia, hypovolemia, pulmonary embolus. Methods. Data sets consisting of HR, Systemic Arterial BP, Pulmonary Arterial BP, CVP, and Cardiac Output were generated for five shock states and five non-shock states. The resulting 10 data sets were presented on a computer monitor to study subjects twice (first in an alpha-numeric format and then in the object format) for a total of twenty decision screens. Subjects used soft-buttons on a computer touch-screen monitor to: a) advance to the next display; b) differentiate a non-shock state from a shock state; and, c) select the etiology of shock state represented by the display (Figure 2). Data collection was automatic, using the internal clock and memory of the computer. Results. Eleven anesthesiologists participated in this study. They completed a total of 3060 diagnostic decisions, half with each display format. Performance measures were time to decision and diagnostic accuracy. The object display improved no-shock recognition by 1.0 second and shock etiology determination by 1.4 seconds (p < 0.05). The object display also significantly improved accuracy for shock recognition by 1.4% and etiology determination by 4.1% (p < 0.05). Testing was completed in a time interval of <45 min per 10 trials. Conclusions. The primary finding of this study was that anesthesiologists using the object display format committed significantly fewer diagnostic errors when interpreting physiologic data. In addition, both the recognition of no-shock and the diagnosis of shock etiology were completed more rapidly when the object display was used.
The major limitation of this initial trial is the simplicity of the test. Future investigation of the impact of the display on clinical decision making will require more realistic clinical scenarios with partial or full simulation to better understand the potential clinical impact.


Critical Care Medicine | 2003

Transfusion practice in the critically ill.

Howard L. Corwin; Stephen D. Surgenor; Andrew Gettinger

Background: Anemia in the critically ill patient population is common. This anemia of critical illness is a distinct clinical entity characterized by blunted erythropoietin production and abnormalities in iron metabolism identical to what is commonly referred to as the anemia of chronic disease. Findings: As a result of this anemia, critically ill patients receive an extraordinarily large number of blood transfusions. Between 40% and 50% of all patients admitted to intensive care units receive at least one red blood cell unit, and the average is close to five red blood cell units during their intensive care unit stay. There is little evidence that “routine” transfusion of stored allogeneic red blood cells is beneficial for critically ill patients. Most critically ill patients can tolerate hemoglobin levels as low as 7 g/dL, so a more conservative approach to red blood cell transfusion is warranted. Conclusion: Practice strategies should be directed toward a reduction of blood loss (phlebotomy) and a decrease in the transfusion threshold in critically ill patients.


Anesthesia & Analgesia | 2001

Predicting the Risk of Death from Heart Failure After Coronary Artery Bypass Graft Surgery

Stephen D. Surgenor; Gerald T. O’Connor; Stephen J. Lahey; Reed D. Quinn; David C. Charlesworth; Lawrence J. Dacey; Robert A. Clough; Bruce J. Leavitt; Gordon R DeFoe; Mary P. Fillinger; William C. Nugent

Heart failure is the most common cause of death among coronary artery bypass graft (CABG) patients. In addition, most variation in observed mortality rates for CABG surgery is explained by fatal heart failure. The purpose of this study was to develop a clinical risk assessment tool so that clinicians can rapidly and easily assess the risk of fatal heart failure while caring for individual patients. Using prospective data for 8,641 CABG patients, we used logistic regression analysis to predict the risk of fatal heart failure. In multivariate analysis, female sex, prior CABG surgery, ejection fraction <40%, urgent or emergency surgery, advanced age (70–79 yr and >80 yr), peripheral vascular disease, diabetes, dialysis-dependent renal failure, and three-vessel coronary disease were significant predictors of fatal postoperative heart failure. A clinical risk assessment tool was developed from this logistic regression model, which had good discriminating characteristics (area under the receiver operating characteristic curve = 0.75; 95% confidence interval: 0.71, 0.78).
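The risk tool described above is derived from a logistic regression model. As a rough sketch of how such a model maps binary risk factors to a predicted probability of fatal heart failure, the snippet below applies the logistic function to a weighted sum of predictors; the coefficient values and intercept are hypothetical placeholders for illustration, not the values estimated in the study.

```python
import math

def predicted_risk(intercept: float, coefs: dict, patient: dict) -> float:
    """P(outcome) = 1 / (1 + exp(-(b0 + sum_i b_i * x_i))) for binary predictors x_i."""
    logit = intercept + sum(b * patient.get(name, 0) for name, b in coefs.items())
    return 1.0 / (1.0 + math.exp(-logit))

# HYPOTHETICAL log-odds coefficients for a few of the predictors named above.
coefs = {
    "female_sex": 0.4,
    "prior_cabg": 0.9,
    "ef_below_40": 0.8,
    "urgent_or_emergent": 1.1,
}
patient = {"female_sex": 1, "ef_below_40": 1}
risk = predicted_risk(intercept=-5.0, coefs=coefs, patient=patient)
print(f"predicted risk of fatal heart failure: {risk:.1%}")
```

In practice, tools like this are usually reduced to an integer point score so clinicians can apply them at the bedside without computing exponentials.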


Anesthesia & Analgesia | 2002

The association between heart rate and in-hospital mortality after coronary artery bypass graft surgery

Mary P. Fillinger; Stephen D. Surgenor; Gregg S. Hartman; Cantwell Clark; Thomas M. Dodds; Athos J. Rassias; William C. Paganelli; Peter Marshall; David Johnson; Dennis Kelly; Dean J. Galatis; Elaine M. Olmstead; Cathy S. Ross; Gerald T. O'Connor

Avoidance of tachycardia is a commonly described goal for anesthetic management during coronary artery bypass graft (CABG) surgery. However, an association between increased intraoperative heart rate and mortality has not been described. We conducted an observational study to evaluate the association between preinduction heart rate (heart rate upon arrival to the operating room) and in-hospital mortality during CABG surgery. Data were collected on 5934 CABG patients. Fifteen percent of patients had an increased preinduction heart rate ≥80 bpm. Crude mortality was significantly more frequent among patients with increased preinduction heart rate (Ptrend = 0.002). After adjustment for baseline differences among patients, preinduction heart rate ≥80 bpm remained associated with increased mortality (Ptrend < 0.001). The increased heart rate may be a cause of the observed mortality. Alternatively, faster heart rate may be either a marker of patients with irreversible myocardial damage, or a marker of patients with limited cardiac reserve at risk for further injury. Lastly, faster heart rate may be a marker for under-use of β-adrenergic blockade. Because the use of preoperative β-adrenergic blockade in CABG patients is associated with improved in-hospital survival, further investigation concerning the effect of intraoperative treatment of increased heart rate with β-adrenergic blockers on mortality after CABG surgery is warranted.


Anesthesia & Analgesia | 2012

Prevention of intravenous bacterial injection from health care provider hands: the importance of catheter design and handling.

Randy W. Loftus; Hetal M. Patel; Bridget C. Huysman; David P. Kispert; Matthew D. Koff; John D. Gallagher; Jens Jensen; John Rowlands; Sundara Reddy; Thomas M. Dodds; Mark P. Yeager; Kathryn L. Ruoff; Stephen D. Surgenor; Jeremiah R. Brown

BACKGROUND: Device-related bloodstream infections are associated with a significant increase in patient morbidity and mortality in multiple health care settings. Recently, intraoperative bacterial contamination of conventional open-lumen 3-way stopcock sets has been shown to be associated with increased patient mortality. Intraoperative use of disinfectable, needleless closed catheter devices (DNCCs) may reduce the risk of bacterial injection as compared to conventional open-lumen devices due to an intrinsic barrier to bacterial entry associated with valve design and/or the capacity for surface disinfection. However, the relative benefit of DNCC valve design (intrinsic barrier capacity) as compared to surface disinfection in attenuation of bacterial injection in the clinical environment is untested and entirely unknown. The primary aim of the current study was to investigate the relative efficacy of a novel disinfectable stopcock, the Ultraport zero, with and without disinfection in attenuating intraoperative injection of potential bacterial pathogens as compared to a conventional open-lumen stopcock intravascular device. The secondary aims were to identify risk factors for bacterial injection and to estimate the quantity of bacterial organisms injected during catheter handling. METHODS: Four hundred sixty-eight operating room environments were randomized by a computer-generated list to 1 of 3 device-injection schemes: (1) injection of the Ultraport zero stopcock with hub disinfection before injection, (2) injection of the Ultraport zero stopcock without prior hub disinfection, and (3) injection of the conventional open-lumen stopcock closed with sterile caps according to usual practice. After induction of general anesthesia, the primary anesthesia provider caring for patients in each operating room environment was asked to perform a series of 5 injections of sterile saline through the assigned device into an ex vivo catheter system.
The primary outcome was the incidence of bacterial contamination of the injected fluid column (effluent). Risk factors for effluent contamination were identified in univariate analysis, and a controlled laboratory experiment was used to generate an estimate of the bacterial load injected for contaminated effluent samples. RESULTS: The incidence of effluent bacterial contamination was 0% (0/152) for the Ultraport zero stopcock with hub disinfection before injection, 4% (7/162) for the Ultraport zero stopcock without hub disinfection before injection, and 3.2% (5/154) for the conventional open-lumen stopcock. The Ultraport zero stopcock with hub disinfection before injection was associated with a significant reduction in the risk of bacterial injection as compared to the conventional open-lumen stopcock (RR = 8.15 × 10−8, 95% CI, 3.39 × 10−8 to 1.96 × 10−7, P < 0.001), with an absolute risk reduction of 3.2% (95% CI, 0.5% to 7.4%). Provider glove use was a risk factor for effluent contamination (RR = 10.48, 95% CI, 3.16 to 34.80, P < 0.001). The estimated quantity of bacteria injected reached a clinically significant threshold of 50,000 colony-forming units per each injection series. CONCLUSIONS: The Ultraport zero stopcock with hub disinfection before injection was associated with a significant reduction in the risk of inadvertent bacterial injection as compared to the conventional open-lumen stopcock. Future studies should examine strategies designed to facilitate health care provider DNCC hub disinfection and proper device handling.
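The absolute risk reduction reported above follows directly from the raw event counts (the modelled relative risk of 8.15 × 10−8 cannot be reproduced by simple division, since the disinfected arm had zero events). A minimal check of the arithmetic:

```python
# Effluent contamination counts from the abstract: (events, injection series).
disinfected_hub = (0, 152)   # Ultraport zero with hub disinfection
open_lumen = (5, 154)        # conventional open-lumen stopcock

rate_disinfected = disinfected_hub[0] / disinfected_hub[1]
rate_open_lumen = open_lumen[0] / open_lumen[1]

arr = rate_open_lumen - rate_disinfected  # absolute risk reduction
print(f"ARR = {arr:.1%}")  # 3.2%
```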

Collaboration


Dive into Stephen D. Surgenor's collaborations.

Top Co-Authors

Robert A. Clough

Eastern Maine Medical Center


Robert E. Helm

The Dartmouth Institute for Health Policy and Clinical Practice
