David H. Salzman
Northwestern University
Publications
Featured research published by David H. Salzman.
Academic Emergency Medicine | 2012
David H. Salzman; Douglas S. Franzen; Katrina A. Leone; Chad S. Kessler
Assessment of practice-based learning and improvement (PBLI) is a core concept identified in several competency frameworks. This paper summarizes the current state of PBLI assessment as presented at the 2012 Academic Emergency Medicine consensus conference on education research in emergency medicine. Based on these findings and consensus achieved at the conference, seven recommendations have been identified for future research.
Journal of Emergency Medicine | 2014
Amer Z. Aldeen; David H. Salzman; Michael A. Gisondi; D. Mark Courtney
BACKGROUND The Emergency Medicine In-Training Examination (EMITE) is one of the only valid tools for medical knowledge assessment in current use by emergency medicine (EM) residencies. However, EMITE results return late in the academic year, providing little time to institute potential remediation. OBJECTIVE The goal of this study was to determine how accurately EM faculty can predict resident EMITE scores before the results are released. METHODS We asked EM faculty at the study site to predict the 2012 EMITE scores of the 50 EM residents 2 weeks before the results were available. The primary outcome was prediction accuracy, defined as the proportion of predictions within 6% of the actual score. The secondary outcome was prediction precision, defined as the mean deviation of predictions from the actual scores. We assessed several faculty background variables, including years of experience, educational leadership status, and clinical hours worked, for correlation with the two outcomes. RESULTS Thirty-two of the 38 faculty (84.2%, 95% confidence interval [CI] 69.6-92.6%) participated in the study, rendering a total of 1600 predictions for 50 residents. The mean resident EMITE score was 81.1% (95% CI 79.5-82.8%). Mean prediction accuracy for all faculty participants was 69% (95% CI 65.9-72.1%). Mean prediction precision was 5.2% (95% CI 4.9-5.5%). Educational leadership status was the only background variable correlated with the primary and secondary outcomes (Spearman's ρ = 0.51 and -0.53, respectively). CONCLUSION Faculty predict resident EMITE scores with only moderate accuracy. We recommend a multicenter study to evaluate the generalizability of the present results.
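The two outcomes defined in this abstract reduce to simple arithmetic over paired predicted and actual scores. The sketch below (in Python, using made-up example values; the function names and the reading of the 6% window as an absolute difference in percentage points are assumptions, not taken from the paper) shows how such accuracy and precision figures could be computed:

    # Sketch of the two outcomes defined in the abstract, on hypothetical data.
    def prediction_accuracy(predicted, actual, tolerance=6.0):
        # Proportion of predictions within `tolerance` percentage points of the actual score.
        hits = sum(abs(p - a) <= tolerance for p, a in zip(predicted, actual))
        return hits / len(predicted)

    def prediction_precision(predicted, actual):
        # Mean absolute deviation of predictions from the actual scores.
        return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(predicted)

    # Hypothetical example: one faculty member's predictions for five residents.
    predicted = [78.0, 85.0, 90.0, 72.0, 81.0]
    actual = [81.1, 79.5, 88.0, 80.0, 82.8]
    print(prediction_accuracy(predicted, actual))   # 0.8 (4 of 5 within 6 points)
    print(prediction_precision(predicted, actual))  # 4.08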
Teaching and Learning in Medicine | 2012
Danielle M. McCarthy; Katrina A. Leone; David H. Salzman; John A. Vozenilek; Kenzie A. Cameron
Background: The field of health literacy has closely examined the readability of written health materials to optimize patient comprehension. Few studies have examined spoken communication in a way that is comparable to analyses of written communication. Purpose: The study objective was to characterize the structural elements of residents’ spoken words while obtaining informed consent. Methods: Twenty-six resident physicians participated in a simulated informed consent discussion with a standardized patient. Audio recordings of the discussions were transcribed and analyzed to assess grammar statistics for evaluating language complexity (e.g., reading grade level). Transcripts and time values were used to assess structural characteristics of the dialogue (e.g., interactivity). Results: Discussions were characterized by physician verbal dominance. The discussions were interactive but showed significant differences between the physician and patient speech patterns for all language complexity metrics. Conclusions: In this study, physicians spoke significantly more and used more complex language than the patients.
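The abstract does not name the specific readability index used to estimate reading grade level; the Flesch-Kincaid grade level is one widely used option and is shown here only as an illustrative stand-in for the metric applied to the transcripts:

    # Flesch-Kincaid grade level, shown only as an example of a reading-grade-level metric;
    # the study may have used a different index.
    def flesch_kincaid_grade(total_words, total_sentences, total_syllables):
        return (0.39 * (total_words / total_sentences)
                + 11.8 * (total_syllables / total_words)
                - 15.59)

    # Hypothetical transcript excerpt: 120 words, 8 sentences, 180 syllables.
    print(round(flesch_kincaid_grade(120, 8, 180), 1))  # 8.0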
Western Journal of Emergency Medicine | 2017
Andrew R Ketterer; David H. Salzman; Jeremy Branzetti; Michael A. Gisondi
Introduction Emergency medicine (EM) residency programs may be 36 or 48 months in length. The Residency Review Committee for EM requires that 48-month programs provide educational justification for the additional 12 months. We developed additional milestones that EM training programs might use to assess outcomes in domains that meet this accreditation requirement. This study aims to assess the content validity of these supplemental milestones using a methodology similar to that of the original EM Milestones validation study. Methods A panel of EM program directors (PD) and content experts at two institutions identified domains of additional training not covered by the existing EM Milestones. This led to the development of six novel subcompetencies: “Operations and Administration,” “Critical Care,” “Leadership and Management,” “Research,” “Teaching and Learning,” and “Career Development.” Subject-matter experts at other 48-month EM residency programs refined the milestones for these subcompetencies. PDs of all 48-month EM programs were then asked to order the proposed milestones using the Dreyfus model of skill acquisition for each subcompetency. Data analysis mirrored that used in the original EM Milestones validation study, leading to the final version of our supplemental milestones. Results Twenty of 33 subjects (58.8%) completed the study. No subcompetency or individual milestone met deletion criteria. Of the 97 proposed milestones, 67 (69.1%) required no further editing and remained at the same level as proposed by the study authors. Thirty milestones underwent level changes: 15 (15.5%) were moved one level up and 13 (13.4%) were moved one level down. One milestone (1.0%) in “Leadership and Management” was moved two levels up, and one milestone in “Operations and Administration” was moved two levels down. One milestone in “Research” was ranked by the survey respondents one level higher than that proposed by the authors; however, this milestone was kept at its original level assignment. Conclusion Six additional subcompetencies were generated and assessed for content validity using the same methodology as was used to validate the current EM Milestones. These optional milestones may serve as an additional set of assessment tools that will allow EM residency programs to report these additional educational outcomes using a familiar milestone rubric.
AEM Education and Training | 2017
Dave W. Lu; Scott M. Dresden; D. Mark Courtney; David H. Salzman
Burnout is prevalent among emergency medicine (EM) physicians, and physicians experiencing burnout are more likely to report committing medical errors or delivering suboptimal care. The relationship between physician burnout and identifiable differences in clinical care, however, remains unclear. We examined whether EM trainee burnout was associated with differences in clinical performance, using high-fidelity simulation as a proxy for patient care.
MedEdPORTAL Publications | 2017
Trevor J. Barnum; David H. Salzman; David D. Odell; Elizabeth Even; Anna Reczynski; Julia Corcoran; Amy L. Halverson
Introduction The operating room is a complex environment in which individual team members perform specific tasks according to their roles. A simulation activity was created to introduce medical students on the surgery clerkship to issues relating to patient safety, infection control, and regulatory requirements. Methods This activity takes place before students' operative experiences on the general surgery rotation and addresses the need for students to practice the roles they will perform while participating in patient care. The activity includes a simulated operation, an assessment, and a scripted debriefing. Among other tasks, students practice safe patient transfer and monitoring, donning sterile garb, preparing the surgical site, and being active participants in a sign-in and time-out. Students are assessed on their assigned tasks, their ability to maintain sterility, and the degree to which they engage with their team. Results Students reported that the simulation helped them better understand how they could become involved on their first day in the operating room. Students also reported that they were more confident when in the operating room. This finding extended even to students who had already been in the operating room during a prior OB/GYN rotation. Discussion Patient safety is paramount in the operating room, and this simulation activity fills a current gap in students' practical knowledge as they prepare to enter their surgery clerkship. Giving medical students the information and skills needed to be safe and effective members of the operating team before entering the operating room benefits the surgical team, students, and patients.
Advances in Simulation | 2017
David H. Salzman; Diane B. Wayne; Walter Eppich; Eric S. Hungness; Mark Adler; Christine S. Park; Katherine A. Barsness; William C. McGaghie; Jeffrey H. Barsuk
This article describes the development, implementation, and modification of an institutional process to evaluate and fund graduate medical education simulation curricula. The goals of this activity were to (a) establish a standardized mechanism for proposal submission and evaluation, (b) identify simulation-based medical education (SBME) curricula that would benefit from mentored improvement before implementation, and (c) ensure that funding decisions were fair and defensible. Our intent was to develop a process that was grounded in sound educational principles, allowed for efficient administrative oversight, ensured approved courses were high quality, encouraged simulation education research and scholarship, and provided opportunities for medical specialties that had not previously used SBME to receive mentoring and faculty development.
Archive | 2010
David H. Salzman; Michael A. Gisondi
American Journal of Medical Quality | 2015
Paul Jansson; Yuemi An-Grogan; Susan Eller; Donna M. Woods; Amy V. Kontrick; David H. Salzman
Annals of Emergency Medicine | 2012
David H. Salzman; N. Hartman; M. Marinelli; N. Olson; M. Patton; Amer Z. Aldeen