Network


Latest external collaborations at the country level. Click a dot to explore the details.

Hotspot


Dive into the research topics where Amanda R. Burden is active.

Publication


Featured research published by Amanda R. Burden.


Journal of Continuing Education in the Health Professions | 2012

Simulation for maintenance of certification in anesthesiology: The first two years

William R. McIvor; Amanda R. Burden; Matthew B. Weinger; Randolph H. Steadman

The ultimate goal of physician education is the application of knowledge and skills to patient care. The Maintenance of Certification (MOC) for Anesthesiologists program incorporates mannequin‐based simulation to help realize this goal. Results from the first 2 years of experience suggest that 583 physician participants transferred knowledge and skills from their simulated experiences into real‐world practice. Participants consistently found the experience educationally valuable and clinically relevant, and reported that it led to changes in practice. This first experience with mannequin‐based simulation for MOC indicates that physicians accept this teaching modality, many with enthusiasm. Simulation education addresses many of the identified intentions of current continuing medical education (CME) and can help educators realize goals for educating physician‐learners.


Journal of Clinical Anesthesia | 2012

Prevention of central venous catheter-related bloodstream infections: is it time to add simulation training to the prevention bundle?

Amanda R. Burden; Marc C. Torjman; George E. Dy; Jonathan Douglas Jaffe; Jeffrey J. Littman; Fiorella Nawar; S. Sujanthy Rajaram; Christa Schorr; Gregory W. Staman; Annette C. Reboli

STUDY OBJECTIVE To study the impact of adding simulation-based education to the pre-intervention mandatory hospital efforts aimed at decreasing central venous catheter-related bloodstream infections (CRBSI) in intensive care units (ICU). DESIGN Pre- and post-intervention retrospective observational investigation. SETTING A 24-bed ICU in a 562-bed university-affiliated, urban teaching hospital. PATIENTS ICU patients from July 2004 to June 2008 were studied for the development of CRBSI. PRE-INTERVENTION Mandatory staff and physician education began in 2004 to reduce CRBSI. The CRBSI-prevention program included online and didactic courses and a pre- and post-test. Elements of the pre-intervention efforts included hand hygiene, full barrier precautions, chlorhexidine skin preparation, and mask, gown, gloves, and hat protection for operators. A catheter-insertion cart containing all supplies and a checklist was a mandatory element of this program; a nurse was empowered to stop the procedure for non-performance of checklist items. INTERVENTION As of July 1, 2006, a mandatory simulation-based program for all intern, resident, and fellow physicians was added to teach central venous catheter (CVC) insertion. MEASUREMENTS Data collected pre- and post-intervention were CRBSI incidence, number of ICU catheter days, mortality, laboratory pathogen results, and costs. MAIN RESULTS The pre-intervention CRBSI incidence of 6.47/1,000 catheter days was reduced significantly to 2.44/1,000 catheter days post-intervention (58%; P < 0.05), resulting in a $539,902 (USD) savings (47%), attributed to shorter ICU and hospital lengths of stay. CONCLUSIONS Following simulation-based CVC program implementation, CRBSI incidence and costs were significantly reduced for two years post-intervention.


Anesthesiology | 2015

Practice Improvements Based on Participation in Simulation for the Maintenance of Certification in Anesthesiology Program

Randolph H. Steadman; Amanda R. Burden; Yue Ming Huang; David M. Gaba; Jeffrey B. Cooper

Background: This study describes anesthesiologists’ practice improvements undertaken during the first 3 yr of simulation activities for the Maintenance of Certification in Anesthesiology Program. Methods: A stratified sampling of 3 yr (2010–2012) of participants’ practice improvement plans was coded, categorized, and analyzed. Results: Using the sampling scheme, 634 of 1,275 participants in Maintenance of Certification in Anesthesiology Program simulation courses were evaluated from the following practice settings: 41% (262) academic, 54% (339) community, and 5% (33) military/other. A total of 1,982 plans were analyzed for completion, target audience, and topic. On follow-up, 79% (1,558) were fully completed, 16% (310) were partially completed, and 6% (114) were not completed within the 90-day reporting period. Plans targeted the reporting individual (89% of plans) and others (78% of plans): anesthesia providers (50%), non-anesthesia physicians (16%), and non-anesthesia non-physician providers (26%). From the plans, 2,453 improvements were categorized as work environment or systems changes (33% of improvements), teamwork skills (30%), personal knowledge (29%), handoff (4%), procedural skills (3%), or patient communication (1%). The median word count was 63 (interquartile range, 30 to 126) for each participant’s combined plans and 147 (interquartile range, 52 to 257) for improvement follow-up reports. Conclusions: After making a commitment to change, 94% of anesthesiologists participating in a Maintenance of Certification in Anesthesiology Program simulation course successfully implemented some or all of their planned practice improvements. This compares favorably to rates in other studies. Simulation experiences stimulate active learning and motivate personal and collaborative practice improvement changes. Further evaluation will assess the impact of the improvements and further refine the program.


Anesthesiology | 2014

This is not a test!: Misconceptions surrounding the maintenance of certification in anesthesiology simulation course.

Matthew B. Weinger; Amanda R. Burden; Randolph H. Steadman; David M. Gaba

The Maintenance of Certification in Anesthesiology (MOCA®) Simulation Course is an important element of Part IV (Practice Performance Assessment and Improvement) of the American Board of Anesthesiology’s MOCA program. These Courses are offered at endorsed programs that form the American Society of Anesthesiologists’ (ASA) Simulation Education Network. Although the MOCA Simulation Course has been described previously,1 discussions with ASA members suggest that misunderstandings remain about several aspects of the MOCA Simulation Course (the “Course”) and its place in the overall MOCA program. We would like to clarify the nature and conduct of the MOCA Simulation Courses vis-à-vis the goal of honing the skills of board-certified anesthesiologists (“Anesthesiologists”). Although the American Board of Anesthesiology sets the requirement for the MOCA program, the ASA’s Simulation Editorial Board (SEB) is responsible for overseeing the content and conduct of the simulation experiences. The SEB has established core Course requirements but provides latitude to the endorsed centers to “do what they do best” and to determine their own course scheduling and fees. The Course is an interactive experience designed to stimulate participants to create and subsequently engage in meaningful practice improvement activities. It is a 6- to 8-h immersive learning experience, held in an ASA-endorsed simulation center, that focuses on the management of challenging clinical events. The Course must address both the medical and technical skills of managing acute perioperative situations as well as the nontechnical skills of dynamic decision making and team management. A Course goal is to help participants identify possible system issues and approaches to improve patient care in their individual practices. Every participant is the primary anesthesiologist in at least one simulation scenario. During a scenario, they work with other participants as well as with role-playing instructors or staff as a clinical management team. Each scenario is followed by a detailed instructor-facilitated debriefing in which participants reflect on what transpired and articulate lessons to improve their own practices. To achieve endorsement, among other criteria, a center must describe its various Course policies (e.g., confidentiality and cancelation) and provide evidence that its instructors can conduct simulations and debriefings of experienced clinicians with skill and sensitivity. We emphasize that the MOCA Simulation Course is NOT A TEST. There are no individual or team scores or performance evaluations. Debriefing discussions address practice improvement, focusing on what lessons can be drawn from the scenario and how they can be applied to actual patient care. The Course provides an opportunity for each participant to reflect on their own performance, and that of their peers, with constructive feedback from the instructors and other course participants. The MOCA Simulation Course culminates in the creation of practice improvement plans by participants.


Anesthesiology | 2017

Simulation-based Assessment of the Management of Critical Events by Board-certified Anesthesiologists.

Matthew B. Weinger; Arna Banerjee; Amanda R. Burden; William R. McIvor; John R. Boulet; Jeffrey B. Cooper; Randolph H. Steadman; Matthew S. Shotwell; Jason Slagle; Samuel DeMaria; Laurence C. Torsher; Elizabeth Sinz; Adam I. Levine; John P. Rask; Fred Davis; Christine S. Park; David M. Gaba

Background: We sought to determine whether mannequin-based simulation can reliably characterize how board-certified anesthesiologists manage simulated medical emergencies. Our primary focus was to identify gaps in performance and to establish psychometric properties of the assessment methods. Methods: A total of 263 consenting board-certified anesthesiologists participating in existing simulation-based maintenance of certification courses at one of eight simulation centers were video recorded performing simulated emergency scenarios. Each participated in two 20-min, standardized, high-fidelity simulated medical crisis scenarios, once each as primary anesthesiologist and first responder. Via a Delphi technique, an independent panel of expert anesthesiologists identified critical performance elements for each scenario. Trained, blinded anesthesiologists rated video recordings using standardized rating tools. Measures included the percentage of critical performance elements observed and holistic (one to nine ordinal scale) ratings of participants’ technical and nontechnical performance. Raters also judged whether the performance was at a level expected of a board-certified anesthesiologist. Results: Rater reliability for most measures was good. In 284 simulated emergencies, participants were rated as successfully completing 81% (interquartile range, 75 to 90%) of the critical performance elements. The median rating of both technical and nontechnical holistic performance was five, distributed across the nine-point scale. Approximately one-quarter of participants received low holistic ratings (i.e., three or less). Higher-rated performances were associated with younger age but not with previous simulation experience or other individual characteristics. Calling for help was associated with better individual and team performance. Conclusions: Standardized simulation-based assessment identified performance gaps informing opportunities for improvement. If a substantial proportion of experienced anesthesiologists struggle with managing medical emergencies, continuing medical education activities should be reevaluated.


BMJ Simulation and Technology Enhanced Learning | 2017

Performance gaps and improvement plans from a 5-hospital simulation programme for anaesthesiology providers: a retrospective study

Samuel DeMaria; Adam C. Levine; Philip Petrou; David L. Feldman; Patricia Kischak; Amanda R. Burden; Andrew Goldberg

Background Simulation is increasingly employed in healthcare provider education, but usage as a means of identifying system-wide practitioner gaps has been limited. We sought to determine whether practice gaps could be identified, and if meaningful improvement plans could result from a simulation course for anaesthesiology providers. Methods Over a 2-year cycle, 288 anaesthesiologists and 67 certified registered nurse anaesthetists (CRNAs) participated in a 3.5-hour, malpractice insurer-mandated simulation course, encountering 4 scenarios. 5 anaesthesiology departments within 3 urban academic healthcare systems were represented. A real-time rater scored each individual on 12 critical performance items (CPIs) representing learning objectives for a given scenario. Participants completed a course satisfaction survey, a 1-month postcourse practice improvement plan (PIP) and a 6-month follow-up survey. Results All recorded course data were retrospectively reviewed. Course satisfaction was generally positive (88–97% positive rating by item). 4231 individual CPIs were recorded (of a possible 4260 rateable), with a majority of participants demonstrating remediable gaps in medical/technical and non-technical skills (97% of groups had at least one instance of a remediable gap in communication/non-technical skills during at least one of the scenarios). 6 months following the course, 91% of respondents reported successfully implementing 1 or more of their PIPs. Improvements in equipment/environmental resources or personal knowledge domains were most often successful, and several individual reports demonstrated a positive impact on actual practice. Conclusions This professional liability insurer-initiated simulation course for 5 anaesthesiology departments was feasible to deliver and well received. Practice gaps were identified during the course, and remediation of gaps and/or application of new knowledge, skills and resources was reported by participants.


Anesthesiology | 2017

Frequency and Type of Situational Awareness Errors Contributing to Death and Brain Damage: A Closed Claims Analysis

Christian Schulz; Amanda R. Burden; Karen L. Posner; Shawn Mincer; Randolph H. Steadman; Klaus Wagner; Karen B. Domino

BACKGROUND Situational awareness errors may play an important role in the genesis of patient harm. The authors examined closed anesthesia malpractice claims for death or brain damage to determine the frequency and type of situational awareness errors. METHODS Surgical and procedural anesthesia death and brain damage claims in the Anesthesia Closed Claims Project database were analyzed. Situational awareness error was defined as failure to perceive relevant clinical information, failure to comprehend the meaning of available information, or failure to project, anticipate, or plan. Patient and case characteristics, primary damaging events, and anesthesia payments in claims with situational awareness errors were compared to other death and brain damage claims from 2002 to 2013. RESULTS Anesthesiologist situational awareness errors contributed to death or brain damage in 198 of 266 claims (74%). Respiratory system damaging events were more common in claims with situational awareness errors (56%) than other claims (21%, P < 0.001). The most common specific respiratory events in error claims were inadequate oxygenation or ventilation (24%), difficult intubation (11%), and aspiration (10%). Payments were made in 85% of situational awareness error claims compared to 46% in other claims (P = 0.001), with no significant difference in payment size. Among 198 claims with anesthesia situational awareness error, perception errors were most common (42%), whereas comprehension errors (29%) and projection errors (29%) were relatively less common. CONCLUSIONS Situational awareness error definitions were operationalized for reliable application to real-world anesthesia cases. Situational awareness errors may have contributed to catastrophic outcomes in three quarters of recent anesthesia malpractice claims. Situational awareness errors resulting in death or brain damage remain prevalent causes of malpractice claims in the 21st century.


Anesthesiology Clinics | 2018

Use of Simulation in Performance Improvement

Amanda R. Burden; Erin W. Pukenas

Human error and system failures continue to play a substantial role in preventable errors that lead to adverse patient outcomes or death. Many of these deaths are not the result of inadequate medical knowledge and skill, but occur because of problems involving communication and team management. Anesthesiologists pioneered the use of simulation for medical education in an effort to improve physician performance and patient safety. This article explores the use of simulation for performance improvement. Educational theories that underlie effective simulation programs are described as driving forces behind the advancement of simulation in performance improvement.


Simulation in Healthcare | 2013

Cardiac Arrest in an Obstetric Patient: A Simulated Emergency

Amanda R. Burden; Greg Staman; Erin W. Pukenas

DEMOGRAPHICS Case Title: Cardiac Arrest in an Obstetric Patient: A Simulated Emergency. Patient Name: Jane Winters. Case Description and Diagnosis: A healthy 32-year-old pregnant woman develops pulseless electrical activity (PEA) arrest at 33 weeks' gestation. Simulation Scenario Developers: Amanda Burden, MD; Greg Staman, RN; and Erin Pukenas, MD. Target Audience: Anesthesia and obstetrics (OB) and gynecology, as well as certified registered nurse anesthetists.


Archive | 1993

Crisis Management in Anesthesiology

David M. Gaba; Kevin J. Fish; Steven K. Howard; Amanda R. Burden

Collaboration


Dive into Amanda R. Burden's collaboration.

Top Co-Authors

Marc C. Torjman
Thomas Jefferson University

Gregory W. Staman
Cooper University Hospital

Samuel DeMaria
Icahn School of Medicine at Mount Sinai