Publications


Featured research published by Angela Blood.


Neurology | 2015

Neurology objective structured clinical examination reliability using generalizability theory

Angela Blood; Yoon Soo Park; Rimas V. Lukas; James R. Brorson

Objectives: This study examines factors affecting reliability, or consistency of assessment scores, from an objective structured clinical examination (OSCE) in neurology through generalizability theory (G theory). Methods: Data include assessments from a multistation OSCE taken by 194 medical students at the completion of a neurology clerkship. Facets evaluated in this study include cases, domains, and items. Domains refer to areas of skill (or constructs) that the OSCE measures. G theory is used to estimate variance components associated with each facet, derive reliability, and project the number of cases required to obtain a reliable (consistent, precise) score. Results: Reliability using G theory is moderate (Φ coefficient = 0.61, G coefficient = 0.64). Performance is similar across cases but differs by the particular domain, such that the majority of variance is attributed to the domain. Projections in reliability estimates reveal that students need to participate in 3 OSCE cases in order to increase reliability beyond the 0.70 threshold. Conclusions: This novel use of G theory in evaluating an OSCE in neurology provides meaningful measurement characteristics of the assessment. Differing from prior work in other medical specialties, the cases students were randomly assigned did not influence their OSCE score; rather, scores varied in expected fashion by domain assessed.
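As context for the projection described above: in generalizability theory, the G (relative) and Φ (absolute) coefficients follow directly from the estimated variance components, and a decision study (D-study) recomputes them for different numbers of cases. Below is a minimal sketch for a simplified persons-by-cases design, using hypothetical variance components rather than the study's estimates:

```python
# D-study projection for a simplified persons x cases G-theory design.
# Variance components here are hypothetical, not the study's estimates.

def d_study(var_p, var_c, var_pc, n_cases):
    """Project G (relative) and Phi (absolute) coefficients when
    scores are averaged over n_cases cases."""
    rel_error = var_pc / n_cases                # relative error variance
    abs_error = (var_c + var_pc) / n_cases      # absolute error variance
    g = var_p / (var_p + rel_error)
    phi = var_p / (var_p + abs_error)
    return g, phi

# Hypothetical components: person, case, and person-by-case residual
var_p, var_c, var_pc = 0.10, 0.01, 0.12

for n in (1, 2, 3, 4, 5):
    g, phi = d_study(var_p, var_c, var_pc, n)
    print(f"{n} case(s): G = {g:.2f}, Phi = {phi:.2f}")
```

Averaging over more cases shrinks the error variances, which is how a D-study identifies the number of cases needed to push reliability past a threshold such as 0.70.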


Journal of the Neurological Sciences | 2017

Ambulatory training in neurology education

Rimas V. Lukas; Angela Blood; James R. Brorson; Dara V. Albert

Much of the care provided by practicing neurologists takes place in outpatient clinics. However, neurology trainees often have limited exposure to this setting. Adequate incorporation of outpatient care into neurology training is vital; however, it is often hampered by numerous challenges. We detail a number of these challenges and suggest potential means for improvement.


Journal of Clinical Neuroscience | 2016

Breadth versus volume: Neurology outpatient clinic cases in medical education

Dara V. Albert; Angela Blood; Yoon Soo Park; James R. Brorson; Rimas V. Lukas

This study examined how volume in certain patient case types and breadth across patient case types in the outpatient clinic setting relate to Neurology Clerkship student performance. Case logs from the outpatient clinic experience of 486 students from The University of Chicago Pritzker School of Medicine, USA, participating in the 4-week Neurology Clerkship from July 2008 to June 2013 were reviewed. A total of 12,381 patient encounters were logged and then classified into 13 diagnostic categories. We analyzed how volume of cases within categories and breadth of cases across categories relate to scores on the National Board of Medical Examiners Clinical Subject Examination for Neurology and a Neurology Clerkship Objective Structured Clinical Examination. Volume of cases was significantly correlated with the National Board of Medical Examiners Clinical Subject Examination for Neurology (r=.290, p<.001), the Objective Structured Clinical Examination physical examination (r=.236, p=.011), and the Objective Structured Clinical Examination patient note (r=.238, p=.010). Breadth of cases was significantly correlated with the National Board of Medical Examiners Clinical Subject Examination for Neurology (r=.231, p=.017), but was not significantly correlated with any component of the Objective Structured Clinical Examination. Volume of cases correlated with higher performance on measures of specialty knowledge and clinical skill. Fewer relationships emerged correlating breadth of cases with performance on the same measures. This study provides guidance to educators who must decide how much emphasis to place on volume versus breadth of cases in outpatient clinic learning experiences.
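The analysis described above is a standard Pearson correlation of per-student case counts against exam scores. A minimal sketch on synthetic data follows; the variable names and values are illustrative, not the study's case logs:

```python
# Pearson correlation of case volume against exam score, as in the
# analysis above. Synthetic data; r and p will not match the study's.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
volume = rng.integers(15, 45, size=486)            # cases logged per student
score = 70 + 0.3 * volume + rng.normal(0, 6, 486)  # exam score with noise

r, p = pearsonr(volume, score)
print(f"r = {r:.3f}, p = {p:.3f}")
```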


Journal of Clinical Neuroscience | 2014

Assessment of neurological clinical management reasoning in medical students

Rimas V. Lukas; Angela Blood; Yoon Soo Park; James R. Brorson

In neurology education there is evidence that trainees may have greater ability in general localization and diagnosis than in treatment decisions, particularly those involving longer-term and supportive care. We hypothesized that medical students completing a neurology clerkship would exhibit greater skill in acute diagnostic and therapeutic management than in supportive management measures. Data from 720 standardized patient encounters, in which 360 medical students completing a neurology clerkship were evaluated via an objective structured clinical examination, were analyzed for skill in three components of clinical decision making: diagnostic evaluation, therapeutic intervention, and supportive intervention. Scores for all standardized patient encounters over the 2008-2012 interval revealed a significantly higher percentage of correct responses in both the diagnostic (mean [M]=62.6%, standard deviation [SD]=20.3%) and therapeutic (M=63.0%, SD=28.8%) categories in comparison to the supportive (M=31.8%, SD=45.2%) category. However, only scores in therapeutic and supportive treatment plans were found to be significant predictors of the USA National Board of Medical Examiners (NBME) clinical neurology subject examination scores; on average, a percent increase in therapeutic and supportive scores led to 5- and 2-point increases in NBME scores, respectively. We demonstrate empirical evidence of deficits in a specific component of clinical reasoning in medical students at the completion of a neurology clerkship.
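A minimal sketch of the kind of regression implied by the prediction result above, fitting NBME scores on the three OSCE component scores. The data and coefficients are synthetic and will not reproduce the study's estimates:

```python
# Ordinary least squares: predict NBME subject exam scores from the three
# OSCE component scores. Synthetic data; coefficients are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 360
diagnostic = rng.uniform(0, 100, n)
therapeutic = rng.uniform(0, 100, n)
supportive = rng.uniform(0, 100, n)
nbme = 60 + 0.05 * therapeutic + 0.02 * supportive + rng.normal(0, 4, n)

X = np.column_stack([np.ones(n), diagnostic, therapeutic, supportive])
coef, *_ = np.linalg.lstsq(X, nbme, rcond=None)
for name, b in zip(["intercept", "diagnostic", "therapeutic", "supportive"], coef):
    print(f"{name}: {b:.3f}")
```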


Hypertension in Pregnancy | 2015

A modified Delphi method to create a scoring system for assessing team performance during maternal cardiopulmonary arrest

Jennifer M. Banayan; Angela Blood; Yoon Soo Park; Sajid Shahul; Barbara M. Scavone

Background: Maternal cardiopulmonary arrest is a rare but often fatal emergency. The authors used a modified Delphi method to create a checklist of tasks for practitioners. Methods: Within each round, experts ranked tasks on a scale from zero through five. Consensus was defined a priori as 80% exact agreement. Results: Three rounds were required to achieve consensus, resulting in a checklist of 45 tasks. Round One revealed five tasks with 80% exact agreement, Round Two 25 tasks, and Round Three 29 tasks. Conclusions: The modified Delphi method resulted in a weighted scoring system that can be used to objectively assess team performance.
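The consensus rule described above (80% exact agreement on the zero-to-five scale) is simple to operationalize. A minimal sketch with hypothetical expert ratings, not the study's data:

```python
# Exact-agreement consensus check for one Delphi round.
# Ratings below are hypothetical examples, not the study's data.
from collections import Counter

def exact_agreement(ratings):
    """Fraction of experts giving the single most common rating."""
    top_count = Counter(ratings).most_common(1)[0][1]
    return top_count / len(ratings)

def consensus_items(round_ratings, threshold=0.80):
    """Indices of tasks whose ratings meet the agreement threshold."""
    return [i for i, r in enumerate(round_ratings)
            if exact_agreement(r) >= threshold]

# Seven hypothetical experts rating three tasks on the 0-5 scale
round_one = [
    [5, 5, 5, 5, 5, 5, 4],   # 6/7 = 86% agreement -> consensus
    [3, 4, 5, 2, 4, 4, 3],   # no single rating reaches 80%
    [5, 5, 5, 5, 5, 5, 5],   # unanimous -> consensus
]
print(consensus_items(round_one))  # [0, 2]
```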


Simulation in Healthcare: Journal of the Society for Simulation in Healthcare | 2013

Board 438 - Research Abstract: The Use of Simulation to Teach Professionalism in Graduate Medical Education: A Systematic Review of the Literature (Submission #908)

Eisha Wali; Angela Blood; Jayant M. Pinto; Stephen D. Small; Elizabeth A. Blair

Introduction/Background: The Accreditation Council for Graduate Medical Education (ACGME) requires that residency programs ensure that trainees receive appropriate training in professionalism.1 However, surveys suggest that current medical education in professionalism is inadequate.2,3 Simulation has been applied to competency-based training in various forms and settings,4-6 but there is little available discussion of its application to professionalism curricula,4,5 although a role for simulation has been acknowledged in this domain. Although several graduate medical education (GME) programs have utilized simulation to teach professionalism, to date there has not been a comprehensive literature review that specifically examines the methodology used10-13 or provides details on how best to accomplish this on a practical basis.7-9 Here, we provide a resource that identifies current, effective training methods in simulation-based professionalism education and critically explore areas for future improvement in this field. Methods: Using standard protocols for performing systematic reviews,14,15 a search of the English literature for the terms professionalism and simulation was conducted with a clinical librarian in five databases: PubMed, EMBASE, Web of Science, PsycINFO, and OVID, across all available publication dates. Pearl culturing using subject headings of relevant articles found in PubMed and EMBASE (MeSH and EMTREE terms, respectively) guided further searches in those databases to ensure that relevant articles not indexed were included. Two independent raters with experience in simulation and medical education screened abstracts according to the following criteria: 1) the study focused on GME; 2) a detailed simulation method was provided; 3) training or evaluating professionalism (by the ACGME definition) was a specific objective of the simulation. Full-text articles were consulted if the abstract was ambiguous. A third rater resolved disagreements between the initial raters. Simulation was defined broadly as a representation or imitation of a clinical interaction in which the trainee is an active participant (e.g., computer-based simulation, standardized patients, computerized manikins). Articles that met criteria were then reviewed in detail by two independent raters against the following criteria in order to glean current best practices: 1) How was professionalism defined? Was a framework for teaching or assessing professionalism included? 2) Was professionalism the primary objective of the curriculum, or was professionalism incorporated into an existing curriculum focused on another topic? 3) What modality of simulation was utilized? 4) What other educational strategies made up the curriculum? 5) Was professionalism assessed using simulation? Were learner outcomes reported? Results: A total of 695 articles were identified. Analysis of these revealed three primary themes to consider when implementing a simulation-based professionalism curriculum: 1) simulation for teaching versus assessment purposes; 2) simulation incorporated into a broader professionalism curriculum; and 3) professionalism with respect to colleagues and coworkers versus professionalism with respect to patients and families. We plotted the studies along the continuum of these themes and discuss the logistics of each approach along with the associated benefits and limitations.
Regarding modality, clinical educators are using multiple methods to teach professionalism, including team-based scenarios, computerized manikins, and standardized patients. Few publications discuss the incorporation of more than one modality to achieve a given professionalism criterion; however, many educators utilize simulation as an adjunct to other educational strategies such as didactics or case-based learning encounters. Few publications include the scenario materials, making validation or reproduction by other educators difficult. Conclusion: Our work answers the question: how are clinical educators in GME utilizing simulation to teach professionalism? By synthesizing experiences across specialties and institutions into a comprehensive resource, we elucidate current best practices in simulation-based professionalism training. The goal of this work is to encourage the use of this innovative method to fulfill the ACGME goals of improved training in professionalism. References: 1. ACGME. Common program requirements. 2013. http://www.acgme-nas.org/assets/pdf/CPR_Categorization_07012013.pdf. Accessed April 13, 2013. 2. Francesca Monn M, Wang MH, Gilson MM, Chen B, Kern D, Gearhart SL. ACGME core competency training, mentorship, and research in surgical subspecialty fellowship programs. J Surg Educ. 2013;70(2):180-188. doi: 10.1016/j.jsurg.2012.11.006. 3. Kesselheim JC, Sectish TC, Joffe S. Education in professionalism: Results from a survey of pediatric residency program directors. J Grad Med Educ. 2012;4(1):101-105. doi: 10.4300/JGME-D-11-00110.1. 4. Issenberg SB, Chung HS, Devine LA. Patient safety training simulations based on competency criteria of the Accreditation Council for Graduate Medical Education. Mt Sinai J Med. 2011;78(6):842-853. doi: 10.1002/msj.20301. 5. Aggarwal R, Mytton OT, Derbrew M, et al. Training and simulation for patient safety. Qual Saf Health Care. 2010;19 Suppl 2:i34-43. doi: 10.1136/qshc.2009.038562. 6. Son J, Zeidler KR, Echo A, Otake L, Ahdoot M, Lee GK. Teaching core competencies of reconstructive microsurgery with the use of standardized patients. Ann Plast Surg. 2013;70(4):476-481. 7. Jotkowitz AB, Glick S, Porath A. A physician charter on medical professionalism: A challenge for medical education. Eur J Intern Med. 2004;15(1):5-9. doi: 10.1016/j.ejim.2003.11.002. 8. Mueller PS. Incorporating professionalism into medical education: The Mayo Clinic experience. Keio J Med. 2009;58(3):133-143. 9. O'Sullivan H, van Mook W, Fewtrell R, Wass V. Integrating professionalism into the curriculum: AMEE guide no. 61. Med Teach. 2012;34(2):e64-77. doi: 10.3109/0142159X.2012.655610. 10. Zabar S, Ark T, Gillespie C, et al. Can unannounced standardized patients assess professionalism and communication skills in the emergency department? Acad Emerg Med. 2009;16(9):915-918. doi: 10.1111/j.1553-2712.2009.00510.x. 11. Schmitz CC, Chipman JG, Luxenberg MG, Beilman GJ. Professionalism and communication in the intensive care unit: Reliability and validity of a simulated family conference. Simul Healthc. 2008;3(4):224-238. doi: 10.1097/SIH.0b013e31817e6149. 12. Ponton-Carss A, Hutchison C, Violato C. Assessment of communication, professionalism, and surgical skills in an objective structured performance-related examination (OSPRE): A psychometric study. Am J Surg. 2011;202(4):433-440. doi: 10.1016/j.amjsurg.2010.07.045. 13. Ozuah PO, Reznik M. Using unannounced standardised patients to assess residents' professionalism. Med Educ. 2008;42(5):532-533. doi: 10.1111/j.1365-2923.2008.03083.x. 14. Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Ann Intern Med. 2009;151(4):264-269. doi: 10.7326/0003-4819-151-4-200908180-00135. 15. Cook DA, West CP. Conducting systematic reviews in medical education: A stepwise approach. Med Educ. 2012;46(10):943-952. doi: 10.1111/j.1365-2923.2012.04328.x. Disclosures: Gold Foundation Research Institute Grant.


Simulation in Healthcare: Journal of the Society for Simulation in Healthcare | 2013

Board 312 - Research Abstract: Privileging with Simulation-Based Assessment for Hospital-Based Procedures: A Survey of Current Programs at Simulation Centers in the United States (Submission #911)

Angela Blood; Nilam Soni; Stephen D. Small

Introduction/Background: Privileging is the process by which an institution grants permission to a provider to perform patient care activities. When a hospital grants privileges, it "explicitly endows a [provider] with the right to perform that procedure within its physical and administrative confines".2 The use of simulation-based assessment (SBA) allows providers to practice and be assessed on a predetermined set of procedures in a controlled and standardized setting until acceptable performance is reached.23 As privileging programs shift toward competency-based assessment, simulation is being utilized both to gain and to maintain privileges.25 However, the scope of existing privileging programs that utilize SBA, and exactly how SBA is incorporated, remains unclear. The objective of this study was to survey simulation centers regarding existing and planned privileging programs that incorporate SBA. Methods: Ninety-two simulation centers were included in the survey. For feasibility, only simulation centers accredited by either the American College of Surgeons (ACS) or the Society for Simulation in Healthcare (SSiH) were included. The survey took approximately 20-30 minutes to complete and was addressed to directors of simulation centers. Survey data were collected from March through April 2013. Results: Twenty-seven simulation centers responded (29%). The majority of centers were associated with public hospitals (62%); 33% were associated with private hospitals. All were associated with teaching hospitals. A majority of centers had approximately 5,000-10,000 square feet, and the number of full-time employees (FTEs) ranged widely, from 1 to 52. Nineteen procedures were included in the survey, determined based on a review of the literature.33 The procedures most often reported as having existing SBA privileging programs included central venous catheterization, sedation, laparoscopic surgery, and robotic surgery. The procedures most often reported as having SBA privileging programs in development included GI endoscopy and bronchoscopy. All nineteen procedures surveyed had at least one center with an SBA program either in place or in development. Additional procedures for which centers reported SBA privileging programs included vein graft harvesting and airway management. Reasons for beginning such programs included the desire to develop standards, risk management, patient safety, response or cost reduction related to adverse events, targeting of procedures that occur less frequently, and cost reduction related to training. The majority of centers required all providers who performed the given procedure to participate in the program (57%). Centers collaborated with human resources (17%), risk management/legal (33%), quality improvement/quality assurance (42%), the medical staff office (58%), continuing medical education (42%), and patient safety (25%). A majority of centers utilized their programs for initial privileging of new providers (57%). Top barriers included institutional political support, provider resistance to participation, litigation from providers (i.e., a provider suing the hospital if denied privileges), and simulation center staffing and equipment. Methods used to develop content for SBA privileging programs included internal expert opinion (69%), literature review (54%), and multi-department (46%) or interprofessional (38%) committees.
One center reported that it worked with a credentialing committee; no other methods of choosing criteria were reported. Methods of delivering the SBA privileging program included testing performance on simulation models (77%) and the use of performance-based checklists (62%). None of the centers reported a method for assessing procedural competency. Conclusion: It is clear from the results that simulation centers are developing SBA privileging programs. However, the process is fragmented, as centers are developing programs as single institutions. Further research, publication, and guidance are needed so that simulation centers can learn from each other regarding best practices when delivering SBA privileging programs. References: 1. Achord JL: The credentialing process: Rational decisions of hospital committees for granting of privileges in gastrointestinal endoscopic procedures. American Journal of Gastroenterology 1987; 82(10):1064-1065. 2. AGA policy statement: Hospital credentialing standards for physicians who perform endoscopies. Gastroenterology 1993; 104(5):1563. 3. Aggarwal R, Darzi A: Simulation to enhance patient safety. Chest 2011; 140(4):854-858; doi:10.1378/chest.11-0728. 4. American Society of Anesthesiologists 2006: ASA workgroup on simulation education white paper: ASA approval of anesthesiology simulation programs. ASA web site. http://www.asahq.org/For-Members/Education-and-Events/Simulation-Education.aspx. Accessed July 1, 2013. 5. Castronovo FP: A fluoroscopic credentialing/safety program at a large research hospital. The Radiation Safety Journal 2004; 86(2):S76-S79. 6. Cherry RA, West CE, Hamilton MC, Rafferty CM, Hollenbeak CS, Caputo GM: Reduction of central venous catheter associated blood stream infections following implementation of a resident oversight and credentialing policy. Patient Safety in Surgery 2011; 5(15):1-8. 7. Clemenhagen C: Credentialing procedures need strengthening. Dimensions in Health Service 1986; 63(5):4-5. 8. Cohen MH, Hrbek A, Davis RB, Schacter SC, Eisenberg DM: Emerging credentialing practices, malpractice liability policies, and guidelines governing complementary and alternative medical practices and dietary supplement recommendations. Archives of Internal Medicine 2005; 165:289-295. 9. Connors JJ: Training, competency, and credentialing standards for carotid stenting. Techniques in Vascular and Interventional Radiology 2005; 7:210-214. 10. Connors JJ, Sacks D, Furlan AJ, Selman WR, Russell EJ, Stieg PE, for the NeuroVascular Coalition Writing Group: Training, competency, and credentialing standards for diagnostic cervicocerebral angiography, carotid stenting, and cerebrovascular intervention. Radiology 2005; 234(1):26-34; doi:10.1148/radiol.2341041349. 11. Decker S, Utterback VA, Thomas MB, Mitchell M, Sportsman S: Assessing continued competency through simulation: A call for stringent action. Nursing Education Perspectives 2011; 32(3):120-125. 12. Dent TL: Training, credentialing, and evaluation in laparoscopic surgery. Laparoscopy for the General Surgeon 1992; 72(5):1003-1010. 13. American Society for Gastrointestinal Endoscopy: Methods of granting hospital privileges to perform gastrointestinal endoscopy. Gastrointestinal Endoscopy 2002; 55(7):780-783. 14. Handly N, Ramoska EA: Opportunities to perform trauma procedures in the state of Pennsylvania: Are there sufficient numbers to develop and maintain competency? Academic Emergency Medicine 2011; 18(5):Suppl 1 (S53). 15. Hershey N: Flawed credentialing procedures create potential for hospital liability. Hospital Law Newsletter 2003; 20(3):6-7. 16. Hobson RW 2nd, Howard VJ, Roubin GS, Ferguson RD, Brott TG, Howard G, Sheffet AJ, Roberts J, Hopkins LN, Moore WS: Credentialing of surgeons as interventionalists for carotid artery stenting: Experience from the lead-in phase of CREST. Journal of Vascular Surgery 2004; 40(5):952-957; doi:10.1016/j.jvs.2004.08.039. 17. Holmboe E, Rizzolo MA, Sachdeva AK, Rosenberg M, Ziv A: Simulation-based assessment and the regulation of healthcare professionals. Simulation in Healthcare 2011; 6(7):S58-S62; doi:10.1097/SIH.0b013e3182283bd7. 18. Hospital credentialing standards for physicians who perform endoscopies. Gastroenterology 1993; 104(5):1563. 19. Lampotang S: Computer and web-enabled simulations for anesthesiology training and credentialing. Journal of Critical Care 2008; 23:173-178. 20. Ma I, Zalunardo N, Pachev G, Beran T, Brown M, Hatala R, McLaughlin K: Comparing the use of global rating scale with checklists for the assessment of central venous catheterization skills using simulation. Advances in Health Sciences Education 2012; 17(4):457-470. 21. Marco J, Holmes DR Jr: Simulation: Present and future roles. JACC: Cardiovascular Interventions 2008; 1(5):590-592. 22. Mattern CL: Make your credentialing procedures work for you. Trustee 1979; 32(8):13-15. 23. McGaghie WC, Butter J, Kaye M: Observational assessment, Assessment in Health Professions Education. Edited by Downing SM, Yudkowsky R. New York, Routledge, 2009, pp 185-215. 24. Melnick D: The experience of the National Board of Medical Examiners, Computer-Based Examinations for Board Certification. Edited by Mancall EL, Vashook PG, Dockery LL. Evanston IL, American Board of Medical Specialties, 1996, pp 111-120. 25. Michaelson JD, Manning L: Competency assessment in simulation-based procedural education. American Journal of Surgery 2008; 196(4):609-15. 26. Mislevy R: Evidence-centered design for simulation-based assessment. The National Center for Research on Evaluation, Standards, and Student Testing, CRESST Report 800, 2011. 27. Moss PA, Girard BJ, Haniford LC: Validity in educational assessment, Educational Measurement, 4th edition. Edited by Brennan RL. Review of Research in Education 2006; 30(1):109-162. 28. Priestley S, Babl FE, Krieser D, Law A, Miller J, Spicer M, Tully M: Evaluation of the impact of a paediatric procedural sedation credentialing programme on quality of care. Emergency Medicine Australasia 2006; 18(5-6):498-504. 29. Ramoska EA, Sacchetti AD, Warren TM: Credentialing of emergency physicians: Support for delineation of privileges in invasive procedures. American Journal of Emergency Medicine 1988; 6(3):278-281. 30. Rehrig ST, Powers K, Jones DB: Integrating simulation in surgery as a teaching tool and credentialing standard. Journal of Gastrointestinal Surgery 2008; 12:222-233. 31. Rodriguez-Paz JM, Kennedy M, Salas E, Wu AW, Sexton SB, Hunt EA, Pronovost PJ: Beyond see one, do one, teach one: Toward a different training paradigm. Quality & Safety in Health Care 2009; 18(1):63


Simulation in Healthcare: Journal of the Society for Simulation in Healthcare | 2013

AWARD WINNER (4th Place) Board 311 - Research Abstract: Linking Performance Measurement and Scenario Design: Team Response to Maternal Cardiac Arrest (Submission #407)

Angela Blood; Jennifer M. Banayan; Barbara M. Scavone; Maryam Siddiqui; Ken Nunes

Introduction/Background: Maternal cardiac arrest is a rare but often fatal emergency. The incidence of maternal cardiopulmonary arrest has increased in the United Kingdom, with the latest figures revealing an incidence of 1:20,000, compared to 1:30,000 found just 3 years prior.1 This is consistent with United States data documenting increases in maternal mortality.2 Because of its rarity, most clinicians have little experience in managing such patients,3 and a variety of publications expose concerning deficits among clinicians in their management of such patients.3-7 No accepted tool exists to assess the performance of practitioners managing these patients. Therefore, the authors used a modified Delphi method8 to capture expert judgment and create a checklist of tasks practitioners should perform during the first five minutes of a maternal cardiac arrest. The objective was to create a weighted scoring system that could be used to quantitatively measure team performance during such emergencies and to use the resulting assessment tool as a curriculum blueprint9 to guide simulation scenario design. Methods: After reviewing the literature and pooling internal clinician opinions, the authors created a list of tasks thought to be essential for management of maternal cardiac arrest. The list was distributed to seven recognized experts, including obstetricians, anesthesiologists, and advanced practice nurses, to render judgments regarding appropriate management of maternal cardiac arrest. Within each round, experts ranked tasks on a scale from 0 through 5 (0 as dangerous/inappropriate, 5 as extremely important) and could suggest items to add, delete, or change. Medians of experts' ratings were calculated; consensus was defined a priori as 80% exact agreement. Next, an interprofessional task force was formed to design a maternal arrest simulation, with representatives from each profession included in the scenario and clinical educators experienced in simulation. The task force assigned tasks to team members (i.e., determined which roles were appropriate for obstetricians, anesthesiologists, nurses, etc.). The original simulation scenario was adjusted in accordance with the results of the modified Delphi. Results: The original task list contained 48 tasks. Three rounds were required during the Delphi process to achieve consensus. The final assessment tool consisted of a checklist of 45 tasks. The task force assigned 19 tasks to nurses, 11 to obstetricians, 9 to anesthesiologists, and 6 to be shared by the team (e.g., remaining in the patient room rather than moving to the operating room for the emergent caesarean delivery). Over 20 sessions, a total of 168 learners participated. Participants included 35 obstetric residents and attendings, 53 anesthesia residents and attendings, 67 nurses, 7 operating room surgical technicians, and 6 unit secretarial clerks. Conclusion: The modified Delphi method is a valuable tool for obtaining consensus among experts and was used in this study to identify the appropriate management of maternal cardiac arrest. After numerous modifications, edits, deletions, and additions that improved the authors' original list of tasks, the process resulted in a weighted scoring system that can be used to objectively assess team performance during maternal cardiac arrest. This new weighted scoring system served as a useful curriculum blueprint for simulation team scenario design and can be used by other teams to design simulation scenarios for this emergency. References: 1.
Lewis G: The women who died 2006-2008. BJOG 2011; 118(Suppl 1):30-56. 2. Berg CJ, Callaghan WM, Syverson C, Henderson Z: Pregnancy-related mortality in the United States, 1998-2005. Obstet Gynecol 2010; 116:1302-9. 3. Einav S, Matot I, Berkenstadt H, Bromiker R, Weiniger CF: A survey of labour ward clinicians' knowledge of maternal cardiac arrest and resuscitation. Int J Obstet Anesth 2008; 17:238-42. 4. Berkenstadt H, Ben-Menachem E, Dach R, Ezri T, Ziv A, Rubin O, Keidan I: Deficits in the provision of cardiopulmonary resuscitation during simulated obstetric crises: Results from the Israeli Board of Anesthesiologists. Anesth Analg 2012; 115:1122-6. 5. Cohen SE, Andes LC, Carvalho B: Assessment of knowledge regarding cardiopulmonary resuscitation of pregnant women. Int J Obstet Anesth 2008; 17:20-5. 6. Fisher N, Eisen LA, Bayya JV, Dulu A, Bernstein PS, Merkatz IR, Goffman D: Improved performance of maternal-fetal medicine staff after maternal cardiac arrest simulation-based training. Am J Obstet Gynecol 2011; 205:239.e1-5. 7. Lipman SS, Daniels KI, Carvalho B, Arafeh J, Harney K, Puck A, Cohen SE, Druzin M: Deficits in the provision of cardiopulmonary resuscitation during simulated obstetric crises. Am J Obstet Gynecol 2010; 203:179.e1-5. 8. Clayton MJ: Delphi: A technique to harness expert opinion for critical decision-making tasks in education. Educ Psychol 1997; 17:373-86. 9. Thorndike R, Hagen E: Measurement and Evaluation in Psychology and Education, 2nd edition. 1991. Oxford, England: Wiley. Disclosures: None.


Neurology | 2012

Student assessment by objective structured examination in a neurology clerkship

Rimas V. Lukas; Taiwo Adesoye; Sandy Smith; Angela Blood; James R. Brorson


Surgery | 2016

Teaching professionalism in graduate medical education: What is the role of simulation?

Eisha Wali; Jayant M. Pinto; Melissa Cappaert; Marcie Lambrix; Angela Blood; Elizabeth A. Blair; Stephen D. Small

Collaboration


Dive into Angela Blood's collaboration.

Top Co-Authors

Yoon Soo Park

University of Illinois at Chicago


Dara V. Albert

Nationwide Children's Hospital
