Viva J. Siddall
Northwestern University
Publications
Featured research published by Viva J. Siddall.
Journal of General Internal Medicine | 2006
Diane B. Wayne; John Butter; Viva J. Siddall; Monica J. Fudala; Leonard D. Wade; Joe Feinglass; William C. McGaghie
BACKGROUND: Internal medicine residents must be competent in advanced cardiac life support (ACLS) for board certification. OBJECTIVE: To use a medical simulator to assess postgraduate year 2 (PGY-2) residents’ baseline proficiency in ACLS scenarios and evaluate the impact of an educational intervention grounded in deliberate practice on skill development to mastery standards. DESIGN: Pretest-posttest design without control group. After baseline evaluation, residents received four 2-hour ACLS education sessions using a medical simulator. Residents were then retested. Residents who did not achieve a research-derived minimum passing score (MPS) on each ACLS problem received additional deliberate practice and were retested until the MPS was reached. PARTICIPANTS: Forty-one PGY-2 internal medicine residents in a university-affiliated program. MEASUREMENTS: Observational checklists based on American Heart Association (AHA) guidelines with interrater and internal consistency reliability estimates; deliberate practice time needed for residents to achieve minimum competency standards; demographics; United States Medical Licensing Examination Step 1 and Step 2 scores; and resident ratings of program quality and utility. RESULTS: Performance improved significantly after simulator training. All residents met or exceeded the mastery competency standard. The amount of practice time needed to reach the MPS was a powerful (negative) predictor of posttest performance. The education program was rated highly. CONCLUSIONS: A curriculum featuring deliberate practice dramatically increased the skills of residents in ACLS scenarios. Residents needed different amounts of training time to achieve minimum competency standards. Residents enjoyed training, evaluation, and feedback in a simulated clinical environment. This mastery learning program and other competency-based efforts illustrate outcome-based medical education that is now prominent in accreditation reform of residency education.
Teaching and Learning in Medicine | 2005
Diane B. Wayne; John Butter; Viva J. Siddall; Monica J. Fudala; Lee A. Lindquist; Joe Feinglass; Leonard D. Wade; William C. McGaghie
Background: Internal medicine residents must be competent in Advanced Cardiac Life Support (ACLS) for board certification. Purpose: The purpose was to use a medical simulator to assess baseline proficiency in ACLS and determine the impact of an intervention on skill development. Method: This was a randomized trial with wait-list controls. After baseline evaluation in all residents, the intervention group received 4 education sessions using a medical simulator. All residents were then retested. After crossover, the wait-list group received the intervention, and residents were tested again. Performance was assessed by comparison to American Heart Association guidelines for treatment of ACLS conditions with interrater and internal consistency reliability estimates. Results: Performance improved significantly after simulator training. No improvement was detected as a function of clinical experience alone. The educational program was rated highly.
Academic Medicine | 2006
Diane B. Wayne; Viva J. Siddall; John Butter; Monica J. Fudala; Leonard D. Wade; Joe Feinglass; William C. McGaghie
Background Internal medicine residents must be competent in Advanced Cardiac Life Support (ACLS) for board certification. Traditional ACLS courses have limited ability to enable residents to achieve and maintain skills. Educational programs featuring reliable measurements and improved retention of skills would be useful for residency education. Method We developed a training program using a medical simulator, small-group teaching and deliberate practice. Residents received traditional ACLS education and subsequently participated in four two-hour educational sessions using the simulator. Resident performance in six simulated ACLS scenarios was assessed using a standardized checklist. Results After the program, resident ACLS skill improved significantly. The cohort was followed prospectively for 14 months and the skills did not decay. Conclusions Use of a simulation-based educational program enabled us to achieve and maintain high levels of resident performance in simulated ACLS events. Given the limitations of traditional methods to train, assess and maintain competence, simulation technology can be a useful adjunct in high-quality ACLS education.
Chest | 2009
William C. McGaghie; Viva J. Siddall; Paul E. Mazmanian; Janet Myers
BACKGROUND Simulation technology is widely used in undergraduate and graduate medical education as well as for personnel training and evaluation in other healthcare professions. Simulation provides safe and effective opportunities for learners at all levels to practice and acquire clinical skills needed for patient care. A growing body of research evidence documents the utility of simulation technology for educating healthcare professionals. However, simulation has not been widely endorsed or used for continuing medical education (CME). METHODS This article reviews and evaluates evidence from studies on simulation technology in undergraduate and graduate medical education and addresses its implications for CME. RESULTS The Agency for Healthcare Research and Quality Evidence Report suggests that simulation training is effective, especially for psychomotor and communication skills, but that the strength of the evidence is low. In another review, the Best Evidence Medical Education collaboration supported the use of simulation technology, focusing on high-fidelity medical simulations under specific conditions. Other studies enumerate best practices that include mastery learning, deliberate practice, and recognition and attention to cultural barriers within the medical profession that present obstacles to wider use of this technology. CONCLUSIONS Simulation technology is a powerful tool for the education of physicians and other healthcare professionals at all levels. Its educational effectiveness depends on informed use for trainees, including providing feedback, engaging learners in deliberate practice, integrating simulation into an overall curriculum, as well as on the instruction and competence of faculty in its use. Medical simulation complements, but does not replace, educational activities based on real patient-care experiences.
Anesthesiology | 2006
Barbara M. Scavone; Michele T. Sproviero; Robert J. McCarthy; Cynthia A. Wong; John T. Sullivan; Viva J. Siddall; Leonard D. Wade
Background: The decrease in the percentage of patients having cesarean delivery during general anesthesia has led some educators to advocate the increased use of simulation-based training for this anesthetic. The authors developed a scoring system to measure resident performance of this anesthetic on the human patient simulator and subjected the system to tests of validity and reliability. Methods: A modified Delphi technique was used to achieve a consensus among several experts regarding a standardized scoring system for evaluating resident performance of general anesthesia for emergency cesarean delivery on the human patient simulator. Eight third-year and eight first-year anesthesiology residents performed the scenario and were videotaped and scored by four attending obstetric anesthesiologists. Results: Third-year residents scored an average of 150.5 points, whereas first-year residents scored an average of 128 points (P = 0.004). The scoring instrument demonstrated high interrater reliability, with an intraclass correlation coefficient of 0.97 (95% confidence interval, 0.94–0.99) compared with the average score. Conclusions: The developed scoring tool to measure resident performance of general anesthesia for emergency cesarean delivery on the patient simulator seems both valid and reliable in the context in which it was tested. This scoring system may prove useful for future studies, such as those investigating the effect of simulator training on objective assessment of resident performance.
Academic Medicine | 2005
Diane B. Wayne; Monica J. Fudala; John Butter; Viva J. Siddall; Joe Feinglass; Leonard D. Wade; William C. McGaghie
Background This study used the Angoff and Hofstee standard-setting methods to derive minimum passing scores for six advanced cardiac life support (ACLS) procedures. Method An expert panel provided item-based (Angoff) and group-based (Hofstee) judgments about six ACLS performance checklists on two occasions separated by ten weeks. Interrater reliabilities and test-retest reliability (stability) of the judgments were calculated. Derived ACLS passing standards are compared to historical ACLS performance data from two groups of ACLS-trained internal medicine residents. Results Both the Angoff and Hofstee standard-setting methods produced reliable and stable data. Hofstee minimum passing scores (MPSs) were uniformly more stringent than Angoff MPSs. Interpretation of historical ACLS performance data from medical residents shows the MPSs derived in this study would yield higher-than-expected failure rates. Conclusion Systematic standard setting for ACLS procedures is a necessary step toward the creation of mastery learning educational programs.
Medical Teacher | 2006
Diane B. Wayne; John Butter; Viva J. Siddall; Monica J. Fudala; Leonard D. Wade; Joe Feinglass; William C. McGaghie
Internal medicine residents in the US must be competent to perform procedures including Advanced Cardiac Life Support (ACLS) to become board-eligible. Our aim was to determine if residents near graduation could assess their skills in ACLS procedures accurately. Participants were 40 residents in a university-based training program. Self-assessments of confidence in managing six ACLS scenarios were measured on a 0 (very low) to 100 (very high) scale. These were compared to reliable observational ratings of residents’ performance on a high-fidelity simulator using published treatment protocols. Residents expressed strong self-confidence about managing the scenarios. Residents’ simulator performance varied widely (range from 45% to 94%). Self-confidence assessments correlated poorly with performance (median r = 0.075). Self-assessment of performance by graduating internal medicine residents was not accurate in this study. The use of self-assessment to document resident competence in procedures such as ACLS is not a proxy for objective evaluation.
Simulation in Healthcare: Journal of the Society for Simulation in Healthcare | 2008
Yue Ming Huang; Jose F. Pliego; Bernadette Henrichs; Mark W. Bowyer; Viva J. Siddall; William C. McGaghie; Daniel B. Raemer
The Society for Simulation in Healthcare convened the second Simulation Education Summit meeting in October 2007 in Chicago, Illinois. The purpose of the Summit was to bring together leaders of public, private, and government organizations, associations, and agencies involved in healthcare education for a focused discussion of standards for simulation-based applications. Sixty-eight participants representing 36 organizations discussed in structured small and large groups the criteria needed for various training and assessment applications using simulation. Although consensus was reached for many topics, there were also areas that required further thought and dialogue. This article is a summary of the results of these discussions along with a preliminary draft of a guideline for simulation-based education.
Ambulatory Pediatrics | 2007
Mark Adler; Jennifer Trainor; Viva J. Siddall; William C. McGaghie
MedEdPORTAL Publications | 2009
Diane B. Wayne; Matthew Nitzberg; Sangeetha Reddy; Rozanna Chester; Leonard D. Wade; Aashish Didwania; John Butter; Viva J. Siddall