Kathryn M. Andolsek
Duke University
Publications
Featured research published by Kathryn M. Andolsek.
Academic Medicine | 2012
Christopher M. Derienzo; Karen S. Frush; Michael E. Barfield; Priya R. Gopwani; Brian Griffith; Xiaoyin Sara Jiang; Ankit I. Mehta; Paulie Papavassiliou; Kristy L. Rialon; Alyssa Stephany; Tian Zhang; Kathryn M. Andolsek
With changes in the Accreditation Council for Graduate Medical Education (ACGME) Common Program Requirements related to transitions in care effective July 1, 2011, sponsoring institutions and training programs must develop a common structure for transitions in care as well as comprehensive curricula to teach and evaluate patient handoffs. In response to these changes, within the Duke University Health System, the resident-led Graduate Medical Education Patient Safety and Quality Council performed a focused review of the handoffs literature and developed a plan for comprehensive handoff education and evaluation for residents and fellows at Duke. The authors present the results of their focused review, concentrating on the three areas of new ACGME expectations (structure, education, and evaluation), and describe how their findings informed the broader initiative to comprehensively address transitions in care managed by residents and fellows. The process of developing both institution-level and program-level initiatives is reviewed, including the development of an interdisciplinary minimal data set for handoff core content, training and education programs, and an evaluation strategy. The authors believe the final plan fully addresses both Duke's internal goals and the revised ACGME Common Program Requirements and may serve as a model for other institutions seeking to comprehensively address transitions in care and to incorporate resident and fellow leadership into a broad, health-system-level quality improvement initiative.
Critical Care Medicine | 2009
Saumil M. Chudgar; Christopher E. Cox; Loretta G. Que; Kathryn M. Andolsek; Nancy W. Knudsen; Alison S. Clay
Objective: To determine the impact of the Accreditation Council for Graduate Medical Education mandates for duty hours and competencies on instruction, evaluation, and patient care in intensive care units in the United States. Design: A Web-based survey was designed to determine the current methods of teaching and evaluation in the intensive care unit, barriers to changing methods of teaching and evaluation, and the impact of Accreditation Council for Graduate Medical Education regulations on teaching and patient care. Setting: An anonymous Web-based survey was used; cumulative data were analyzed. Subjects: A total of 125 of 380 program directors (33%) for pediatric critical care, pulmonary critical care, anesthesiology critical care, and surgery critical care fellowship programs completed questionnaires. Measurements and Main Results: Bedside case-based teaching and standardized lectures are the most common methods of education in the intensive care unit. Patient safety and resident demands are the two factors most likely to result in changes in instruction in the intensive care unit. Barriers to changes in education include clinical workload and lack of protected time and funding. Younger respondents viewed influences to change differently than older respondents did. Respondents felt that neither education nor patient care had improved as a result of the Accreditation Council for Graduate Medical Education mandates. Conclusions: Medical education teaching methods and assessment in the intensive care unit have changed little since the initiation of the Accreditation Council for Graduate Medical Education regulations, despite respondents' self-reported willingness to change. Instead, the Accreditation Council for Graduate Medical Education regulations are thought to have negatively affected resident attitudes, continuity of care, and even availability for teaching. These concerns, coupled with a lack of protected time and funding, serve as barriers to change in critical care graduate medical education.
Journal of Continuing Education in The Health Professions | 2006
Linda Casebeer; Kathryn M. Andolsek; Maziar Abdolrasulnia; Joseph S. Green; Norman W. Weissman; Erica R. Pryor; Shimin Zheng; Thomas Terndrup
Introduction: Much of the international community has an increased awareness of potential biologic, chemical, and nuclear threats and the need for physicians to rapidly acquire new knowledge and skills in order to protect the public's health. The present study evaluated the educational effectiveness of an online bioterrorism continuing medical education (CME) activity designed to address clinical issues involving suspected bioterrorism and reporting procedures in the United States. Methods: This was a retrospective survey of physicians who had completed an online CME activity on bioterrorism, compared with a nonparticipant group who had completed at least 1 unrelated online CME course from the same medical school Web site and were matched on similar characteristics. An online survey instrument was developed to assess clinical and systems knowledge and confidence in recognition of illnesses associated with a potential bioterrorism attack. A power calculation indicated that a sample size of 100 (50 in each group) would achieve 90% power to detect a 10% to 15% difference in test scores between the two groups. Results: Participants correctly diagnosed anthrax (p = .01) and viral exanthem (p = .01), but not smallpox, more frequently than nonparticipants. Participants also knew more frequently than nonparticipants whom to contact regarding a potential bioterrorism event (p = .03). Participants were more confident than nonparticipants about finding information to guide diagnoses of patients with biologic exposure (p = .01), chemical exposure (p = .02), and radiation exposure (p = .04). Discussion: For practicing physicians, an online bioterrorism course shows promise as an educational intervention for improving diagnosis of emerging rare infections, including those that may be associated with a bioterrorist event, for increasing confidence in diagnosing these infections, and for improving reporting of such infections.
Academic Medicine | 2009
Alisa Nagler; Kathryn M. Andolsek; Jamie S. Padmore
Portfolios have emerged in graduate medical education despite a lack of consensus on their definition, purpose, or usefulness. Portfolios can be used as a tool for residents to record their accomplishments, reflect on their experiences, and gain formative feedback. This exercise may help prepare physicians for lifelong learning as well as enhance patient care. The Accreditation Council for Graduate Medical Education has endorsed and may soon require the use of portfolios as an assessment tool to evaluate resident competence. However, using portfolios for summative evaluation purposes, such as making high-stakes decisions on resident promotion or matriculation, may deter resident candidness. In addition, the use of portfolios in clinical settings raises issues unique to the health care setting, such as patient privacy, disclosure of clinical information, and professional liability exposure of physicians. It is not clear that peer-review statutes that sometimes protect educational materials used in teaching and evaluation of residents would also bar disclosure and/or evidentiary use of portfolio contents. Is the teaching institution, resident, or graduate vulnerable to requests and subpoenas for the portfolio contents? If so, then a resident's documentation of insecurities, suboptimal performance, or bad outcomes would be ripe for discovery in a medical malpractice lawsuit. If embraced too quickly and without sufficient reflection on the nuances of implementation, this well-intentioned initiative may present unintended legal consequences.
Academic Medicine | 2014
Marie Caulfield; Kathryn M. Andolsek; Douglas Grbic; Lindsay Brewer Roskovensky
Purpose: To examine the psychometric adequacy of a tolerance for ambiguity (TFA) scale for use with medical students, and to examine the relationship of TFA to a variety of demographic and personal variables in a national sample of entering U.S. medical students. Method: The authors used data from the 2013 Association of American Medical Colleges Matriculating Student Questionnaire, which included questions on TFA for the first time that year. Data from 13,867 entering medical students were analyzed to examine the psychometric properties of the TFA scale. In addition, the relationships of TFA to sex, age, perceived stress, and desire to work in an underserved area were analyzed. Finally, the relationship of TFA to specialty preference was examined. Results: The TFA scale was found to be psychometrically adequate for use in a medical student population. TFA was found to be higher in men and in older students. Lower TFA was associated with higher perceived stress levels. Students with higher TFA were more likely to express a desire to work in an underserved area. Different levels of TFA may be associated with certain specialty preferences. Conclusions: These findings support the assessment of TFA to understand how this personal characteristic may interact with the medical school experience and with specialty choice. Longitudinal work in this area will be critical to increase this understanding.
Medical Teacher | 2007
Alison S. Clay; Emil R. Petrusa; M. Harker; Kathryn M. Andolsek
Background: This article illustrates the creation of a specialty-specific portfolio that can be used by several different residency programs to document resident competence during a given rotation. Methods: Three different disciplines (anesthesiology, surgery, and medicine) worked together to create a critical care medicine portfolio. We began by reviewing the curriculum requirements for critical care medicine and organized these requirements into the six ACGME core competencies. We then developed learner-led exercises in each core competency that were specific to critical care. Each exercise includes an assessment of resident knowledge and application, an evaluation of the exercise, a learner self-assessment of skill, and a review of performance by a faculty member. Portfolio entries are highlighted in a multi-disciplinary weekly conference and posted on a critical care web site at our university. Conclusions: Creation of a specialty-specific portfolio reduces redundancy between disciplines, allows increased time to be spent on the development of exercises specific to rotation objectives, and aids program directors in the collection of portfolio entries for each resident over the course of a residency.
International Journal of Medical Education | 2016
Kendall E. Bradley; Kathryn M. Andolsek
Objectives: To pilot test whether Orthopaedic Surgery residents could self-assess their performance using newly created milestones, as defined by the Accreditation Council for Graduate Medical Education. Methods: In June 2012, an email was sent to Program Directors and administrative coordinators of the 154 accredited Orthopaedic Surgery programs, asking them to send their residents a link to an online survey. The survey was adapted from the Orthopaedic Surgery Milestone Project. Completed surveys were aggregated in an anonymous, confidential database. SAS 9.3 was used to perform the analyses. Results: Responses from 71 residents were analyzed. First and second year residents indicated through self-assessment that they had substantially achieved Level 1 and Level 2 milestones. Third year residents reported they had substantially achieved 30/41 Level 3 milestones, and fourth year residents, all Level 3 milestones. Fifth year (graduating) residents reported they had substantially achieved 17 Level 4 milestones and were extremely close on another 15. No milestone was rated at Level 5, the maximum possible. Earlier in training, Patient Care and Medical Knowledge milestones were rated lower than the milestones reflecting the other four competencies of Practice Based Learning and Improvement, Systems Based Practice, Professionalism, and Interpersonal Communication. The gap was closed by the fourth year. Conclusions: Residents were able to successfully self-assess using the 41 Orthopaedic Surgery milestones. Respondents' self-assessments reflected improved proficiency over time. Graduating residents reported they had substantially, or close to substantially, achieved all Level 4 milestones. Milestone self-assessment may be a useful tool as one component of a program's overall performance assessment strategy.
BMC Medical Education | 2014
Alisa Nagler; Kathryn M. Andolsek; Mariah Rudd; Richard Sloane; David W. Musick; Lorraine Basnight
Background: Professionalism has been an important tenet of medical education, yet defining it is a challenge. Perceptions of professional behavior may vary by individual, medical specialty, demographic group, and institution. Understanding these differences should help institutions better clarify professionalism expectations and provide standards with which to evaluate resident behavior. Methods: Duke University Hospital and Vidant Medical Center/East Carolina University surveyed entering PGY1 residents. Residents were queried on two issues: their perception of the professionalism of 46 specific behaviors related to training and patient care, and their own participation in those specified behaviors. The study reports data analyses by gender and institution based upon survey results in 2009 and 2010. The study received approval from the Institutional Review Boards of both institutions. Results: 76% (375) of the 495 PGY1 residents surveyed in 2009 and 2010 responded. A majority of responders rated all 46 specified behaviors as unprofessional, and a majority had either observed or participated in each behavior. For all 46 behaviors, a greater percentage of women rated the behaviors as unprofessional. Men were more likely than women to have participated in the behaviors. There were several significant differences between institutions in both the perceptions of specified behaviors and in self-reported observation of and/or involvement in those behaviors. Respondents indicated that the most important professionalism issues relevant to medical practice include respect for colleagues/patients, relationships with pharmaceutical companies, balancing home/work life, and admitting mistakes. They reported that professionalism can best be assessed by peers, patients, observation of non-medical work, and timeliness/detail of paperwork. Conclusion: Defining professionalism in measurable terms is a challenge, yet critical in order for it to be taught and assessed. Recognition of the differences by gender and institution should allow for tailored teaching and assessment of professionalism so that it is most meaningful. A shared understanding of what constitutes professional behavior is an important first step.
Journal of Graduate Medical Education | 2014
Gail M. Sullivan; Deborah Simpson; David A. Cook; Nicole M. DeIorio; Kathryn M. Andolsek; Lawrence Opas; Ingrid Philibert; Lalena M. Yarris
BACKGROUND: Despite an explosion of medical education research and publications, it is not known how medical educator consumers decide what to read or apply in their practice. OBJECTIVE: To determine how consumers of medical education research define quality and value. METHODS: Journal of Graduate Medical Education editors performed a literature search to identify articles on medical education research quality published between 2000 and 2013, surveyed medical educators for their criteria for judging quality, and led a consensus-building workshop at a 2013 Association of American Medical Colleges meeting to further explore how users defined quality in education research. The workshop used standard consensus-building techniques to reach concept saturation. Attendees then voted for the 3 concepts they valued most in medical education research. RESULTS: The 110 survey responses generated a list of 37 overlapping features in 10 categories considered important aspects of quality. The literature search yielded 27 articles, including quality indexes, systematic and narrative reviews, and commentaries. Thirty-two participants, 12 facilitators, and 1 expert observer attended the workshop. Participants endorsed the following features of education research as being most valuable: (1) was provocative, novel, or challenged established thinking; (2) adhered to sound research principles; (3) was relevant to practice, role, or needs; (4) was feasible and practical to apply in real-world settings; and (5) connected to a conceptual framework. CONCLUSIONS: Medical educators placed high value on rigorous methods and conceptual frameworks, consistent with published quality indexes. They also valued innovative or provocative work, feasibility, and applicability to their setting. End-user opinions of quality may illuminate how educators translate knowledge into practice.
Academic Medicine | 2016
Kathryn M. Andolsek
The Medical Student Performance Evaluation (MSPE) was introduced as a refinement of the prior “dean’s letter” to provide residency program directors with a standardized comprehensive assessment of a medical student’s performance throughout medical school. The author argues that, although the MSPE was created with good intentions, many have questioned its efficacy in predicting performance during residency. The author asserts that, despite decades of use and some acknowledged improvement, the MSPE remains a suboptimal tool for informing program directors’ decisions about which applicants to interview and rank. In the current approach to MSPEs, there may even be some inherent conflicts of interest that cannot be overcome. In January 2015, an MSPE Task Force was created to review the MSPE over three years and recommend changes to its next iteration. The author believes, however, that expanding this collaborative effort between undergraduate and graduate medical education and other stakeholders could optimize the MSPE’s standardization and transparency. The author offers six recommendations for achieving this goal: developing a truly standardized MSPE template; improving faculty accountability in student assessment; enhancing transparency in the MSPE; reconsidering the authorship responsibility of the MSPE; including assessment of compliance with administrative tasks and peer assessments in student evaluations; and embracing milestones for evaluation of medical student performance.