Alisa Nagler
Duke University
Publication
Featured research published by Alisa Nagler.
Medical Decision Making | 2014
Cara Ansher; Dan Ariely; Alisa Nagler; Mariah Rudd; Janet Schwartz; Ankoor Shah
Background. American health care is transitioning to electronic physician ordering. These computerized systems are unique because they allow custom order interfaces. Although these systems provide great benefits, there are also potential pitfalls, as the behavioral sciences have shown that the very format of electronic interfaces can influence decision making. The current research specifically examines how defaults in electronic order templates affect physicians’ treatment decisions and medical errors. Methods. Forty-five medical residents completed order sets for 3 medical case studies. Participants were randomly assigned to receive order sets with either “opt-in” defaults (options visible but unselected) or “opt-out” defaults (options visible and preselected). Results compare error rates between conditions and examine the type and severity of errors most often made with opt-in versus opt-out defaults. Results. Opt-out defaults resulted in a greater number of items ordered and specifically increased commission errors (overordering) compared with opt-in defaults. However, while opt-in defaults resulted in fewer orders, they also increased omission errors. When the severity of the errors is taken into account, the default effects seem limited to less severe errors. Conclusion. The defaults used in electronic order sets influence medical treatment decisions when the consequences to a patient’s health are low. This pattern suggests that physicians cognitively override incorrect default choices but only to a point, and it implies tradeoffs that maximize accuracy and minimize cognitive effort. Results indicate that defaults for low-impact items on electronic templates warrant careful attention because physicians are unlikely to override them.
Academic Medicine | 2009
Alisa Nagler; Kathryn M. Andolsek; Jamie S. Padmore
Portfolios have emerged in graduate medical education despite lack of consensus on their definition, purpose, or usefulness. Portfolios can be used as a tool for residents to record their accomplishments, reflect on their experiences, and gain formative feedback. This exercise may help prepare physicians for lifelong learning as well as enhance patient care. The Accreditation Council for Graduate Medical Education has endorsed and may soon require the use of portfolios as an assessment tool to evaluate resident competence. However, using portfolios for summative evaluation purposes, such as making high-stakes decisions on resident promotion or matriculation, may deter resident candidness. In addition, the use of portfolios in clinical settings raises issues unique to the health care setting, such as patient privacy, disclosure of clinical information, and professional liability exposure of physicians. It is not clear that peer-review statutes that sometimes protect educational materials used in teaching and evaluation of residents would also bar disclosure and/or evidentiary use of portfolio contents. Is the teaching institution, resident, or graduate vulnerable to requests and subpoenas for the portfolio contents? If so, then a resident’s documentation of insecurities, suboptimal performance, or bad outcomes would be ripe for discovery in a medical malpractice lawsuit. If embraced too quickly and without sufficient reflection on the nuances of implementation, this well-intentioned initiative may present unintended legal consequences.
BMC Medical Education | 2014
Alisa Nagler; Kathryn M. Andolsek; Mariah Rudd; Richard Sloane; David W. Musick; Lorraine Basnight
Background. Professionalism has been an important tenet of medical education, yet defining it is a challenge. Perceptions of professional behavior may vary by individual, medical specialty, demographic group, and institution. Understanding these differences should help institutions better clarify professionalism expectations and provide standards with which to evaluate resident behavior. Methods. Duke University Hospital and Vidant Medical Center/East Carolina University surveyed entering PGY1 residents. Residents were queried on two issues: their perception of the professionalism of 46 specific behaviors related to training and patient care, and their own participation in those specified behaviors. The study reports data analyses for gender and institution based upon survey results in 2009 and 2010. The study received approval by the Institutional Review Boards of both institutions. Results. 76% (375) of 495 PGY1 residents surveyed in 2009 and 2010 responded. A majority of responders rated all 46 specified behaviors as unprofessional, and a majority had either observed or participated in each behavior. For all 46 behaviors, a greater percentage of women rated the behaviors as unprofessional. Men were more likely than women to have participated in the behaviors. There were several significant differences between institutions in both the perceptions of specified behaviors and in self-reported observation of and/or involvement in those behaviors. Respondents indicated the most important professionalism issues relevant to medical practice include: respect for colleagues/patients, relationships with pharmaceutical companies, balancing home/work life, and admitting mistakes. They reported that professionalism can best be assessed by peers, patients, observation of non-medical work, and timeliness/detail of paperwork. Conclusion. Defining professionalism in measurable terms is a challenge, yet critical if it is to be taught and assessed. Recognition of the differences by gender and institution should allow for tailored teaching and assessment of professionalism so that it is most meaningful. A shared understanding of what constitutes professional behavior is an important first step.
Academic Medicine | 2016
Stephen DeMeo; Alisa Nagler; Mitchell T. Heflin
PROBLEM Health professions education (HPE) has become a core component of the mission of academic health centers (AHCs) nationwide. The volume of HPE research projects being reviewed has increased, presenting new challenges for institutional review boards (IRBs). As HPE research becomes increasingly sophisticated in its design and methods, IRBs and researchers alike have a duty to better understand its unique characteristics. Researchers must be better able to conceptualize and describe their research to IRBs, and IRBs should be able to provide timely review and assure protection of research subjects (or participants). APPROACH The creation of HPE research-specific IRB templates may be one way to improve the interactions between education researchers and IRBs. This report describes the development and early implementation of an HPE research-specific IRB template at Duke University from 2013 to 2014. OUTCOMES Early adopters have noted increased ease of preparation and submission, while IRB staff have reported improved proposal clarity and more attention to protecting learners as research participants. Focus during educational or training sessions about the new template has shifted from merely a description of the new submission process to a more comprehensive education series that includes discussion of regulatory definitions, examination of case studies, and opportunity for audience feedback. NEXT STEPS Continued collection of quantitative and qualitative data regarding the implementation of this IRB template will help its developers more precisely describe its effects on HPE research projects. Formalizing and streamlining the interactions between HPE researchers and IRBs is an important goal for all AHCs.
Medical Teacher | 2010
Alisa Nagler; Kathryn M. Andolsek; Kristin L. Dossary; Joanne Schlueter; Kevin A. Schulman
Duke University Hospital Office of Graduate Medical Education and Duke University’s Fuqua School of Business collaborated to offer a Health Policy lecture series to residents and fellows across the institution, addressing the “Systems-based Practice” competency. During the first year, content was offered in two formats: live lecture and web/podcast. Participants could elect the modality that was most convenient for them. In Year Two, the format was changed so that all content was web/podcast and a quarterly live panel discussion was led by module presenters or content experts. Lecture evaluations, qualitative focus group feedback, and post-test data were analyzed. A total of 77 residents and fellows from 8 (of 12) Duke Graduate Medical Education departments participated. In the first year, post-test results were the same for those who attended the live lectures and those who participated via web/podcast. A greater number of individuals participated in Year Two. Participants from both years expressed the need for health policy content in their training programs. Participants in both years valued a hybrid format for content delivery, recognizing a desire for live interaction with the convenience of accessing web/podcasts at times and locations convenient for them. A positive unintended consequence of the project was participant networking with residents and fellows from other specialties.
Academic Medicine | 2013
Kathryn M. Andolsek; Gwendolyn Murphy; Alisa Nagler; Peggy R. Moore; Joanne Schlueter; John L. Weinerth; Michael S. Cuffe; Victor J. Dzau
The Duke Medicine Graduate Medical Education Quasi-Endowment, established in 2006, provides infrastructure support and encourages educational innovation. The authors describe Duke’s experience with the “grassroots innovation” part of the fund, the Duke Innovation Fund, and discuss the Innovation Fund’s processes for application, review, and implementation, and also outcomes, impact, and intended and unintended consequences. In the five years of the Innovation Fund described (2007–2011), 105 projects have been submitted, and 78 have been funded. Thirty-seven projects have been completed. Approved funding ranged from $2,363 to $348,750, with an average award of $66,391. This represents 42% of funding originally requested. Funding could be requested for a period of 6 months to 3 years. The average duration of projects was 27 months, with a range from 6 months to 36 months. Eighty percent of projects were completed on time. Two projects were closed because of lack of progress and failure to adhere to reporting requirements. Thirty-nine are ongoing. Program directors report great success in meeting project outcomes and concrete impacts on resident and faculty attitudes and performance. Ninety-two percent report that their projects would have never been accomplished without this funding. Projects have resulted in at least 68 posters, abstracts, and peer-reviewed presentations. At least 12 peer-reviewed manuscripts were published. There has been tremendous diversity of projects; all 13 clinical departments have been represented. Interdepartmental and intradepartmental program cooperation has increased. This modest seed money has resulted in demonstrable sustainable impacts on teaching and learning, and increased morale and scholarly recognition.
Academic Medicine | 2017
Luba Dumenco; Deborah L. Engle; Kristen H. Goodell; Alisa Nagler; Robin K. Ovitsh; Shari A. Whicker
Journal of Graduate Medical Education | 2015
Rebecca D. Blanchard; Alisa Nagler; Anthony R. Artino
Perspectives on medical education | 2018
Lara Varpio; Alisa Nagler
Journal for nurses in professional development | 2017
Pamela B. Edwards; Jean B. Rea; Marilyn H. Oermann; Ellen J. Hegarty; Judy Prewitt; Mariah Rudd; Susan G. Silva; Alisa Nagler; David Turner; Stephen DeMeo
After participating in a group peer-review exercise at a workshop presented by Academic Medicine and MedEdPORTAL editors at the 2015 Association of American Medical Colleges Medical Education Meeting, the authors realized that the way their work group reviewed a manuscript was very different from the way each of them would have reviewed the paper as an individual. Further, the group peer-review process yielded more robust feedback for the manuscript’s authors than did the traditional individual peer-review process. This realization motivated the authors to reconvene and collaborate to write this Commentary to share their experience and propose the expanded use of group peer review in medical education scholarship. The authors consider the benefits of a peer-review process for reviewers, including learning how to improve their own manuscripts. They suggest that the benefits of a team review model may be similar to those of teamwork and team-based learning in medicine and medical education. They call for research to investigate this, to provide evidence to support group review, and to determine whether specific paper types would benefit most from team review (e.g., particularly complex manuscripts, those receiving widely disparate initial individual reviews). In addition, the authors propose ways in which a team-based approach to peer review could be expanded by journals and institutions. They believe that exploring the use of group peer review potentially could create a new methodology for skill development in research and scholarly writing and could enhance the quality of medical education scholarship.