Publication


Featured research published by Benjamin Chesluk.


Medical Teacher | 2011

Feedback data sources that inform physician self-assessment

Jocelyn Lockyer; Heather Armson; Benjamin Chesluk; Timothy Dornan; Eric S. Holmboe; Elaine Loney; Karen Mann; Joan Sargeant

Background: Self-assessment is a process of interpreting data about one's performance and comparing it to explicit or implicit standards. Aim: To examine the external data sources physicians used to monitor themselves. Methods: Focus groups were conducted with physicians who participated in three practice improvement activities: a multisource feedback program; a program providing patient and chart audit data; and practice-based learning groups. We used grounded theory strategies to understand the external sources that stimulated self-assessment and how they worked. Results: Data from seven focus groups (49 physicians) were analyzed. Physicians used information from structured programs, other educational activities, professional colleagues, and patients. Data were of varying quality, often from non-formal sources with implicit (not explicit) standards. Mandatory programs elicited variable responses, whereas data and activities the physicians selected themselves were more likely to be accepted. Physicians used the information to create a reference point against which they could weigh their performance, using it variably depending on their personal interpretation of its accuracy, applicability, and utility. Conclusions: Physicians use and interpret data and standards of varying quality to inform self-assessment. Physicians may benefit from regular and routine feedback, and from guidance on how to seek out data for self-assessment.


Advances in Health Sciences Education | 2013

The utility of vignettes to stimulate reflection on professionalism: theory and practice

Elizabeth Bernabeo; Eric S. Holmboe; Kathryn M. Ross; Benjamin Chesluk; Shiphra Ginsburg

Professionalism remains a substantive theme in the medical literature. There is an emerging emphasis on sociological and complex adaptive systems perspectives that refocuses attention from the individual role alone to working within one's system to enact professionalism in practice. Reflecting on responses to professional dilemmas may be one method to help practicing physicians identify both internal and external factors contributing to (un)professional behavior. We present a rationale and theoretical framework that supports and guides a reflective approach to the self-assessment of professionalism. Guided by principles grounded in this theoretical framework, we developed and piloted a set of vignettes on professionally challenging situations, designed to stimulate reflection in practicing physicians. Findings show that participants found the vignettes authentic and typical, and reported that the group experience facilitated discussion of professional ambiguity. Providing an opportunity for physicians to reflect on professional behavior in an open and safe forum may be a practical way to guide physicians to assess themselves on professional behavior and engage with the complexities of their work. The finding that the focus groups led to reflection at a group level suggests that effective reflection on professional behavior may require a socially interactive process. Emphasizing both the behaviors and the internal and external context in which they occur can thus be viewed as critically important for understanding professionalism in practicing physicians.


Academic Medicine | 2015

Reviewing Residents' Competence: A Qualitative Study of the Role of Clinical Competency Committees in Performance Assessment

Karen E. Hauer; Benjamin Chesluk; William Iobst; Eric S. Holmboe; Robert B. Baron; Christy Boscardin; Olle ten Cate; Patricia O'Sullivan

Purpose: Clinical competency committees (CCCs) are now required in graduate medical education. This study examined how residency programs understand and operationalize this mandate for resident performance review. Method: In 2013, the investigators conducted semistructured interviews with 34 residency program directors at five public institutions in California, asking about each institution's CCCs and resident performance review processes. They used conventional content analysis to identify major themes from the verbatim interview transcripts. Results: The purpose of resident performance review at all institutions was oriented toward one of two paradigms: a problem identification model, which predominated, or a developmental model. The problem identification model, which focused on identifying and addressing performance concerns, used performance data such as red-flag alerts and informal information shared with program directors to identify struggling residents. In the developmental model, the timely acquisition and synthesis of data to inform each resident's developmental trajectory was challenging. Participants highly valued CCC members' expertise as educators to corroborate the identification of struggling residents and to enhance the credibility of the committee's outcomes. Training in applying the milestones to the CCC's work was minimal. Participants were highly committed to performance review and perceived the current process as adequate for struggling residents but potentially not for others. Conclusions: Institutions orient resident performance review toward problem identification; a developmental approach is uncommon. Clarifying the purpose of resident performance review and employing efficient information systems that synthesize performance data and engage residents and faculty in purposeful feedback discussions could enable the meaningful implementation of milestones-based assessment.


Journal of Continuing Education in the Health Professions | 2015

Assessing Interprofessional Teamwork: Pilot Test of a New Assessment Module for Practicing Physicians

Benjamin Chesluk; Siddharta Reddy; Brian J. Hess; Elizabeth Bernabeo; Lorna A. Lynn; Eric S. Holmboe

Introduction: Teamwork is a basic component of all health care, and substantial research links the quality of teamwork to safety and quality of care. The TEAM (Teamwork Effectiveness Assessment Module) is a new Web-based teamwork assessment module for practicing hospital physicians. The module combines self-assessment, multisource feedback from members of other professions and specialties with whom the physician works, and a structured review of those data with a peer to develop an improvement plan. Methods: We conducted a pilot test of this module with hospitalist physicians to evaluate its feasibility and usefulness in practice, focusing on these specific questions: Would physicians in hospitals of different types and sizes be able to use the module? Would the providers identified as raters respond to the request for feedback? Would the physicians be able to identify one or more “trusted peers” to help analyze the feedback? And how would physicians experience the module process overall? Results: 20 of 25 physicians who initially volunteered for the pilot completed all steps of the TEAM, including identifying interprofessional teammates, soliciting feedback from their team, and identifying a peer to help review the data. Module users described the feedback they received as helpful and actionable, and indicated that this was information they would not otherwise have received. Conclusions: The results suggest that a module combining self-assessment, multisource feedback, and a guided process for interpreting these data can help practicing hospital physicians understand and potentially improve their interprofessional teamwork skills and behaviors.


Journal of Interprofessional Care | 2018

The intersection of professionalism and interprofessional care: development and initial testing of the interprofessional professionalism assessment (IPA)

Jody S. Frost; Dana P. Hammer; Loretta M. Nunez; Jennifer L. Adams; Benjamin Chesluk; Catherine L. Grus; Neil Harvison; Kathy McGuinn; Luke Mortensen; John H. Nishimoto; Anthony Palatta; Margaret Richmond; Elisabeth J. Ross; John H. Tegzes; Alexis L. Ruffin; John P. Bentley

Valid assessment of interprofessional education and collaborative practice (IPECP) is challenging. However, the number of instruments that measure various aspects of IPECP in various sites is growing. The Interprofessional Professionalism Assessment (IPA) measures observable behaviors of health care professionals-in-training that demonstrate professionalism and collaboration when working with other health care providers in the context of people-centered care. The IPA instrument was created by the Interprofessional Professionalism Collaborative (IPC), a national group representing 12 entry-level health professions and one medical education assessment organization. The instrument was created and evaluated over several years through a comprehensive, multi-phase process: 1) development of the construct and observable behaviors; 2) instrument design, expert review, and cognitive interviews; and 3) psychometric testing. The IPA contains 26 items representing six domains of professionalism (altruism and caring, excellence, ethics, respect, communication, accountability), and was tested by 233 preceptors rating health profession learners in the final year of their practical training. These preceptors represented 30 different academic institutions across the U.S., worked in various types of practice sites, and evaluated learners representing 10 different entry-level health professions. Exploratory factor analysis suggested four factors (communication, respect, excellence, altruism and caring) using the 21 items with the least amount of missing data, and confirmed, for the most part, a priori expectations. Internal consistency reliability coefficients for the entire instrument and its four subscales were high (all greater than 0.9). Psychometric results demonstrate aspects of the IPA’s reliability and validity and support its use across multiple health professions and in various practice sites.


Health Affairs | 2010

How Teams Work—Or Don’t—In Primary Care: A Field Study On Internal Medicine Practices

Benjamin Chesluk; Eric S. Holmboe


Health Affairs | 2012

A New Tool To Give Hospitalists Feedback To Improve Interprofessional Teamwork And Advance Patient Care

Benjamin Chesluk; Elizabeth Bernabeo; Brian J. Hess; Lorna A. Lynn; Siddharta Reddy; Eric S. Holmboe


Journal of Graduate Medical Education | 2016

Ensuring Resident Competence: A Narrative Review of the Literature on Group Decision Making to Inform the Work of Clinical Competency Committees

Karen E. Hauer; Olle ten Cate; Christy Boscardin; William Iobst; Eric S. Holmboe; Benjamin Chesluk; Robert B. Baron; Patricia O'Sullivan


Journal of Allied Health | 2015

Assessment and evaluation in interprofessional education: exploring the field.

Amy V. Blue; Benjamin Chesluk; Lisa N. Conforti; Eric S. Holmboe


Journal of Health Organisation and Management | 2015

How hospitalists work to pull healthcare teams together.

Benjamin Chesluk; Elizabeth Bernabeo; Siddharta Reddy; Lorna A. Lynn; Brian J. Hess; Thor Odhner; Eric S. Holmboe

Collaboration


Dive into Benjamin Chesluk's collaborations.

Top Co-Authors

Eric S. Holmboe
American Board of Internal Medicine

Elizabeth Bernabeo
American Board of Internal Medicine

Lorna A. Lynn
American Board of Internal Medicine

Brian J. Hess
American Board of Internal Medicine

Siddharta Reddy
American Board of Internal Medicine

Karen E. Hauer
University of California

William Iobst
American Board of Internal Medicine