
Publication


Featured research published by Lisa N. Conforti.


Medical Education | 2011

Opening the black box of clinical skills assessment via observation: a conceptual model

Jennifer R. Kogan; Lisa N. Conforti; Elizabeth Bernabeo; William Iobst; Eric S. Holmboe

Medical Education 2011; 45: 1048–1060


Medical Education | 2012

Faculty staff perceptions of feedback to residents after direct observation of clinical skills.

Jennifer R. Kogan; Lisa N. Conforti; Elizabeth Bernabeo; Steven J. Durning; Karen E. Hauer; Eric S. Holmboe

Medical Education 2012; 46: 201–215


Journal of Graduate Medical Education | 2015

Reflections on the First 2 Years of Milestone Implementation.

Eric S. Holmboe; Kenji Yamazaki; Laura Edgar; Lisa N. Conforti; Nicholas Yaghmour; Rebecca S. Miller; Stanley J. Hamstra

The Accreditation Council for Graduate Medical Education (ACGME) and the American Board of Medical Specialties (ABMS) collectively constitute the foundation of professional self-regulation in the United States. In February 1999, the 2 organizations approved 6 general competencies broadly relevant for all medical practice, followed by the official launch of the Outcomes Project in 2001. It was expected that the competencies would be an antidote to overspecification of accreditation standards, and that they would empower programs to create training grounded in meaningful outcomes in a developmental approach.

As many programs can attest, the implementation of outcomes-based (eg, competency-based) medical education has been challenging. One reason has been the difficulty of implementing the competencies in both curriculum and assessment. Program leaders lacked shared mental models within their own training programs, accompanied by a lack of shared understanding nationally within disciplines. It is important to remember that 1 of the thorny problems the milestones were intended to address was the sources of unwanted and unwarranted variability in educational and, by extension, clinical outcomes. In addition, the community cannot improve at scale what cannot be measured, and prior frames and approaches to measurement were insufficient and ineffective. A key goal of the milestones is thus to help improve the state and quality of measurement through better assessment in graduate medical education, to facilitate the improved outcomes everyone desires.

Approximately 10 years ago, conversations began on how to more effectively and meaningfully operationalize the competencies to help improve the design of residency and fellowship programs through the use of a developmental framework. In parallel, the ACGME began to explore mechanisms to move the accreditation system to a focus on outcomes using a continuous quality improvement philosophy. Developmental milestones, which use narratives to describe the professional trajectories of residents, were seen as a way to move the Outcomes Project forward.

Starting in 2007, the disciplines of internal medicine, pediatrics, and surgery began to create developmental milestones for the 6 competencies. Surgery would subsequently delay the development of its milestones, focusing first on the SCORE curriculum. The ACGME began to restructure its accreditation processes in 2009, and soon after, milestone groups were constituted for all specialties. Milestone writing groups were cosponsored by the ACGME and the ABMS member certification boards. Early groups had significant latitude in developing their subcompetencies and milestones; specialties that started the process after 2010 used a standard template. Each milestone set was subjected to review by the educational community in the specialty.

BOX 1 provides an overview of the purposes of the milestones across key stakeholders, and FIGURE 1 provides an example of a key driver diagram of milestones as an educational and clinical intervention. As FIGURE 1 highlights, milestones can potentially trigger a number of drivers, or mechanisms, to help enable changes in residency and fellowship education. In 2013, the milestones were officially launched in 7 core specialties (emergency medicine, internal medicine, neurological surgery, orthopaedic surgery, pediatrics, diagnostic radiology, and urology) as a formative, continuous quality improvement component of the new accreditation system.
The remaining core disciplines and the majority of subspecialties implemented the milestones starting in July 2014. We have now reached an important “milestone” in the implementation process.

DOI: http://dx.doi.org/10.4300/JGME-07-03-43


Journal of Graduate Medical Education | 2013

Early feedback on the use of the internal medicine reporting milestones in assessment of resident performance.

Eva Aagaard; Gregory C. Kane; Lisa N. Conforti; Sarah Hood; Kelly J. Caverzagie; Cynthia D. Smith; Davoren A. Chick; Eric S. Holmboe; William Iobst

BACKGROUND The educational milestones were designed as a criterion-based framework for assessing resident progression on the 6 Accreditation Council for Graduate Medical Education competencies.

OBJECTIVE We obtained feedback on, and assessed the construct validity and perceived feasibility and utility of, draft Internal Medicine Milestones for Patient Care and Systems-Based Practice.

METHODS All participants in our mixed-methods study were members of competency committees in internal medicine residency programs. An initial survey assessed participant and program demographics; focus groups obtained feedback on the draft milestones and explored their perceived utility in resident assessment; and an exit survey elicited input on the value of the draft milestones in resident assessment. Surveys were tabulated using descriptive statistics. Conventional content analysis was used to assess the focus group data.

RESULTS Thirty-four participants from 17 programs completed surveys and participated in 1 of 6 focus groups. Overall, the milestones were perceived as useful in formative and summative assessment of residents. Participants raised concerns about the length and complexity of some draft milestones and suggested specific changes. The focus groups also identified a need for faculty development. In the exit survey, most participants agreed that the Patient Care and Systems-Based Practice milestones would help competency committees assess trainee progress toward independent practice.

CONCLUSIONS The draft reporting milestones for 2 competencies demonstrated significant construct validity in both content and response process, as well as perceived utility for the assessment of resident performance. To ensure success, additional feedback from the internal medicine community and faculty development will be necessary.


American Journal of Medical Quality | 2009

The Impact of a Preventive Cardiology Quality Improvement Intervention on Residents and Clinics: A Qualitative Exploration

Elizabeth Bernabeo; Lisa N. Conforti; Eric S. Holmboe

Teaching and evaluating quality improvement (QI) is one corollary of new competency requirements in practice- and systems-based learning and improvement. This study explored the impact of the Preventive Cardiology Practice Improvement Module (PC-PIM) on residency clinics. Results from 22 clinic interviews indicated merit in using the PC-PIM to teach QI during residency. Many residents reported increased knowledge and confidence, particularly regarding the value of QI. The majority recognized that QI often leads to improved patient care and outcomes, even in resource-poor environments. Conducting aspects of the QI process themselves (eg, chart audit, decision making) led to greater awareness of the patient and systems perspectives. Barriers included a lack of resident buy-in, discontinuity of care, and a lack of institutional support. These findings shed light on how residency clinics engage in QI activities and may aid in the implementation of future QI initiatives in residency more generally. (Am J Med Qual 2009;24:99-107)


Academic Medicine | 2012

Comparative trial of a web-based tool to improve the quality of care provided to older adults in residency clinics: modest success and a tough road ahead.

Eric S. Holmboe; Brian J. Hess; Lisa N. Conforti; Lorna A. Lynn

Purpose To determine whether residency programs can use a multicomponent, Web-based quality improvement tool to improve the care of older adults.

Method The authors conducted an exploratory, cluster-randomized, comparative before–after trial of the Care of the Vulnerable Elderly Practice Improvement Module in the ambulatory clinics of 46 internal medicine and family medicine residency programs, 2006–2008. The main outcomes were the deltas between pre- and post-performance on the Assessing Care of the Vulnerable Elderly (ACOVE) quality measures.

Results Of the 46 programs initially selected for the study, 37 (80%) provided both baseline and follow-up data. Performance on all 10 ACOVE measures was poor at baseline (range 8.6%–33.6%). Intervention clinics most frequently chose fall-risk screening and documentation of end-of-life preferences for improvement. The change in the percentage of patients screened for fall risk in the intervention clinics that targeted this measure was significantly greater than the change observed in the control clinics (+23.3% versus +9.7%, P = .003, odds ratio [OR] = 2.0; 95% confidence interval [CI]: 1.25–3.75), as was the difference observed for documentation of preference for life-sustaining care (+16.4% versus +2.8%, P = .002, OR = 6.3; 95% CI: 2.0–19.6) and surrogate decision maker (+14.3% versus +2.8%, P = .003, OR = 6.8; 95% CI: 1.9–24.4).

Conclusions A multicomponent, Web-based quality improvement tool can help residency programs improve care for older adults, but much work remains to improve the state of care for this population in training settings.


Academic Medicine | 2017

Commitment to Change and Challenges to Implementing Changes After Workplace-based Assessment Rater Training

Jennifer R. Kogan; Lisa N. Conforti; Kenji Yamazaki; William Iobst; Eric S. Holmboe

Purpose Faculty development for clinical faculty who assess trainees is necessary to improve assessment quality and important for competency-based education. Little is known about what faculty plan to do differently after training. This study explored the changes faculty intended to make after workplace-based assessment rater training, their ability to implement change, predictors of change, and barriers encountered.

Method In 2012, 45 outpatient internal medicine faculty preceptors (who supervised residents) from 26 institutions participated in rater training. They completed a commitment to change form listing up to five commitments and ranked (on a 1–5 scale) their motivation for and anticipated difficulty implementing each change. Three months later, participants were interviewed about their ability to implement change and the barriers they encountered. The authors used logistic regression to examine predictors of change.

Results Of 191 total commitments, the most common focused on what faculty would change about their own teaching (57%) and increasing direct observation (31%). Of the 183 commitments for which follow-up data were available, 39% were fully implemented, 40% were partially implemented, and 20% were not implemented. Lack of time/competing priorities was the most commonly cited barrier. Higher initial motivation (odds ratio [OR] 2.02; 95% confidence interval [CI] 1.14, 3.57) predicted change. As anticipated difficulty increased, implementation became less likely (OR 0.67; 95% CI 0.49, 0.93).

Conclusions While higher baseline motivation predicted change, multiple system-level barriers undermined the ability to implement change. Rater-training faculty development programs should address how faculty motivation and organizational barriers interact and influence the ability to change.


Academic Medicine | 2016

Do Faculty Benefit From Participating in a Standardized Patient Assessment as Part of Rater Training? A Qualitative Study.

Lisa N. Conforti; Kathryn M. Ross; Eric S. Holmboe; Jennifer R. Kogan

Purpose To explore faculty’s experience of participating in a standardized patient (SP) assessment in which they were observed and assessed and then received feedback about their own clinical skills, as part of a rater training faculty development program on direct observation.

Method In 2012, 45 general internist teaching faculty from 30 residency programs participated in an eight-station SP assessment with cases covering common clinical scenarios. Twenty-one participants (47%) received verbal feedback from SPs and a performance-based score report. All participants reflected on the experience through an independent written exercise, one-on-one interviews, and a focus group discussion. Grounded theory was used to analyze all three reflections.

Results Eleven participants (24%) had previously completed an SP assessment after training. Most found the SP assessment valuable and experienced emotions that increased their empathy for learners’ experiences of being observed, being assessed, and receiving nonspecific feedback. Participants who received verbal feedback from SPs described different themes around personal improvement plans compared with the nonfeedback group.

Conclusions Faculty experience many of the same emotions as trainees during SP encounters and view SP assessment as a valuable mechanism to improve their own clinical skills and their assessments of trainees. SP assessments may be one approach to giving faculty feedback about the core clinical skills needed in their own patient care as well as those they are expected to teach trainees. Although actual changes in participants’ clinical or assessor skills were not measured (more research is merited), the findings hint at a “dual benefit” from incorporating SP assessment into a faculty development workshop about assessment.


Academic Medicine | 2014

Reconceptualizing variable rater assessments as both an educational and clinical care problem.

Jennifer R. Kogan; Lisa N. Conforti; William Iobst; Eric S. Holmboe


Academic Medicine | 2010

What Drives Faculty Ratings of Residents’ Clinical Skills? The Impact of Faculty’s Own Clinical Skills

Jennifer R. Kogan; Brian J. Hess; Lisa N. Conforti; Eric S. Holmboe

Collaboration


Dive into Lisa N. Conforti's collaborations.

Top Co-Authors

Brian J. Hess
American Board of Internal Medicine

Jennifer R. Kogan
University of Pennsylvania

Lorna A. Lynn
American Board of Internal Medicine

William Iobst
American Board of Internal Medicine

Elizabeth Bernabeo
American Board of Internal Medicine

Kathryn M. Ross
American Board of Internal Medicine

Amy V. Blue
Medical University of South Carolina

Benjamin Chesluk
American Board of Internal Medicine