
Publications

Featured research published by J. Lindsey Lane.


Academic Medicine | 2009

Evaluating the Performance of Medical Educators: A Novel Analysis Tool to Demonstrate the Quality and Impact of Educational Activities

Latha Chandran; Maryellen E. Gusic; Constance D. Baldwin; Teri L. Turner; Elisa Zenni; J. Lindsey Lane; Dorene Balmer; Miriam Bar-on; Daniel A. Rauch; Diane Indyk; Larry D. Gruppen

Purpose Traditional promotion standards rely heavily on quantification of research grants and publications in the curriculum vitae. The promotion and retention of educators is challenged by the lack of accepted standards to evaluate the depth, breadth, quality, and impact of educational activities. The authors sought to develop a practical analysis tool for the evaluation of educator portfolios (EPs), based on measurable outcomes that allow reproducible analysis of the quality and impact of educational activities. Method The authors, 10 veteran educators and an external expert evaluator, used a scholarly, iterative consensus-building process to develop the tool and test it using real EPs from educational scholars who followed an EP template. They revised the template in parallel with the analysis tool to ensure that EP data enabled valid and reliable evaluation. The authors created the EP template and analysis tool for scholar and program evaluation in the Educational Scholars Program, a three-year national certification program of the Academic Pediatric Association. Results The analysis tool combines 18 quantitative and 25 qualitative items, with specifications, for objective evaluation of educational activities and scholarship. Conclusions The authors offer this comprehensive, yet practical tool as a method to enhance opportunities for faculty promotions and advancement, based on well-defined and documented educational outcome measures. It is relevant for clinical educators across disciplines and across institutions. Future studies will test the interrater reliability of the tool, using data from EPs written using the revised template.


Teaching and Learning in Medicine | 2006

Development and Evaluation of an Interactive Multimedia Clinical Skills Teaching Program Designed for the Pediatric Clerkship

Anthony J Frisby; J. Lindsey Lane; Anna Marie Carr; Ellen Ross; Ruth P. Gottlieb

Background and Purpose: We evaluated the physical-examination section of a multimedia program developed to teach infant history and physical-examination skills. Methods: A total of 71 students participated: one group viewed only the physical-examination section (PX), one the history section (HX), one none of the program (CX). We assessed physical-examination skills by direct observation of medical students performing an abdominal exam and scored using a checklist at baseline, immediately after intervention, and at the end of the pediatric clerkship. We analyzed results using analysis of variance with repeated measures. Results: Baseline scores were PX = 2.5, HX = 2.8. The PX group scored significantly higher immediately postintervention at 6.8 compared to the HX group (3.1). At the end of the clerkship, significant differences between the groups remained. Final group mean scores were PX = 5.5, HX = 4.4, and CX = 2.7. Conclusion: The program improved examination skills with attenuation over 6 weeks.


Academic Pediatrics | 2011

Observation of Resident Clinical Skills: Outcomes of a Program of Direct Observation in the Continuity Clinic Setting

Ellen K. Hamburger; Sandra Cuzzi; Dale A. Coddington; Angela M. Allevi; Joseph Lopreiato; Rachel Y. Moon; Clifton E. Yu; J. Lindsey Lane

OBJECTIVE To assess the feasibility of a new multi-institutional program of direct observation and report what faculty observed and the feedback they provided. METHODS A program of direct observation of real patient encounters was implemented in 3 pediatric residency programs using a structured clinical observation (SCO) form to document what was observed and the feedback given. Outcome variables included the number of observations made, the nature of the feedback provided, resident attitudes about direct observation before and after implementation, and the response of the faculty. RESULTS Seventy-nine preceptors and 145 residents participated; 320 SCO forms were completed. Faculty provided feedback in 4 areas: content, process of the encounter, patient-centered attitudes and behaviors, and interpersonal skills. Feedback was 85% specific and 41% corrective. Corrective feedback was most frequent for physical examination skills. After program implementation, residents reported an increase in feedback and a decrease in discomfort with direct observation; in addition, they agreed that direct observation was a valuable component of their education. Participation rates among faculty were high. CONCLUSIONS Direct observation using SCOs results in timely and specific feedback to residents about behaviors rarely observed in traditional precepting models. Resident competency in these clinical skill domains is critical for assessing, diagnosing, and managing patients. The SCO methodology is a feasible way to provide formative feedback to residents about their clinical skills.


Academic Medicine | 2013

Assessing Residents' Written Learning Goals and Goal Writing Skill: Validity Evidence for the Learning Goal Scoring Rubric

Tai M. Lockspeiser; Patricia Schmitter; J. Lindsey Lane; Janice L. Hanson; Adam A. Rosenberg; Yoon Soo Park

Purpose To provide validity evidence for use of the Learning Goal Scoring Rubric to assess the quality of written learning goals and residents’ goal writing skills. Method This two-part study used the rubric to assess University of Colorado third-year pediatric residents’ written learning goals to obtain validity evidence. In study 1, five raters independently scored 48 goals written in 2010–2011 and 2011–2012 by 48 residents, who also responded to the Jefferson Scale of Physician Lifelong Learning (JeffSPLL). In study 2, two raters independently scored 48 goals written in 2011–2012 by 12 residents. Intraclass correlation coefficients (ICCs) assessed rater agreement to provide evidence for response process. Generalizability theory assessed internal structure. Independent-samples Mann–Whitney U tests and correlations assessed relationship to other variables. Content was matched to published literature and instructional methods. Results The ICC was 0.71 for the overall rubric. In study 1, where the generalizability study’s (G study’s) object of measurement was learning goals, the phi coefficient was 0.867. In study 2, where the G study’s object of measurement was the resident (goal writing skill), the phi coefficient was 0.751. The total mean score of residents with goal writing training was significantly higher than that of those without (7.54 versus 4.98, P < .001). Correlation between goal quality and JeffSPLL score was not significant. Investigators agreed that the content matched the published literature and instructional methods. Conclusions Preliminary validity evidence indicates that this scoring rubric can assess learning goal quality and goal writing skill.


Academic Pediatrics | 2015

The Referral and Consultation Entrustable Professional Activity: Defining the Components in Order to Develop a Curriculum for Pediatric Residents

Ellen K. Hamburger; J. Lindsey Lane; Dewesh Agrawal; Claire Boogaard; Janice L. Hanson; Jessica Weisz; Mary C. Ottolini

From the Department of Pediatrics, George Washington University, Children’s National Health System, Office of Medical Education, Washington, DC (Dr Hamburger, Dr Agrawal, Dr Boogaard, Dr Weisz, and Dr Ottolini); Departments of Pediatrics (Dr Lane and Dr Hanson), and Family Medicine (Dr Hanson), University of Colorado School of Medicine, Children’s Hospital Colorado, Aurora, Colo The authors declare that they have no conflict of interest. Address correspondence to Ellen K. Hamburger, MD, Department of Pediatrics, George Washington University, Children’s National Medical System, Office of Medical Education, 111 Michigan Ave NW, Washington, DC 20037 (e-mail: [email protected]).


Academic Pediatrics | 2017

Narrative Derived From Medical Student Reflection in Action: Lessons Learned and Implications for Assessment

J. Lindsey Lane; Jennifer B. Soep; Janice L. Hanson

A process and tool that prompts learners to think about and reflect on their clinical performance was implemented. Learner narrative reflections about their work and faculty feedback, both captured in the moment, provided data for decisions about level of performance in a competency-based assessment system.


Academic Pediatrics | 2018

Tools for Learning About the Referral and Consultation Process for Pediatric Residents

Ellen K. Hamburger; Sarah Muradian; Alicia Widge; J. Lindsey Lane; Dewesh Agrawal; Claire Boogaard; Janice L. Hanson; Mary C. Ottolini

Management of referral and consultation is an entrustable professional activity for pediatric residents; however, few tools exist to teach these skills. We designed and implemented tools to prompt discussion, feedback, and reflection about the process of referral, notably including the family's perspective.


Pediatrics | 2000

Structured Clinical Observations: A Method to Teach Clinical Skills With Limited Time and Financial Resources

J. Lindsey Lane; Ruth P. Gottlieb


JAMA | 2001

Documenting and Comparing Medical Students' Clinical Experiences

Susan L. Rattner; Daniel Z. Louis; Carol Rabinowitz; Jonathan E. Gottlieb; Thomas J. Nasca; Fred W. Markham; Ruth P. Gottlieb; John W. Caruso; J. Lindsey Lane; J. Jon Veloski; Mohammadreza Hojat; Joseph S. Gonnella


Ambulatory Pediatrics | 2004

Improving the Interviewing and Self-Assessment Skills of Medical Students: Is it Time to Readopt Videotaping as an Educational Tool?

J. Lindsey Lane; Ruth P. Gottlieb

Collaboration


Explore J. Lindsey Lane's collaborations.

Top Co-Authors

Janice L. Hanson, University of Colorado Denver

Adam A. Rosenberg, University of Colorado Denver

Tai M. Lockspeiser, University of Colorado Denver

Ruth P. Gottlieb, Thomas Jefferson University

Constance D. Baldwin, University of Rochester Medical Center

Ellen K. Hamburger, George Washington University

Patricia Schmitter, University of Colorado Denver

Claire Boogaard, George Washington University