Jeffrey S. LaRochelle
Uniformed Services University of the Health Sciences
Publications
Featured research published by Jeffrey S. LaRochelle.
Teaching and Learning in Medicine | 2012
Steven J. Durning; Jeffrey S. LaRochelle; Louis N. Pangaro; Anthony R. Artino; John R. Boulet; Cees van der Vleuten; Paul A. Hemmer; Dodd Denton; Lambert Schuwirth
Background: Educational theories predict conflicting results for the effect of increasing the authenticity of the teaching format of complex information on educational outcomes. We sought to determine the effect of an increasingly authentic small-group, preclerkship teaching format on clerkship outcomes to further inform this debate. Summary: Students enrolled in a prospective randomized crossover trial that involved three content areas. For each content area, three teaching formats were tested. Participants were randomized to teaching format by content area. Clerkship outcomes were performance on an objective structured clinical examination, a DVD exam, internal medicine clerkship grades, and performance on the subject examination. The data were analyzed using a multivariate analysis of covariance. One hundred thirty-three (78%) students participated. Teaching format did not have a statistically significant effect on any of the specified clerkship outcomes. However, the number of patients seen was significantly associated with higher scores on the respective outcomes by topic. Conclusions: Second-year teaching format did not directly influence subsequent clerkship performance. Our study adds to the literature by demonstrating that the authenticity of preclinical teaching format does not appear to matter for clerkship performance; however, the number of actual patients seen does appear to influence related clerkship outcomes.
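The abstract names the analysis (a multivariate analysis of covariance) but not its setup. Below is a minimal Python sketch of an analysis of that shape using statsmodels, on synthetic stand-in data; all column names and values are hypothetical, not the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Synthetic stand-in data: one row per student. Column names are
# hypothetical; the study's actual variables are only described in the abstract.
rng = np.random.default_rng(0)
n = 133
df = pd.DataFrame({
    "teaching_format": rng.choice(["paper", "dvd", "standardized_patient"], size=n),
    "patients_seen": rng.poisson(8, size=n),
    "osce": rng.normal(75, 8, size=n),
    "dvd_exam": rng.normal(70, 10, size=n),
    "clerkship_grade": rng.normal(85, 5, size=n),
    "subject_exam": rng.normal(72, 9, size=n),
})

# Multivariate analysis of covariance: the four clerkship outcomes modeled
# jointly against teaching format, with number of patients seen as a covariate.
model = MANOVA.from_formula(
    "osce + dvd_exam + clerkship_grade + subject_exam"
    " ~ C(teaching_format) + patients_seen",
    data=df,
)
print(model.mv_test())
```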
Academic Medicine | 2012
Jeffrey S. LaRochelle; Steven J. Durning; Louis N. Pangaro; Anthony R. Artino; Cees van der Vleuten; Lambert Schuwirth
Purpose: To address whether increasingly authentic instructional formats are more effective in improving preclerkship medical students’ performance. Method: From 2007 to 2009, the authors conducted a prospective, randomized, crossover study with second-year medical students in a clinical reasoning course at the Uniformed Services University of the Health Sciences. The authors randomly assigned students to one of three cohorts and used instructional formats of differing authenticity (paper, DVD, standardized patient) to teach three subject areas (abdominal pain, anemia, polyuria). Each cohort received one instructional format for each subject area. The authors collected outcome measures (objective structured clinical exam, video quiz, and essay exam scores) at the end of each academic year. They stratified the students into tertiles by first-year grade point average to investigate the impact of instructional formats on learners of different abilities. Results: Outcomes for students in the top tertile improved with increased authenticity of the instructional format compared with outcomes for students in the middle and bottom tertiles (0.188 versus −0.038 and −0.201; P = .001 and .027, respectively). However, outcomes for students in the bottom tertile decreased when students were given only the paper case, compared with the middle and top tertiles (−0.374 versus 0.043 and 0.023, respectively; P = .001), but subsequently improved with more authentic instructional formats. Conclusions: The authors could not demonstrate that increased authenticity of the instructional format resulted in improved learner performance. However, they believe that there may be some benefit to tailoring preclerkship clinical education based on students’ ability.
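As an illustration of the stratification step described above, here is a short sketch assuming tertiles are formed by a simple three-way quantile cut on first-year GPA, with mean standardized outcomes compared across instructional formats within each tertile. The data and column names are invented for the example.

```python
import numpy as np
import pandas as pd

# Hypothetical data: first-year GPA, assigned instructional format,
# and a z-scored composite outcome per student.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "gpa": rng.normal(3.2, 0.4, 300),
    "format": rng.choice(["paper", "dvd", "standardized_patient"], 300),
    "outcome": rng.normal(0, 1, 300),
})

# Split students into ability tertiles by GPA, then compare mean
# outcome by format within each tertile.
df["tertile"] = pd.qcut(df["gpa"], q=3, labels=["bottom", "middle", "top"])
print(df.groupby(["tertile", "format"], observed=True)["outcome"].mean().unstack())
```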
Military Medicine | 2015
Aaron Saguil; Ting Dong; Robert J. Gingerich; Kimberly A. Swygert; Jeffrey S. LaRochelle; Anthony R. Artino; David F. Cruess; Steven J. Durning
BACKGROUND The Medical College Admission Test (MCAT) is a high-stakes test required for entry to most U.S. medical schools; admissions committees use this test to predict future accomplishment. Although there is evidence that the MCAT predicts success on multiple choice-based assessments, there is little information on whether the MCAT predicts clinical-based assessments of undergraduate and graduate medical education performance. This study examined associations between the MCAT and medical school grade point average (GPA), United States Medical Licensing Examination (USMLE) scores, observed patient care encounters, and residency performance assessments. METHODS This study used data collected as part of the Long-Term Career Outcome Study to determine associations between MCAT scores; USMLE Step 1, Step 2 clinical knowledge and clinical skills, and Step 3 scores; Objective Structured Clinical Examination performance; medical school GPA; and PGY-1 program director (PD) assessment of physician performance for students graduating in 2010 and 2011. RESULTS MCAT data were available for all students, and the PGY-1 PD evaluation response rate was 86.2% (N = 340). All permutations of MCAT scores (first, last, highest, average) were weakly associated with GPA, Step 2 clinical knowledge scores, and Step 3 scores. MCAT scores were weakly to moderately associated with Step 1 scores. MCAT scores were not significantly associated with Step 2 clinical skills Integrated Clinical Encounter and Communication and Interpersonal Skills subscores, Objective Structured Clinical Examination performance, or PGY-1 PD evaluations. DISCUSSION MCAT scores were weakly to moderately associated with assessments that rely on multiple choice testing. The association is somewhat stronger for assessments occurring earlier in medical school, such as USMLE Step 1. The MCAT was not able to predict assessments relying on direct clinical observation, nor was it able to predict PD assessment of PGY-1 performance.
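The associations reported above are correlational. A minimal sketch of how such correlations might be computed with SciPy is shown below, on simulated scores; the variable names, effect sizes, and data are assumptions for illustration, not the study's.

```python
import numpy as np
from scipy import stats

# Simulated scores: MCAT moderately linked to Step 1, essentially
# unrelated to the PGY-1 program director rating (mirroring the pattern
# the abstract describes, not the actual data).
rng = np.random.default_rng(2)
n = 340
mcat = rng.normal(30, 3, n)
step1 = 0.45 * stats.zscore(mcat) + rng.normal(0, 1, n)
pd_rating = rng.normal(0, 1, n)

for name, outcome in [("USMLE Step 1", step1), ("PGY-1 PD rating", pd_rating)]:
    r, p = stats.pearsonr(mcat, outcome)
    print(f"MCAT vs {name}: r = {r:.2f}, p = {p:.3f}")
```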
Military Medicine | 2015
Jeffrey S. LaRochelle; Ting Dong; Steven J. Durning
PURPOSE Many medical schools across the United States are undergoing curriculum reform designed, in part, to integrate basic sciences and clinical skills. Evidence has suggested that preclerkship courses in clinical skills and clinical reasoning are predictive of student performance on the clerkship. We hypothesized that a combination of outcome measures from preclerkship clinical skills and clinical reasoning courses (Objective Structured Clinical Examination scores, preceptor evaluations, National Board of Medical Examiners subject examination scores, and small-group participation grades) would be correlated with performance in internship (program director [PD] evaluation form at the end of the first postgraduate year). METHODS Outcome measures from preclerkship clinical skills and clinical reasoning courses and PD evaluation forms from 514 medical students graduating between 2009 and 2011 were analyzed in a multiple linear regression model. RESULTS Preclerkship clinical skills and clinical reasoning outcome measures were significant contributors to the linear regression model and explained 13.9% of the variance in expertise and 7.6% of the variance in professionalism as measured by the PD evaluation form. CONCLUSION Clinical skills and clinical reasoning courses during the preclerkship period explained a significant amount of the variance in performance at the graduate medical education level. Our data suggest that these courses provide valuable information regarding student abilities in internship. Early recognition of struggling students may provide an opportunity to break a cycle of poor performance that can persist into residency training.
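To make the variance-explained claim concrete, here is a hedged sketch of a multiple linear regression of this form using statsmodels. The predictor names mirror the measures listed in the abstract, but the data are synthetic and the coefficients arbitrary.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic predictors standing in for the four preclerkship measures.
rng = np.random.default_rng(3)
n = 514
df = pd.DataFrame({
    "osce": rng.normal(0, 1, n),
    "preceptor_eval": rng.normal(0, 1, n),
    "nbme_subject": rng.normal(0, 1, n),
    "small_group": rng.normal(0, 1, n),
})
# Simulated PGY-1 program-director "expertise" rating with a weak signal.
df["pd_expertise"] = (0.2 * df["osce"] + 0.2 * df["nbme_subject"]
                      + rng.normal(0, 1, n))

fit = smf.ols(
    "pd_expertise ~ osce + preceptor_eval + nbme_subject + small_group",
    data=df,
).fit()
print(f"R-squared: {fit.rsquared:.3f}")  # cf. the reported 13.9% for expertise
print(fit.summary())
```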
Perspectives on medical education | 2016
Jeffrey S. LaRochelle; Steven J. Durning; John R. Boulet; Cornelis van der Vleuten; Jeroen J. G. van Merrienboer; Jeroen Donkers
Introduction: Clinical encounters are often assessed using a checklist. However, without direct faculty observation, the timing and sequence of questions are not captured. We theorized that the sequence of questions can be captured and measured using coherence scores that may distinguish between low- and high-performing candidates. Methods: A logical sequence of key features was determined using the standard case checklist for an objective structured clinical examination (OSCE). An independent clinician educator reviewed each encounter to provide a global rating. Coherence scores were calculated based on question sequence. These scores were compared with global ratings and checklist scores. Results: Coherence scores were positively correlated with checklist scores and with global ratings, and these correlations increased as global ratings improved. Coherence scores explained more of the variance in student performance as global ratings improved. Discussion: Logically structured question sequences may indicate a higher performing student, and this information is often lost when using only overall checklist scores. Conclusions: The sequence in which test takers ask questions can be accurately recorded and is correlated with checklist scores and global ratings. The sequence of questions during a clinical encounter is not captured by traditional checklist scoring and may represent an important dimension of performance.
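The abstract does not specify how coherence scores were computed. One plausible stand-in, shown below, scores a candidate's question sequence by its rank agreement (Kendall's tau) with the expected checklist order; the checklist items and the metric itself are assumptions for illustration only, not the paper's method.

```python
from scipy.stats import kendalltau

# Expected logical order of key-feature questions on the case checklist
# (item identifiers are hypothetical).
expected_order = ["onset", "location", "radiation", "severity", "relievers"]

def coherence_score(asked: list[str]) -> float:
    """Rank agreement between the order questions were actually asked
    and the expected checklist order; 1.0 = perfectly coherent."""
    expected_rank = {item: i for i, item in enumerate(expected_order)}
    asked = [q for q in asked if q in expected_rank]  # drop off-checklist items
    observed = [expected_rank[q] for q in asked]
    tau, _ = kendalltau(observed, sorted(observed))
    return tau

print(coherence_score(["onset", "location", "severity", "radiation"]))  # < 1.0
print(coherence_score(expected_order))                                   # 1.0
```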
Medical Education | 2016
William Kelly; Brian Neubauer; Brian Hemann; Jeffrey S. LaRochelle; J Timothy O'Neill
What problems were addressed? Early clinical encounters are well founded in education theory as providing engagement and opportunity, but they can be resource-intensive and logistically challenging. A novel, small-group exercise designed to provide clinical experience (CLINEX) was embedded in a real heart failure clinic using game-like features, collaboration and subsequent video-recording of oral case presentations, which students self-recorded using their smartphones. As part of the curriculum reform process at our medical school, we were challenged with designing a meaningful, real clinical experience for over 170 Year 1 medical students using minimal student contact time, and without any additional faculty support or infrastructure, at short notice.

What was tried? Upon arrival at the cardiology clinic, students randomly drew a card which showed their group assignment on one side and their history or physical examination task on the other. Groups consisted of six to eight students covering all major clinical tasks. For 1 hour, half of these groups performed these tasks, guided by the physician, with real clinic patients there for actual medical care. Simultaneously, the other half of the students received a lecture with review of echocardiograms of the patients they would be seeing. In the second hour, these roles were reversed. Students later recorded 3–7-minute videos of themselves presenting their de-identified patients to their smartphone or other camera as if that camera were the attending physician, guided by a simple behaviourally anchored rubric, and submitted the recording electronically or shared it privately via YouTube. Videos were scored by a faculty member. Students also submitted a brief reflective paragraph.

What lessons were learned? A total of 173 Year 1 medical students participated in one of three CLINEX sessions offered, each of which required only three faculty staff and six patients per session. Students who participated in CLINEX performed better during a subsequent standardised angina patient encounter in both history-taking and physical examination checklists (as graded by a physician: scores of 86% versus 83%; p = 0.016, Cohen’s d effect size 0.42) and in patient communication (as graded by the standardised patient using the Essential Elements of Communication tool: 66% versus 63%; p = 0.031, effect size 0.372). Of note, baseline performance as judged in a standardised syncope patient encounter during the week prior to any CLINEX did not differ between these groups. Learner satisfaction was extremely high, with themes of relevance, emotion, authenticity and peer support particularly evident. Limitations included the crowding of some patient examination rooms when patient family members were present. Technical difficulties arose in fewer than 2% of student submissions of their self-recorded videos of oral case presentations. There were no disclosures of patient identifiers, perhaps because the scoring rubric described a significant loss of points if this were to happen. Clinical performance improvement was measured in the short term (within the same 10-week pre-clerkship module). The congestive heart failure (CHF) CLINEX has now engaged over 500 students and data collection continues. Our CLINEX model has been expanded to include successful events in the pulmonary clinic, emergency room and dialysis centre, and in other pre-clerkship modules. A video summary is available at tinyurl.com/chfclinicalexercise.
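The comparison above reports p-values and Cohen's d effect sizes. A small sketch of how those statistics are computed (pooled-SD Cohen's d plus an independent-samples t test) follows, on simulated percent scores rather than the study's data.

```python
import numpy as np
from scipy import stats

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Standardized mean difference using a pooled standard deviation."""
    pooled_sd = np.sqrt(((len(a) - 1) * a.var(ddof=1)
                         + (len(b) - 1) * b.var(ddof=1))
                        / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / pooled_sd

# Illustrative checklist scores (percent) for CLINEX vs non-CLINEX groups;
# group sizes and distributions are invented for the example.
rng = np.random.default_rng(4)
clinex = rng.normal(86, 7, 90)
control = rng.normal(83, 7, 83)

t, p = stats.ttest_ind(clinex, control)
print(f"d = {cohens_d(clinex, control):.2f}, p = {p:.3f}")
```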
Teaching and Learning in Medicine | 2014
Irene Alexandraki; Amber T. Pincavage; Susan A. Glod; Beth W. Liston; Carlos Palacio; Deborah J. DeWaay; Shobhina G. Chheda; Nicholas Van Wagoner; Jeffrey S. LaRochelle; Alfred P. Burger; Amy Shaheen; Leigh H. Simmons; Mark J. Fagan; Debra S. Leizman; Joseph T. Wayne; Diane Levine; Karen Szauter; Katherine C. Chretien
This journal watch is sponsored by the Alliance for Clinical Education (ACE). The purpose of this article is to summarize medical education manuscripts from specialty journals that are important and relevant to educators across specialties. Specialties included in our review were cardiology, gastroenterology, general internal medicine, pulmonology, nephrology, hematology and oncology, endocrinology, rheumatology, infectious disease, and neurology. We are grateful to Teaching and Learning in Medicine and ACE for giving us the opportunity to publish this review. The Clerkship Directors in Internal Medicine Research Committee conducted this review. Included are English articles published from September 2006 through September 2007. PubMed was searched for peer-reviewed research publications reporting primary data on medical education. Medical subject heading terms included combinations of medical education, medical student, residency training, practice, undergraduate medical education, and graduate medical education.
Military Medicine | 2015
Ting Dong; Jeffrey S. LaRochelle; Steven J. Durning; Aaron Saguil; Kimberly A. Swygert; Anthony R. Artino
Teaching and Learning in Medicine | 2011
G. Dodd Denton; Amy Shaheen; Wei Wei Lee; Ernie L. Esquivel; William Kelly; Diane Levine; Martin Muntz; Monica Yepes-Rios; Jeffrey S. LaRochelle
Military Medicine | 2009
Jeffrey S. LaRochelle; William R. Gilliland; Dario M. Torre; Elizabeth A. Baker; Alex J. Mechaber; John Poremba; Steven J. Durning