
Publication


Featured research published by Courtney West.


Medical Education | 2011

Do study strategies predict academic performance in medical school?

Courtney West; Mark Sadoski

Medical Education 2011; 45: 696–703


International Journal of Medical Education | 2014

Are study strategies related to medical licensing exam performance?

Courtney West; Terri Kurz; Sherry Smith; Lori Graham

Objectives: To examine the relationship between study strategies and performance on a high-stakes medical licensing exam, the United States Medical Licensing Examination (USMLE) Step 1. Methods: The action research project included seventy-nine student participants at the Texas A&M Health Science Center College of Medicine during their pre-clinical education. Data collection included pre-matriculation and matriculation academic performance data, standardized exam data, and the Learning and Study Strategies Inventory. Multiple regression analyses were conducted. For both models, the dependent variable was the Step 1 score, and the independent variables included Medical College Admission Test, Undergraduate Grade Point Average, Year 1 Average, Year 2 Average, Customized National Board of Medical Examiners Average, Comprehensive Basic Science Exam score, and Learning and Study Strategies Inventory sub-scores. Model 2 added the Comprehensive Basic Science Self-Assessment average. Results: Concentration (Model 1: β = .264; Model 2: β = .254) was the only study strategy correlated with Step 1 performance. The other statistically significant predictors were the Customized National Board of Medical Examiners Average (β = .315) and Year 2 Average (β = .280) in Model 1 and the Comprehensive Basic Science Self-Assessment Average (β = .338) in Model 2. Conclusions: There does appear to be a relationship between the Concentration study strategy and Step 1 licensing exam performance. Teaching students to practice and utilize techniques that improve concentration when preparing for and taking exams may help improve licensing exam scores.
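To make the kind of analysis reported above concrete, here is a minimal sketch of a Model 1-style multiple regression in Python with pandas and statsmodels. The file name and column names (step1, mcat, ugpa, and so on) are hypothetical placeholders, not the study's actual dataset or variable names.

# Sketch of a Model 1-style regression on hypothetical data.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("step1_study.csv")  # hypothetical dataset

cols = ["step1", "mcat", "ugpa", "year1_avg", "year2_avg",
        "cnbme_avg", "cbse", "lassi_concentration"]
# z-scoring all variables makes the fitted coefficients standardized betas,
# comparable to the beta values reported in the abstract
z = (df[cols] - df[cols].mean()) / df[cols].std()

X = sm.add_constant(z.drop(columns="step1"))  # predictors plus intercept
fit = sm.OLS(z["step1"], X).fit()             # ordinary least squares
print(fit.summary())                          # coefficients, p-values, R-squared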


Journal of Interprofessional Education and Practice | 2016

Implementation of interprofessional education (IPE) in 16 U.S. medical schools: Common practices, barriers and facilitators

Courtney West; Lori Graham; Ryan T. Palmer; Marissa Fuqua Miller; Erin K. Thayer; Margaret L. Stuber; Linda Awdishu; Rachel A. Umoren; Maria Wamsley; Elizabeth A. Nelson; Pablo Joo; James W. Tysinger; Paul George; Patricia A. Carney

BACKGROUND: Enhanced patient outcomes and accreditation criteria have led schools to integrate interprofessional education (IPE). While several studies describe IPE curricula at individual institutions, few examine practices across multiple institutions. PURPOSE: To examine IPE integration at different institutions and determine gaps where there is potential for improvement. METHOD: In this mixed methods study, we obtained survey results from 16 U.S. medical schools, 14 of which reported IPE activities. RESULTS: The most common collaboration was between medical and nursing schools (93%). The most prevalent format was a shared curriculum, often including integrated modules (57%). Small-group activities represented the majority (64%) of event settings, and simulation-based learning, games, and role-play (71%) were the most utilized learning methods. Thirteen schools (81.3%) reported teaching IPE competencies, but significant variation existed. Gaps and barriers in the study include the limitations of a convenience sample, limited qualitative analysis, and reliance on self-report surveys. CONCLUSIONS: Most IPE activities focused on the physician role. Implementation challenges included scheduling, logistics, and financial support. A need for effective faculty development, as well as measures to examine the link between IPE learning outcomes and patient outcomes, was identified.


Teaching and Learning in Medicine | 2015

Conceptualizing Interprofessional Teams as Multi-Team Systems—Implications for Assessment and Training

Courtney West; Karen Landry; Anna Graham; Lori Graham; Anna T. Cianciolo; Adina Kalet; Michael A. Rosen; Deborah Witt Sherman

SGEA 2015 Conference Abstract (edited): Evaluating Interprofessional Teamwork During a Large-Scale Simulation. Courtney West, Karen Landry, Anna Graham, and Lori Graham.

Construct: This study investigated the multidimensional measurement of interprofessional (IPE) teamwork as part of large-scale simulation training.

Background: Healthcare team function has a direct impact on patient safety and quality of care. However, IPE team training has not been the norm. Recognizing the importance of developing team-based collaborative care, our College of Nursing implemented an IPE simulation activity called Disaster Day and invited other professions to participate. The exercise consists of two sessions, one in the morning and another in the afternoon. The disaster scenario is announced just prior to each session, which consists of team building, a 90-minute simulation, and debriefing. Approximately 300 Nursing, Medicine, Pharmacy, Emergency Medical Technician, and Radiology students and over 500 standardized and volunteer patients participated in the Disaster Day event. To improve student learning outcomes, we created three competency-based instruments to evaluate collaborative practice in multidimensional fashion during this exercise.

Approach: A 20-item IPE Team Observation Instrument, designed to assess interprofessional teams' attainment of Interprofessional Education Collaborative (IPEC) competencies, was completed by 20 faculty and staff observing the Disaster Day simulation. One hundred sixty-six standardized patients completed a 10-item Standardized Patient IPE Team Evaluation Instrument, developed from the IPEC competencies and adapted items from the 2014 Henry et al. PIVOT Questionnaire; this instrument assessed the standardized or volunteer patients' perception of the teams' collaborative performance. A 29-item IPE Team's Perception of Collaborative Care Questionnaire, also created from the IPEC competencies and divided into five categories (Values/Ethics, Roles and Responsibilities, Communication, Teamwork, and Self-Evaluation), was completed by 188 students: 99 from Nursing, 43 from Medicine, 6 from Pharmacy, and 40 participants who belonged to more than one component, were students at another institution, or did not indicate their institution. The team instrument was designed to assess each team member's perception of how well the team, and the member him- or herself, met the competencies. Five of the items on the team perception questionnaire mirrored items on the standardized patient evaluation: demonstrated leadership practices that led to effective teamwork, discussed care and decisions about that care with the patient, described roles and responsibilities clearly, worked well together to coordinate care, and good/effective communication.

Results: Internal consistency reliability of the IPE Team Observation Instrument was 0.80. For 18 of the 20 items, more than 50% of observers indicated the item was demonstrated; of those, 6 items were observed by 50% to 75% of the observers, and the remaining 12 were observed by more than 80% of the observers. Internal consistency reliability of the IPE Team's Perception of Collaborative Care Instrument was 0.95. The mean response score, on a scale from 1 (strongly disagree) to 4 (strongly agree), was calculated for each section of the instrument; the overall mean score was 3.57 (SD = .11). Internal consistency reliability of the Standardized Patient IPE Team Evaluation Instrument was 0.87. The overall mean score was 3.28 (SD = .17).
The ratings for the 5 items shared by the standardized patient and team perception instruments were compared using independent sample t tests. Statistically significant differences (p < .05) were present in each case, with the students rating themselves higher on average than the standardized patients did (mean differences between 0.2 and 0.6 on a scale of 1–4). Conclusions: Multidimensional, competency-based instruments appear to provide a robust view of IPE teamwork; however, challenges remain. Due to the large scale of the simulation exercise, observation-based assessment did not function as well as self- and standardized patient-based assessment. To promote greater variation in observer assessments during future Disaster Day simulations, we plan to adjust the rating scale from “not observed,” “observed,” and “not applicable” to a 4-point scale and reexamine interrater reliability.
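As a rough illustration of the statistics reported in this abstract, the sketch below computes Cronbach's alpha (a standard internal consistency coefficient; the abstract does not name the exact estimator used) and an independent-samples t test with NumPy and SciPy. All ratings are randomly generated stand-ins, so the outputs will not match the published values.

import numpy as np
from scipy import stats

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) rating matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
sp_ratings = rng.integers(1, 5, size=(166, 10))  # stand-in 1-4 ratings
# random data yields alpha near zero; coherent real ratings drove the
# 0.80-0.95 values reported above
print(f"alpha = {cronbach_alpha(sp_ratings):.2f}")

# Independent-samples t test on one shared item, students vs. patients
students = rng.integers(2, 5, size=188)  # stand-in student self-ratings
patients = rng.integers(1, 5, size=166)  # stand-in patient ratings
t, p = stats.ttest_ind(students, patients, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")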


Annals of Behavioral Science and Medical Education | 2011

Comparing the Effects of Mental Imagery Rehearsal and Physical Practice on Learning Lumbar Puncture by Medical Students

Rachel Bramson; Charles W. Sanders; Mark Sadoski; Courtney West; Robert Wiprud; Mark English; Michael Palm; Alan Xenakis

Using mental imagery in clinical skills instruction can be a valuable teaching strategy. Prior studies have supported its use in the teaching of a variety of clinical skills including basic surgery and venipuncture. We extended this research to lumbar puncture. After viewing an instructional video, medical students received instruction on how to perform a lumbar puncture on simulators. The students were then randomized into two groups with one group receiving additional practice on the simulators and the other group receiving guided mental imagery practice. Students then performed a lumbar puncture as part of an Objective Structured Clinical Examination (OSCE) and were graded on a reliable rating instrument developed for this study. Consistent with prior studies, there was no statistically significant difference in performance between the group receiving additional physical practice and the group receiving guided mental imagery practice. Mental imagery practice appears to be an effective and cost-efficient method to teach lumbar puncture as well as a lifelong learning skill.


Medical Education Online | 2015

Tools to investigate how interprofessional education activities link to competencies.

Courtney West; Michael A. Veronin; Karen Landry; Terri Kurz; Bree Watzak; Barbara J. Quiram; Lori Graham

Integrating interprofessional education (IPE) activities and curricular components in health professions education has been emphasized recently by the inclusion of accreditation standards across disciplines. The Interprofessional Education Collaborative (IPEC) established IPE competencies in 2009, but how activities link to competencies has not been investigated in depth. The purpose of this project is to investigate how well two IPE activities align with IPEC competencies. To evaluate how our IPE activities met IPEC competencies, we developed a checklist and an observation instrument; a brief description of each is included, along with the outcomes. We analyzed Disaster Day, a simulation exercise that includes participants from Nursing, Medicine, and Pharmacy, and Interprofessional Healthcare Ethics (IPHCE), a course that introduced medical, nursing, and pharmacy students to ethical issues using didactic sessions and case discussions. While both activities appeared to facilitate the development of IPE competencies, Disaster Day aligned more closely with the IPEC competencies than the IPHCE course and appears to be a more comprehensive way of addressing them. However, offering one IPE activity or curricular element is not sufficient. We recommend having several IPE options available, using the tools we developed to map the IPE curriculum, and evaluating competency coverage.
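A toy sketch of the mapping idea behind such a checklist, in Python: represent each activity as the set of IPEC competency domains it touches, then report coverage. The four domain labels follow IPEC's core competency domains; the activity-to-domain assignments here are illustrative only, not the study's actual checklist results.

IPEC_DOMAINS = {
    "Values/Ethics",
    "Roles/Responsibilities",
    "Interprofessional Communication",
    "Teams and Teamwork",
}

# Illustrative assignments only; the paper's instruments score
# finer-grained competencies than these four domains.
activities = {
    "Disaster Day": {"Values/Ethics", "Roles/Responsibilities",
                     "Interprofessional Communication", "Teams and Teamwork"},
    "IPHCE course": {"Values/Ethics", "Interprofessional Communication"},
}

for name, covered in activities.items():
    missing = sorted(IPEC_DOMAINS - covered)
    pct = 100 * len(covered) / len(IPEC_DOMAINS)
    print(f"{name}: {pct:.0f}% of domains covered; missing: {missing or 'none'}")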


Simulation in Healthcare: Journal of the Society for Simulation in Healthcare | 2016

Simulated Disaster Day: Benefit From Lessons Learned Through Years of Transformation From Silos to Interprofessional Education.

Laura L. Livingston; Courtney West; Jerry Livingston; Karen Landry; Bree Watzak; Lori Graham

Summary Statement: Disaster Day is a simulation event that began in the College of Nursing and has grown exponentially in size and popularity over the last 8 years. The evolution has been the direct result of reflective practice and dedicated leadership on the part of students, faculty, and administration. Its development and expansion into a robust interprofessional education activity are noteworthy because it gives health care professions students an opportunity to work in teams to provide care in a disaster setting. The “authentic” learning situation has enhanced student knowledge of roles and responsibilities and seems to increase collaborative efforts with other disciplines. The lessons learned and modifications made in our Disaster Day planning, implementation, and evaluation processes are shared in an effort to facilitate best practices for other institutions interested in a similar activity.


Archive | 2013

Simulation in Internal Medicine

Paul E. Ogden; Courtney West; Lori Graham; Curtis Mirkes; Colleen Y. Colbert

Simulation-based medical education (SBME) has become a regular feature of undergraduate and graduate medical education in Internal Medicine. Within undergraduate medical education, this teaching modality is used to facilitate medical knowledge acquisition and to teach and assess clinical skills, diagnostic reasoning, basic technical skills, and patient communication. SBME also allows students to practice roles with hospital teams prior to residency. In Internal Medicine residency training programs, SBME is used to teach procedural skills and hospital teamwork, such as code teams; practice infrequent events; and evaluate competencies. Continuing medical education also utilizes SBME to actively engage physicians and facilitate lifelong learning. In Canada, Internal Medicine certification requirements have an integrated simulation component. While this practice is not widespread outside Canada, it may be a common element in Internal Medicine certification and recertification in the future.


Medical Education | 2016

Getting SMART about teaching objective writing

Colleen Y. Colbert; Courtney West; Lori Graham; Lily C. Pien

The novice writer drafted her first scholarly article and circulated it to three independent critical readers. The critical readers and novice writer represented four South African universities and were diverse with regard to content knowledge and writing experience. The critical readers included a dean of a faculty, a senior lecturer, and a faculty development specialist. Each critical reader gave individual written comments and verbal feedback to the novice writer. Following the feedback sessions, the novice writer recorded structured personal reflections on the process and the feedback provided. Audio-recordings were transcribed and, together with the three sets of written comments, thematically analysed using a deductive approach. Consensus on emergent themes from the transcripts was reached by all participants. What lessons were learned? Feedback from the diverse group of critical readers resulted in a rigorous and in-depth review. The multiple sources of feedback produced a convergence of meaning for the novice writer despite the divergence of the readers' input. The most experienced reader took on a mentorship role and gave high-level guidance, while the two less experienced readers were instructional in tone and provided detailed editorial feedback. The feedback focused on promoting precision and clarity of the message, enhancing the scientific style of writing, and meeting academic conventions. The structured reflections of the novice writer highlighted anxiety at the magnitude of the changes needed but identified the feedback as essential to the development of a plan of action. Despite lacking in-depth content knowledge, the critical readers could identify and address weaknesses in the academic elements. The characteristics of critical readers and the collaborative, non-competitive nature of the relationship are relevant to engagement with the scholarly activity of critical reading. The writer benefitted from the different approaches as well as the multiple feedback opportunities, which resulted in substantive and detailed feedback, arguably more than a single experienced, published researcher would have provided. Novice writers can develop their writing skills by involving small groups of academics to act as critical readers and need not be dependent on the willingness of experienced and published researchers.


International Journal of Medical Education | 2016

The relationship between study strategies and academic performance

Yuanyuan Zhou; Lori Graham; Courtney West

Objectives: To investigate whether and to what extent the Learning and Study Strategies Inventory (LASSI) and the Self-Directed Learning Readiness Scale (SDLRS) yield academic performance predictors, and to examine whether the LASSI findings are consistent with previous research. Methods: Medical school students completed the LASSI and SDLRS before their first and second years (n = 168). Correlational and regression analyses were used to determine the predictive value of the LASSI and the SDLRS. Paired t-tests were used to test whether the two measurement points differed. Bivariate correlations and R² values were compared with five other relevant studies. Results: The SDLRS was moderately correlated with all LASSI subscales at both measurement points, ranging from r(152) = .255, p = .001 to r(152) = .592, p < .001. Neither the first SDLRS nor the first LASSI was predictive of academic performance. The second LASSI measure was a significant predictor of academic performance (R²(138) = 0.188, p = .003). Six prior LASSI studies yielded R² values ranging from 10% to 49%. Conclusions: The SDLRS is moderately correlated with all LASSI subscales. However, the predictive value of the SDLRS and the LASSI differ. The SDLRS does not appear to be directly related to academic performance, but the LASSI subscales Concentration, Motivation, Time Management, and Test Strategies tend to be correlated with it. The explained LASSI variance ranges from 10% to 49%, indicating a small to substantial effect. Utilizing the LASSI to provide medical school students with information about their strengths and weaknesses, and implementing targeted support in specific study strategies, may yield positive academic performance outcomes.
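For concreteness, here is a small sketch of the bivariate correlation and the paired-samples comparison described above, using SciPy on hypothetical placeholder columns; an R² like the one reported would come from an OLS regression of the performance measure on the LASSI subscales, as in the earlier regression sketch.

import pandas as pd
from scipy import stats

df = pd.read_csv("lassi_sdlrs.csv")  # hypothetical dataset

# Bivariate correlation between SDLRS and one LASSI subscale (time 1)
r, p = stats.pearsonr(df["sdlrs_t1"], df["lassi_concentration_t1"])
print(f"r = {r:.3f}, p = {p:.3f}")

# Paired t test: did the subscale change between the two administrations?
t, p = stats.ttest_rel(df["lassi_concentration_t1"], df["lassi_concentration_t2"])
print(f"t = {t:.2f}, p = {p:.3f}")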

Collaboration


Dive into Courtney West's collaborations.

Top Co-Authors

Colleen Y. Colbert

Cleveland Clinic Lerner College of Medicine


Anna Lama

West Virginia University
