Jim Crossley
University of Sheffield
Publications
Featured research published by Jim Crossley.
Medical Education | 2008
James R Wilkinson; Jim Crossley; Andrew Wragg; Peter Mills; George Cowan; Winnie Wade
Objectives To evaluate the reliability and feasibility of assessing the performance of medical specialist registrars (SpRs) using three methods: the mini‐clinical evaluation exercise (mini‐CEX), directly observed procedural skills (DOPS) and multi‐source feedback (MSF) to help inform annual decisions about the outcome of SpR training.
Medical Education | 2002
Jim Crossley; Gerry Humphris; Brian Jolly
Background Good professional regulation depends on high quality procedures for assessing professional performance. Professional assessment can also have a powerful educational impact by providing transparent performance criteria and returning structured formative feedback.
Medical Education | 2002
Jim Crossley; Helena Davies; Gerry Humphris; Brian Jolly
Context Reliability is defined as the extent to which a result reflects all possible measurements of the same construct. It is an essential measurement characteristic. Unfortunately, there are few objective tests for the most important aspects of the professional role because they are complex and intangible. In addition, professional performance varies markedly from setting to setting and case to case. Both these factors threaten reliability.
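As a hedged illustration of how that threat is usually quantified (a standard generalizability-theory formulation, not drawn from this paper): if σ²(p) is the doctor-to-doctor variance of interest and σ²(pc,e) bundles case-to-case variation with residual error, the reliability of a mean score over n_c sampled cases is
\[
E\rho^{2} \;=\; \frac{\sigma^{2}_{p}}{\sigma^{2}_{p} + \sigma^{2}_{pc,e}/n_{c}}
\]
so large case-to-case (or, analogously, setting-to-setting) variance sits directly in the error term and can only be offset by sampling more cases and settings.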
Medical Education | 2012
Jim Crossley; Brian Jolly
Medical Education 2012: 46: 28–37
Medical Education | 2011
Jim Crossley; Gavin J. Johnson; Joe Booth; Winnie Wade
Medical Education 2011: 45: 560–569
British Journal of Surgery | 2011
Jim Crossley; J. Marriott; H. Purdie; Jonathan Beard
Most surgical assessment has been aimed at technical proficiency. However, non‐technical skills also affect patient safety and clinical effectiveness. The NOTSS (Non‐Technical Skills for Surgeons) assessment instrument was developed specifically to assess the non‐technical skills of individual surgeons in the operating theatre. This study evaluated NOTSS as a real‐world assessment, with a mix of minimally trained assessors. The evaluation criteria were feasibility, validity and psychometric reliability.
Medical Education | 2008
Chris Roberts; Merrilyn Walton; Imogene Rothnie; Jim Crossley; Patricia M. Lyon; Koshila Kumar; David J. Tiller
Context We wished to determine which factors are important in ensuring interviewers are able to make reliable and valid decisions about the non‐cognitive characteristics of candidates when selecting candidates for entry into a graduate‐entry medical programme using the multiple mini‐interview (MMI).
BJA: British Journal of Anaesthesia | 2009
Jennifer Weller; B Jolly; Mp Misur; Alan Merry; Amanda Jones; Jim Crossley; Karen Pedersen; K Smith
BACKGROUND The Mini-Clinical Evaluation Exercise (Mini-CEX) is a workplace-based assessment tool of potential value in anaesthesia to assess and improve clinical performance. Its reliability and positive educational impact have been reported in other specialities, but not, to date, in anaesthesia. In this study, we evaluated the psychometric characteristics, logistics of application, and impact on the quality of supervision of the Mini-CEX in anaesthesia training.
METHODS A Mini-CEX encounter consisted of a single specialist anaesthetist observing a trainee over a defined period of time, completing an online Mini-CEX form with the trainee, and providing written and verbal feedback. We sought trainee and supervisor perspectives on its value and ease of use, and used Generalizability Theory to estimate reliability.
RESULTS We collected 331 assessments from 61 trainees and 58 assessors. Survey responses strongly supported the positive effect of the Mini-CEX on feedback, its relative feasibility, and its acceptance as a potential assessment tool. In this cohort, we found variable assessor stringency and low trainee variation. However, a feasible sample of cases and assessors would produce sufficiently precise scores to decide that performance was satisfactory for each trainee with 95% confidence. To generate scores that could discriminate sufficiently between trainees to allow ranking, a much larger sample of cases would be needed.
CONCLUSIONS The Mini-CEX in anaesthesia has strengths and weaknesses. Strengths include its perceived very positive educational impact and its relative feasibility. Variable assessor stringency means that large numbers of assessors are required to produce reliable scores.
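A minimal sketch of the kind of "decision study" reasoning the abstract describes, assuming a simplified nested design in which each encounter is scored by a different assessor; the variance components below are illustrative placeholders, not the values estimated in the paper.

import math

var_trainee  = 0.10   # trainee-to-trainee variance (hypothetical)
var_assessor = 0.30   # assessor stringency variance (hypothetical)
var_residual = 0.60   # case-to-case and residual variance (hypothetical)

def sem(n_encounters: int) -> float:
    """Absolute standard error of measurement for a mean over n encounters,
    each scored by a different assessor (simplified nested design)."""
    return math.sqrt((var_assessor + var_residual) / n_encounters)

def encounters_needed(margin: float, z: float = 1.96) -> int:
    """Smallest n such that a 95% confidence half-width around the mean
    score is no wider than `margin` (e.g. the distance to the pass mark)."""
    n = 1
    while z * sem(n) > margin:
        n += 1
    return n

if __name__ == "__main__":
    for n in (4, 8, 12):
        print(f"{n:2d} encounters: SEM = {sem(n):.3f}")
    print("Encounters for a +/-0.5 decision margin at 95%:", encounters_needed(0.5))

Under these made-up numbers, a modest number of encounters supports a pass/fail decision against a fixed standard, whereas ranking trainees against one another would demand far more sampling because the trainee variance is small relative to the error terms, which mirrors the pattern the abstract reports.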
Medical Education | 2007
Jim Crossley; Jean Russell; Brian Jolly; Chris Ricketts; Chris Roberts; Lambert Schuwirth; John J. Norcini
Context Investigators applying generalisability theory to educational research and evaluation have sometimes done so poorly. The main difficulties have related to: inadequate or non‐random sampling of effects, dealing with naturalistic data, and interpreting and presenting variance components.
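For readers unfamiliar with the "G-study" step the abstract refers to, here is a hedged sketch of variance-component estimation for a balanced doctor-by-case crossed design using expected mean squares; real assessment data are usually unbalanced and naturalistic (one of the pitfalls the paper discusses), in which case a mixed-effects/REML approach is the safer route. The data below are simulated purely for illustration.

import numpy as np

def g_study(scores: np.ndarray):
    """scores[i, j] = score of doctor i on case j (balanced, fully crossed)."""
    n_p, n_c = scores.shape
    grand = scores.mean()
    person_means = scores.mean(axis=1)
    case_means = scores.mean(axis=0)

    ss_p = n_c * ((person_means - grand) ** 2).sum()
    ss_c = n_p * ((case_means - grand) ** 2).sum()
    ss_res = ((scores - grand) ** 2).sum() - ss_p - ss_c   # interaction + error

    ms_p = ss_p / (n_p - 1)
    ms_c = ss_c / (n_c - 1)
    ms_res = ss_res / ((n_p - 1) * (n_c - 1))

    var_res = ms_res                             # sigma^2(pc,e)
    var_p = max((ms_p - ms_res) / n_c, 0.0)      # sigma^2(p), doctors
    var_c = max((ms_c - ms_res) / n_p, 0.0)      # sigma^2(c), cases
    return var_p, var_c, var_res

rng = np.random.default_rng(0)
data = rng.normal(70, 5, size=(20, 6))           # 20 doctors x 6 cases, simulated
print(g_study(data))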
Medical Education | 2005
Jim Crossley; Helena Davies
Context The clinical consultation is an important aspect of the doctor's role. However, there is a particular shortage of methods for assessing its quality, and its complexity makes it a considerable assessment challenge.