
Publication


Featured research published by Chris Ricketts.


Medical Education | 2005

Assessment of progress tests

Jane McHarg; Paul Bradley; Suzanne Chamberlain; Chris Ricketts; Judith Searle; John Charles Mclachlan

Background  Progress testing is a form of longitudinal examination which, in principle, samples at regular intervals from the complete domain of knowledge considered a requirement for medical students on completion of the undergraduate programme. Over the course of the programme students improve their scores on the test, enabling them, as well as staff, to monitor their progress.


Medical Education | 2007

‘I’m pickin' up good regressions': the governance of generalisability analyses

Jim Crossley; Jean Russell; Brian Jolly; Chris Ricketts; Chris Roberts; Lambert Schuwirth; John J. Norcini

Context  Investigators applying generalisability theory to educational research and evaluation have sometimes done so poorly. The main difficulties have related to: inadequate or non‐random sampling of effects, dealing with naturalistic data, and interpreting and presenting variance components.


Medical Teacher | 2010

Progress testing internationally

Adrian Freeman; Cees van der Vleuten; Zineb Miriam Nouns; Chris Ricketts

curriculum) are given the same written test. The test is comprehensive, sampling all relevant disciplines in a curriculum, usually according to a fixed blueprint. Because of the need for wide sampling, items are typically of the multiple-choice type. The test is repeated regularly over time: the same blueprint is used, but the test items are usually different on each occasion. In this way a picture of learners' knowledge in relation to the end objectives of a curriculum is obtained that is comprehensive (across many content areas), cross-sectional (comparing the performance of groups of different ability) and longitudinal (comparing performance over time).

This is the first time that so many contributions on progress testing have been published together. Although progress testing was initially developed in the 1970s by two institutions independently, it has taken quite a long time for other schools to adopt this very special testing procedure. The reason is probably twofold. First and foremost, the utility of the testing procedure is not easily understood. The concept is so different from the usual course-related testing that it takes time to really understand the ideas behind it and to see the potential benefits that may result from progress testing. Second, progress testing can be logistically burdensome: it requires considerable effort for test development, test administration and test scoring. The resources and the centralised governance required are probably major obstacles for many institutions.

Nevertheless, in recent years an increasing number of medical schools and other institutions have taken an interest in progress testing and are using the method. To exchange information and experiences, a symposium on progress testing was held at the Association for Medical Education in Europe (AMEE) conference in 2009. Medical Teacher offered space for the subsequent publication of the papers that you will find in this issue.
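The cross-sectional and longitudinal picture described above can be sketched numerically: every cohort sits the same test occasions, so mean scores can be compared both across cohorts at one occasion and within one cohort over time. A minimal illustration, with all cohort names and scores invented:

```python
# Hypothetical progress-test results: cohort -> mean score (percent
# correct) at each of four test occasions. Numbers are illustrative only.
scores = {
    "year1": [22.0, 25.5, 28.0, 31.0],
    "year3": [48.0, 50.5, 52.0, 55.5],
    "year5": [68.0, 70.0, 71.5, 74.0],
}

def cross_section(occasion):
    """Cross-sectional view: compare all cohorts at the same occasion."""
    return {cohort: s[occasion] for cohort, s in scores.items()}

def growth(cohort):
    """Longitudinal view: one cohort's gain from first to last occasion."""
    s = scores[cohort]
    return s[-1] - s[0]

print(cross_section(0))  # senior cohorts score higher on the same test
print(growth("year1"))   # knowledge gain across the occasions
```

Because the blueprint is held constant across occasions, both views are read off the same score matrix; only the axis of comparison changes.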


Medical Teacher | 2010

Beyond assessment: Feedback for individuals and institutions based on the progress test

Lee Coombes; Chris Ricketts; Adrian Freeman; Jeff Stratford

Background: Progress testing is used at Peninsula Medical School to test applied medical knowledge four times a year using a 125-item multiple choice test. Items within each test are classified and matched to the curriculum blueprint. Aim: To examine the use of item classifications as part of a quality assurance process and to examine the range of available feedback provided after each test or group of tests. Methods: The questions were classified using a single best classification method. These were placed into a simplified version of the progress test assessment blueprint. Average item facilities for individuals and cohorts were used to provide feedback to individual students and curriculum designers. Results: The analysis shows that feedback can be provided at a number of levels, and inferences about various groups can be made. It demonstrates that learning mostly occurs in the early years of the course, but when examined longitudinally, it shows how different patterns of learning exist in different curriculum areas. It also shows that the effect of changes in the curriculum may be monitored through these data. Conclusions: Used appropriately, progress testing can provide a wide range of feedback to every individual or group of individuals in a medical school.
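The feedback described in this abstract rests on item facility, the proportion of candidates answering an item correctly, aggregated by blueprint category. A minimal sketch of that aggregation, with invented categories and response data:

```python
# Each item carries one blueprint category; responses[i][j] is 1 if
# student j answered item i correctly, else 0. All data are invented.
items = ["cardio", "cardio", "resp", "resp", "ethics"]
responses = [
    [1, 1, 0, 1],  # item 0
    [1, 0, 0, 1],  # item 1
    [0, 1, 1, 1],  # item 2
    [1, 1, 1, 1],  # item 3
    [0, 0, 1, 0],  # item 4
]

def facility_by_category(items, responses):
    """Mean item facility (proportion correct) per blueprint category."""
    totals, counts = {}, {}
    for cat, row in zip(items, responses):
        totals[cat] = totals.get(cat, 0.0) + sum(row) / len(row)
        counts[cat] = counts.get(cat, 0) + 1
    return {cat: totals[cat] / counts[cat] for cat in totals}

print(facility_by_category(items, responses))
```

Run per cohort, the same aggregation yields the curriculum-level feedback; run per student, it yields individual feedback against the blueprint.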


Advances in Health Sciences Education | 2010

Are multiple choice tests fair to medical students with specific learning disabilities?

Chris Ricketts; Julie Brice; Lee Coombes

The purpose of multiple choice tests of medical knowledge is to estimate as accurately as possible a candidate’s level of knowledge. However, concern is sometimes expressed that multiple choice tests may also discriminate in undesirable and irrelevant ways, such as between minority ethnic groups or by sex of candidates. There is little literature to establish whether multiple choice tests may also discriminate against students with specific learning disabilities (SLDs), in particular those with a diagnosis of dyslexia, and whether the commonly-used accommodations allow such students to perform up to their capability. We looked for evidence to help us determine whether multiple choice tests could be relied upon to test all medical students fairly, regardless of disability. We analyzed the mean scores of over 900 undergraduate medical students on eight multiple-choice progress tests containing 1,000 items using a repeated-measures analysis of variance. We included disability, gender and ethnicity as possible explanatory factors, as well as year group. There was no significant difference between mean scores of students with an SLD who had test accommodations and students with no SLD and no test accommodation. Virtually all students were able to complete the tests within the allowed time. There were no significant differences between the mean scores of known minority ethnic groups or between the genders. We conclude that properly-designed multiple-choice tests of medical knowledge do not systematically discriminate against medical students with specific learning disabilities.
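The core comparison here, mean scores of students with an SLD plus test accommodations versus students with neither, can be sketched as a simple two-group summary. The study itself used a repeated-measures ANOVA over eight tests; the sketch below only illustrates the group-mean comparison, and every score is invented:

```python
from statistics import mean, stdev

# Invented per-student mean percent-correct scores for the two groups.
sld_accommodated = [61.2, 58.4, 64.0, 59.8, 62.5]
no_sld           = [60.9, 63.1, 58.7, 61.4, 62.0]

def group_summary(scores):
    """Mean and standard deviation of one group's scores."""
    return {"mean": round(mean(scores), 2), "sd": round(stdev(scores), 2)}

# A near-zero difference in group means is the pattern the study reports.
diff = round(mean(sld_accommodated) - mean(no_sld), 2)
print(group_summary(sld_accommodated), group_summary(no_sld), diff)
```

A full replication would also model year group, gender and ethnicity as factors, which is what the repeated-measures ANOVA in the paper does.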


Advances in Health Sciences Education | 2012

Student perceptions of the progress test in two settings and the implications for test deployment

Louise Wade; Christopher Harrison; James Hollands; Karen Mattick; Chris Ricketts; Val Wass

Background The Progress Test (PT) was developed to assess student learning within integrated curricula. Whilst it is effective in promoting and rewarding deep approaches to learning in some settings, we hypothesised that implementation of the curriculum (design and assessment) may impact on students’ preparation for the PT and their learning. Aim To compare students’ perceptions of and preparations for the PT at two medical schools. Method Focus groups were used to generate items for a questionnaire. This was piloted, refined, and then delivered at both schools. Exploratory factor analysis identified the main factors underpinning response patterns. ANOVA was used to compare differences in response by school, year group and gender. Results Response rates were 640 (57%) and 414 (47%) at Schools A and B, respectively. Three major factors were identified: the PT’s ability to (1) assess academic learning (2) support clinical learning; (3) the PT’s impact on exam preparation. Significant differences were found between settings. In the school with early clinical contact, more frequent PTs and no end of unit tests, students were more likely to appreciate the PT as a support for learning, perceive it as fair and valid, and use a deeper approach to learning—but they also spent longer preparing for the test. Conclusion Different approaches to the delivery of the PT can impact significantly on student study patterns. The learning environment has an important impact on student perceptions of assessment and approach to learning. Careful decisions about PT deployment must be taken to ensure its optimal impact.


Medical Teacher | 2010

Choosing and designing knowledge assessments: Experience at a new medical school

Adrian Freeman; Chris Ricketts

Background: Curriculum developers have a wide choice of assessment methods in all aspects of medical education including the specific area of medical knowledge. When selecting the appropriate tool, there is an increasing literature to provide a robust evidence base for developments or decisions. Aim: As a new medical school, we wished to select the most appropriate method for knowledge assessment. Methods: This article describes how a new medical school came to choose progress testing as its only method of summative assessment of undergraduate medical knowledge. Results: The rationale, implementation, development and performance of the assessment are described. The position after the first cohort of students qualified is evaluated. Conclusion: Progress testing has worked well in a new school. Opportunities for further study and development exist. It is to be hoped that our experiences and evidence will assist and inform others as they consider developments for their own schools.


Medical Education | 2009

Standard setting for progress tests: combining external and internal standards

Chris Ricketts; Adrian Freeman; Lee Coombes

Objectives  There has been little work on standard setting for progress tests and it is common practice to use normative standards. This study aimed to develop a new approach to standard setting for progress tests administered at the point when students approach graduation.


BMJ | 2008

Are national qualifying examinations a fair way to rank medical students? Yes

Chris Ricketts; Julian Archer

Chris Ricketts and Julian Archer argue that a national test is the only fair way to compare medical students, but Ian Noble (doi: 10.1136/bmj.a1279) believes that it will reduce the quality of education.


Medical Teacher | 2010

Adaptation of medical progress testing to a dental setting

J.H. Bennett; Adrian Freeman; Lee Coombes; Liz Kay; Chris Ricketts

Although progress testing (PT) is well established in several medical schools, it is new to dentistry. Peninsula College of Medicine and Dentistry has recently established a Bachelor of Dental Surgery programme and has been one of the first schools to use PT in a dental setting. Issues associated with its development and its adaptation to the specific needs of the dental curriculum are considered.

Collaboration

Top co-authors of Chris Ricketts:

Adrian Freeman, Peninsula College of Medicine and Dentistry
Lee Coombes, Peninsula College of Medicine and Dentistry
Julian Archer, Peninsula College of Medicine and Dentistry
Stan Zakrzewski, London Metropolitan University
Anthony Nicholls, Peninsula College of Medicine and Dentistry
Giovanni Pagliuca, Peninsula College of Medicine and Dentistry
J.H. Bennett, University College London