
Publications


Featured research published by Winnie Wade.


Medical Education | 2008

Implementing workplace-based assessment across the medical specialties in the United Kingdom

James R Wilkinson; Jim Crossley; Andrew Wragg; Peter Mills; George Cowan; Winnie Wade

Objectives  To evaluate the reliability and feasibility of assessing the performance of medical specialist registrars (SpRs) using three methods: the mini‐clinical evaluation exercise (mini‐CEX), directly observed procedural skills (DOPS) and multi‐source feedback (MSF) to help inform annual decisions about the outcome of SpR training.


Medical Teacher | 2011

Assessment of professionalism: Recommendations from the Ottawa 2010 Conference

Brian Hodges; Shiphra Ginsburg; Richard L. Cruess; Sylvia R. Cruess; Rhena Delport; Fred Hafferty; Ming-Jung Ho; Eric S. Holmboe; Matthew C. Holtman; Sadayoshi Ohbu; Charlotte E. Rees; Olle ten Cate; Yusuke Tsugawa; Walther N. K. A. van Mook; Val Wass; Tim Wilkinson; Winnie Wade

Over the past 25 years, professionalism has emerged as a substantive and sustained theme, the operationalization and measurement of which have become a major concern for those involved in medical education. However, establishing the elements that constitute appropriate professionalism, so that they can be assessed, is difficult. Using a discourse analysis approach, the International Ottawa Conference Working Group on Professionalism studied some of the dominant notions of professionalism, and in particular the implications for its assessment. The results presented here reveal different ways of thinking about professionalism that can lead towards a multi-dimensional, multi-paradigmatic approach to assessing professionalism at different levels: individual, inter-personal and societal–institutional. Recommendations for research about professionalism assessment are also presented.


Medical Education | 2011

Good questions, good answers: construct alignment improves the performance of workplace-based assessment scales

Jim Crossley; Gavin J. Johnson; Joe Booth; Winnie Wade

Medical Education 2011: 45: 560–569


Medical Education | 2002

When enough is enough: a conceptual basis for fair and defensible practice performance assessment

Lambert Schuwirth; Lesley Southgate; Gayle G. Page; Neil Paget; J M J Lescop; S R Lew; Winnie Wade; M Barón‐Maldonado

Introduction An essential element of practice performance assessment involves combining the results of various procedures in order to see the whole picture. This must be derived from both objective and subjective assessment, as well as a combination of quantitative and qualitative assessment procedures. Because of the severe consequences an assessment of practice performance may have, it is essential that the procedure is both defensible to the stakeholders and fair in that it distinguishes well between good performers and underperformers.


Medical Education | 2002

Linking assessment to learning: a new route to quality assurance in medical practice

R S Handfield-Jones; Karen Mann; M E Challis; Sjoerd Hobma; Daniel Klass; I. C. McManus; N S Paget; I J Parboosingh; Winnie Wade; T J Wilkinson

Background  If continuing professional development is to work and be sensible, an understanding of clinical practice is needed, based on the daily experiences of doctors within the multiple factors that determine the nature and quality of practice. Moreover, there must be a way to link performance and assessment to ensure that ongoing learning and continuing competence are, in reality, connected. Current understanding of learning no longer holds that a doctor enters practice thoroughly trained with a lifetime's storehouse of knowledge. Rather, a doctor's ongoing learning is a ‘journey’ across a practice lifetime, which involves the doctor as a person, interacting with their patients, other health professionals and the larger societal and community issues.


Postgraduate Medical Journal | 2007

Problems with using a supervisor’s report as a form of summative assessment

Tim J Wilkinson; Winnie Wade

The place of a supervisor report when used as a summative assessment of clinical workplace-based learning is discussed. Within clinical medicine, the apprenticeship model is traditional, and highly valued. It relies on a close relationship between a supervisor and a trainee. When it comes to assessing the trainee, who better to ask than the supervisor? On the face of it, this approach makes good sense and has contributed to formalising ways of seeking such an opinion. As one source of feedback, such an opinion is highly valuable. In recent times, though, such reports have increasingly been used as a form of summative assessment; that is, the basis on which decisions about the trainee’s progress are made. This practice relies on the assumption that such a report is always a valid and reliable assessment method. We wish to challenge this assumption. This paper aims to distil and explain the fundamental flaws of this type of assessment, and offers an alternative solution that not only aids learning, but does so on the basis of more objective and unbiased information. We suggest that the supervisor report change from an assessment tool into a summary of the results of a variety of assessments. Typically, this is a form with a number of criteria deemed to be important for a trainee to achieve. The supervisor is asked to tick a box that best applies to that trainee’s level of competence or achievement. For example, “the trainee has developed a level of knowledge commensurate with his or her level of training” or “the trainee is reliable” or “the trainee communicates well with patients and peers”. Alongside these statements are levels of accomplishment, such as “below expectations, marginally below expectations, marginally above expectations, above expectations, well above expectations”. …


Medical Teacher | 2011

A single generic multi-source feedback tool for revalidation of all UK career-grade doctors: does one size fit all?

Lucy MacKillop; Jim Crossley; Pirashanthie Vivekananda-Schmidt; Winnie Wade; Mary Armitage

Background: The UK Department of Health is considering a single, generic multi-source feedback (MSF) questionnaire to inform revalidation. Method: Evaluation of an implementation pilot, reporting: response rates, assessor mix, question redundancy and participants’ perceptions. Reliability was estimated using generalisability theory. Results: A total of 12,540 responses were received on 977 doctors. The mean time taken to complete an MSF exercise was 68.2 days. The mean number of responses received per doctor was 12.0 (range 1–17) with no significant difference between specialties. Individual question response rates and participants’ comments about questions indicate that some questions are less appropriate for some specialties. There was a significant difference in the mean score between specialties. Despite guidance, there were significant differences in the mix of assessors across specialties. More favourable scores were given by progressively more junior doctors. Nurses gave the most reliable scores. Conclusions: It is feasible to electronically administer a generic questionnaire to a large population of doctors. Generic content is appropriate for most but not all specialties. The differences in mean scores and the reliability of the MSF between specialties may be in part due to the specialty differences in assessor mix. Therefore, the number and mix of assessors should be standardised at specialty level, and scores should not be compared across specialties.


Medical Education | 2013

Do assessor comments on a multi-source feedback instrument provide learner-centred feedback?

Pirashanthie Vivekananda-Schmidt; Lucy MacKillop; Jim Crossley; Winnie Wade

Free‐text comments in multi‐source feedback are intended to facilitate change in the assessee's practice. This study was designed to utilise a large dataset of free‐text comments obtained in a national pilot study in order to investigate how helpful these free‐text comments may be to assessees.


The Clinical Teacher | 2009

The Acute Care Assessment Tool: a new assessment in acute medicine

Gavin J. Johnson; Winnie Wade; James Barrett; Michael Jones

The Royal College of Physicians has produced a new training programme for trainee doctors, consisting of two new competency-based curricula (JRCPTB 2007a, 2007b) and an electronic portfolio (e-portfolio). The General Internal Medicine (Acute) curriculum aims to train doctors to become competent in the delivery of effective care in the acute setting. The Generic curriculum aims to deliver doctors who are equipped with generic doctorly competencies, i.e. to practise within a sound moral, legal, ethical and professional framework, by the end of the specialist training.


Academic Medicine | 2009

A Blueprint to Assess Professionalism: Results of a Systematic Review

Tim Wilkinson; Winnie Wade; L Doug Knock

Collaboration


Dive into Winnie Wade's collaborations.

Top Co-Authors

Jim Crossley (University of Sheffield)
Andrew Wragg (Royal College of Physicians)
David Parry (Royal College of Physicians)
George Cowan (Royal College of Physicians)
James Barrett (Clatterbridge Cancer Centre NHS Foundation Trust)
Joe Booth (Royal College of Physicians)
Lucy MacKillop (Royal College of Physicians)
Peter Mills (Royal College of Physicians)