Erik W. Driessen
Maastricht University
Publications
Featured research published by Erik W. Driessen.
BMJ | 2008
Erik W. Driessen; Jan van Tartwijk; Tim Dornan
Reflection underpins learning from experience, so how do you foster reflection in your students? This article explores the best ways to do this.
Medical Education | 2012
Christopher Watling; Erik W. Driessen; Cees van der Vleuten; Lorelei Lingard
Medical Education 2012: 46: 192–200
Medical Education | 2009
Karlijn Overeem; Hub Wollersheim; Erik W. Driessen; Kiki M. J. M. H. Lombarts; Geertje van de Ven; Richard Grol; Onyebuchi A. Arah
Objectives: Delivery of 360-degree feedback is widely used in revalidation programmes. However, little has been done to systematically identify the variables that influence whether performance improvement is actually achieved after such assessments. This study aims to explore which factors represent incentives, or disincentives, for consultants to implement suggestions for improvement from 360-degree feedback.
Medical Education | 2012
Janneke M. Frambach; Erik W. Driessen; Li-Chong Chan; Cees van der Vleuten
Medical Education 2012: 46: 738–747
Medical Teacher | 2013
Erik W. Driessen; Fedde Scheele
Workplace-based assessment is more commonly given a lukewarm than a warm welcome by its prospective users. In this article, we summarise the workplace-based assessment literature as well as our own experiences with it to derive lessons that can facilitate its acceptance in postgraduate specialty training. We propose shifting the emphasis in workplace-based assessment from assessing trainee performance to supporting trainee learning. Workplace-based assessment should concentrate less on assessment and testability and more on supporting supervisors in making entrustment decisions by complementing their "gut feeling" with information from assessments. One of the most stubborn problems with workplace-based assessment is the absence of observation of trainees and the lack of feedback based on those observations; non-standardised observations can be used to organise such feedback. To make these assessments meaningful for learning, it is essential that users do not perceive them as summative, that they provide narrative feedback for the learner, and that facilitation is available to help trainees integrate the feedback into their self-assessments.
Quality in Higher Education | 2007
Jan van Tartwijk; Erik W. Driessen; Cees van der Vleuten; Karel M. Stokking
This article describes factors that influence the successful introduction of portfolios. A portfolio is a purposeful collection of documents and other artefacts that together give an impression of how tasks were fulfilled and how competence has developed; it can also contain reflections and plans for future development. Although portfolios are often promoted as valuable instruments in innovative educational practices, their introduction in everyday education often leads to disappointment. Factors that influence success are the match between the purpose of the portfolio and its content and structure; the educational configuration in which the portfolio is introduced; the support of teachers, students and educational leaders; and the availability of an adequate infrastructure.
Academic Medicine | 2015
Joan Sargeant; Jocelyn Lockyer; Karen Mann; Eric S. Holmboe; Ivan Silver; Heather Armson; Erik W. Driessen; Tanya MacLeod; Wendy Yen; Kathryn Ross; Mary Power
Purpose: To develop and conduct feasibility testing of an evidence-based and theory-informed model for facilitating performance feedback for physicians, so as to enhance their acceptance and use of the feedback.
Method: To develop the feedback model (2011–2013), the authors drew on earlier research that highlights not only the factors that influence giving, receiving, accepting, and using feedback but also the theoretical perspectives that enable an understanding of these influences. The authors undertook an iterative, multistage, qualitative study guided by two recognized research frameworks: the UK Medical Research Council guidelines for studying complex interventions and realist evaluation. Using these frameworks, they conducted the research in four stages: (1) modeling, (2) facilitator preparation, (3) model feasibility testing, and (4) model refinement. They analyzed the data using content and thematic analysis and used the findings from each stage to inform the subsequent stage.
Results: Findings support the facilitated feedback model, its four phases of building a relationship, exploring reactions, exploring content, and coaching for performance change (R2C2), and the theoretical perspectives informing them. The findings contribute to understanding the elements that enhance recipients' engagement with, acceptance of, and productive use of feedback. Facilitators reported that the model made sense and that the phases generally flowed logically. Recipients reported that the feedback process was helpful and that they appreciated the reflection stimulated by the model and the coaching.
Conclusions: The theory- and evidence-based reflective R2C2 Facilitated Feedback Model appears stable and helpful in facilitating physicians' reflection on and use of formal performance assessment feedback.
Medical Teacher | 2015
C.P.M. van der Vleuten; Lambert Schuwirth; Erik W. Driessen; Marjan J. B. Govaerts; Sylvia Heeneman
Programmatic assessment is an integral approach to the design of an assessment program with the intent to optimise its learning function, its decision-making function and its curriculum quality-assurance function. Individual methods of assessment, purposefully chosen for their alignment with the curriculum outcomes and their information value for the learner, the teacher and the organisation, are seen as individual data points. The information value of these individual data points is maximised by giving feedback to the learner. There is a decoupling of assessment moment and decision moment. Intermediate and high-stakes decisions are based on multiple data points after a meaningful aggregation of information and supported by rigorous organisational procedures to ensure their dependability. Self-regulation of learning, through analysis of the assessment information and the attainment of the ensuing learning goals, is scaffolded by a mentoring system. Programmatic assessment-for-learning can be applied to any part of the training continuum, provided that the underlying learning conception is constructivist. This paper provides concrete recommendations for implementation of programmatic assessment.
JAMA | 2015
Lorette Stammen; Renée E. Stalmeijer; Emma Paternotte; Andrea Oudkerk Pool; Erik W. Driessen; Fedde Scheele; Laurents P. S. Stassen
Importance: Increasing health care expenditures are taxing the sustainability of the health care system. Physicians should be prepared to deliver high-value, cost-conscious care.
Objective: To understand the circumstances in which the delivery of high-value, cost-conscious care is learned, with a goal of informing the development of effective educational interventions.
Data Sources: PubMed, EMBASE, ERIC, and Cochrane databases were searched from inception until September 5, 2015, for studies combining learner-related and cost-related topics.
Study Selection: Studies were included on the basis of topic relevance, implementation of an intervention, evaluation of the intervention, educational components of the intervention, and an appropriate target group. There was no restriction on study design.
Data Extraction and Synthesis: Data extraction was guided by a merged and modified version of a Best Evidence in Medical Education abstraction form and a Cochrane data coding sheet. Articles were analyzed using the realist review method, a narrative review technique that focuses on understanding the underlying mechanisms of interventions. Recurrent patterns were identified in the data through thematic analysis, and the resulting themes were discussed within the research team until consensus was reached.
Main Outcomes and Measures: The main outcomes were factors that promote education in delivering high-value, cost-conscious care.
Findings: The initial search identified 2650 articles; 79 met the inclusion criteria, of which 14 were randomized clinical trials. The majority of the studies were conducted in North America (78.5%) using a pre-post interventional design (58.2%; at least 1619 participants); they focused on practicing physicians (36.7%; at least 3448 participants), resident physicians (6.3%; n = 516), and medical students (15.2%; n = 275). Among the 14 randomized clinical trials, 12 addressed knowledge transmission, 7 reflective practice, and 1 a supportive environment; 10 (71%) concluded that the intervention was effective. The data analysis suggested that 3 factors aid successful learning: (1) effective transmission of knowledge related, for example, to general health economics and prices of health services, to scientific evidence regarding guidelines and the benefits and harms of health care, and to patient preferences and personal values (67 articles); (2) facilitation of reflective practice, such as providing feedback or asking reflective questions about decisions related to laboratory ordering or prescribing, to give trainees insight into their past and current behavior (56 articles); and (3) creation of a supportive environment in which the organization of the health care system, the presence of role models delivering high-value, cost-conscious care, and a culture of high-value, cost-conscious care reinforce the desired training goals (27 articles).
Conclusions and Relevance: Research on educating physicians to deliver high-value, cost-conscious care suggests that learning by practicing physicians, resident physicians, and medical students is promoted by combining specific knowledge transmission, reflective practice, and a supportive environment. These factors should be considered when educational interventions are being developed.
Medical Education | 2012
Christopher Watling; Erik W. Driessen; Cees van der Vleuten; Meredith Vanstone; Lorelei Lingard
Medical Education 2012: 46: 593–603