Janke Cohen-Schotanus
University Medical Center Groningen
Publication
Featured research published by Janke Cohen-Schotanus.
Medical Teacher | 2007
Leo C. Aukes; Jelle Geertsma; Janke Cohen-Schotanus; Rein Zwierstra; Joris P. J. Slaets
Aim: Personal reflection is important for acquiring, maintaining and enhancing balanced medical professionalism. A new scale, the Groningen Reflection Ability Scale (GRAS), was developed to measure the personal reflection ability of medical students. Method: An explorative literature study was conducted to gather an initial pool of items. Item selection took place using qualitative and quantitative methods. Medical teachers screened the initial item pool on relevance, expert analysis was used to screen the fidelity to the criterion, and large samples of medical students and medical teachers were used to investigate the psychometric characteristics of the items. Finally, explorative factor analysis was used to investigate the structure of the scale. Results: The psychometric quality and content validity of the GRAS are satisfactory. The items cover three aspects of personal reflection: self-reflection, empathetic reflection and reflective communication. The 23-item scale proved to be easy to complete and to administer. Conclusion: The GRAS is a practical measurement instrument that yields reliable data that contribute to valid inferences about the personal reflection ability of medical students and doctors, both at individual and group level.
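The factor-analytic step described in this abstract can be sketched in a few lines of Python. The snippet below is a minimal illustration, assuming hypothetical item-level responses (the GRAS data themselves are not published with the abstract) and the open-source factor_analyzer package; the varimax rotation and the three-factor solution are illustrative assumptions that simply mirror the three reported aspects of reflection.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package: pip install factor_analyzer

def explore_scale_structure(items: pd.DataFrame, n_factors: int = 3):
    """Exploratory factor analysis of questionnaire items.

    `items` is a respondents-by-items DataFrame of Likert-type scores
    (hypothetical data; rotation and number of factors are assumptions,
    not settings reported in the abstract).
    """
    fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
    fa.fit(items)
    loadings = pd.DataFrame(
        fa.loadings_,
        index=items.columns,
        columns=[f"factor_{i + 1}" for i in range(n_factors)],
    )
    eigenvalues, _ = fa.get_eigenvalues()
    return loadings, eigenvalues
```

Inspecting which items load highly on each factor is what supports subscale labels such as self-reflection, empathetic reflection and reflective communication.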
Medical Education | 2009
Henk G. Schmidt; Janke Cohen-Schotanus; Lidia R. Arends
Objectives: We aimed to study the effects of active‐learning curricula on graduation rates of students and on the length of time needed to graduate.
Medical Education | 2008
Janke Cohen-Schotanus; Arno M. M. Muijtjens; Johanna Schönrock-Adema; Jelle Geertsma; Cees van der Vleuten
Objective: To test hypotheses regarding the longitudinal effects of problem‐based learning (PBL) and conventional learning relating to students’ appreciation of the curriculum, self‐assessment of general competencies, summative assessment of clinical competence and indicators of career development.
Medical Teacher | 2004
Cpm van der Vleuten; Lwt Schuwirth; Amm Muijtjens; Ajnm Thoben; Janke Cohen-Schotanus; Cpa Van Boven
The practice of assessment is governed by an interesting paradox. On the one hand, good assessment requires substantial resources which may exceed the capacity of a single institution, and we have reason to doubt the quality of our in-house examinations. On the other hand, our parsimony with regard to resources makes us reluctant to pool efforts and share our test material. This paper reports on an initiative to share test material across different medical schools. Three medical schools in The Netherlands have successfully set up a partnership for a specific testing method: progress testing. At present, these three schools collaboratively produce high-quality test items. The jointly produced progress tests are administered concurrently by these three schools and one other school, which buys the test. The steps taken in establishing this partnership are described and results are presented to illustrate the unique sort of information that is obtained by cross-institutional assessment. In addition, plans to improve test content and procedure and to expand the partnership are outlined. Eventually, the collaboration may even extend to other test formats. This article is intended to give evidence of the feasibility and exciting potential of between-school collaboration in test development and test administration. Our experiences have demonstrated that such collaboration has excellent potential to combine economic benefit with educational advantages which exceed what is achievable by individual schools.
Medical Education | 2006
Janke Cohen-Schotanus; Arno M. M. Muijtjens; Jan J. Reinders; Jessica Agsteribbe; Herman J. M. van Rossum; Cees van der Vleuten
Purpose: To ascertain whether the grade point average (GPA) of school‐leaving examinations is related to study success, career development and scientific performance. The problem of restriction of range was expected to be partially reduced due to the use of a national lottery system weighted in favour of students with higher GPAs.
Medical Teacher | 1999
Janke Cohen-Schotanus
Assessment influences cognitive and operant aspects of learning. Cognitive aspects of learning are influenced by the content of assessment: what and how students learn. Usually, students only study the educational objectives that will be assessed. If teachers want students to meet all the objectives of a curriculum, all these objectives have to be assessed. The operant aspects of learning concern when and how much students learn. This article discusses how teachers and faculty can influence the operant aspects of learning. Dutch research has shown that the planning of exams and the examination rules are strong factors influencing operant learning. Exams should be scheduled regularly, and the examination rules have to be fair but strict. The rule of thumb is that the stricter the rules, the better the examination results. The conclusion is that integrating these factors into assessment procedures will drive student learning in a positive way.
Medical Teacher | 2009
Johanna Schönrock-Adema; Marjolein Heijne-Penninga; Elisabeth A. van Hell; Janke Cohen-Schotanus
Background: The validation of educational instruments, in particular the use of factor analysis, can be improved in many instances. Aims: To demonstrate the superiority of a sophisticated method of factor analysis, which integrates recommendations from the factor analysis literature, over the limited applications of factor analysis that are often employed. We demonstrate the essential steps, focusing on the Postgraduate Hospital Educational Environment Measure (PHEEM). Method: The PHEEM was completed by 279 clerks. We performed Principal Component Analysis (PCA) with varimax rotation. A combination of three psychometric criteria was applied: the scree plot, eigenvalues >1.5 and a minimum percentage of additionally explained variance of approximately 5%. Furthermore, four interpretability criteria were used. Confirmatory factor analysis was performed to verify the original scale structure. Results: Our method yielded three interpretable and practically useful dimensions: learning content and coaching, beneficial affective climate and external regulation. Additionally, combining several criteria reduced the risk of overfactoring and underfactoring. Furthermore, the resulting dimensions corresponded with three learning functions essential to high-quality learning, thus strengthening our findings. Confirmatory factor analysis disproved the original scale structure. Conclusions: Our sophisticated approach yielded several advantages over methods applied in previous validation studies. Therefore, we recommend this method in validation studies to achieve best practice.
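As a rough illustration of how the numeric retention criteria mentioned above can be combined, the sketch below applies the eigenvalue (>1.5) and additional-explained-variance (≈5%) rules to the correlation matrix of item responses. The function name and input data are hypothetical; the scree plot and the four interpretability criteria from the study still require human judgement and are not automated here.

```python
import numpy as np

def numeric_retention_criteria(responses, eigen_threshold=1.5, min_added_variance=0.05):
    # responses: respondents x items array of questionnaire scores (hypothetical data)
    corr = np.corrcoef(responses, rowvar=False)
    eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]   # principal component eigenvalues, descending
    added_variance = eigenvalues / eigenvalues.sum()        # proportion of variance explained per component
    # keep only components that satisfy both numeric criteria
    keep = (eigenvalues > eigen_threshold) & (added_variance >= min_added_variance)
    return int(keep.sum()), eigenvalues, added_variance
```

Combining several criteria in this way is what the authors argue protects against overfactoring and underfactoring compared with relying on a single rule such as eigenvalues >1.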
Medical Education | 2007
Johanna Schönrock-Adema; Marjolein Heijne-Penninga; Marijtje van Duijn; Jelle Geertsma; Janke Cohen-Schotanus
Objectives: To examine whether peer assessment can enhance scores on professional behaviour, with the expectation that students who assess peers score more highly on professional behaviour than students who do not assess peers.
Medical Education | 2008
Elisabeth A. van Hell; Jan B. M. Kuks; Johanna Schönrock-Adema; Mirjam T. van Lohuizen; Janke Cohen-Schotanus
Context: Many students experience a tough transition from pre‐clinical to clinical training and previous studies suggest that this may constrict students’ progress. However, clear empirical evidence of this is lacking. The aim of this study was to determine: whether the perceived difficulty of transition influences student performance during the first 2 weeks of clerkships; whether it influences students’ overall performance in their first clerkship, and the degree to which the difficulty of transition is influenced by students’ pre‐clinical knowledge and skills levels.
Medical Teacher | 2010
Janke Cohen-Schotanus; Cees van der Vleuten
Background: Teachers involved in test development usually prefer criterion-referenced standard setting methods using panels. Since expert panels are costly, standards are often set by a pre-fixed percentage of questions answered correctly or by norm-referenced methods aimed at ranking examinees. Aim: To discuss the (dis)advantages of commonly used criterion- and norm-referenced methods and to present a new compromise method: standards based on a fixed cut-off score using the best scoring students as reference point. Methods: Historical data from 54 Maastricht (norm-referenced) and 52 Groningen (criterion-referenced) tests were used to demonstrate huge discrepancies and variability in cut-off scores and failure rates. Subsequently, the compromise model – known as Cohen's method – was applied to the Groningen tests. Results: The Maastricht norm-referenced method led to a large variation in required cut-off scores (15–46%), but a stable failure rate (about 17%). The Groningen method with a conventional, pre-fixed standard of 60% led to a large variation in failure rates (17–97%). The compromise method reduced variation in required cut-off scores as well as failure rates. Conclusion: Both the criterion- and norm-referenced standards used in practice have disadvantages. The proposed compromise model reduces the disadvantages of both methods and is considered more acceptable. Last but not least, compared to standard setting methods using panels, this method is affordable.
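A minimal sketch of the compromise idea follows, assuming the cut-off is defined as a fixed proportion (e.g. 60%) of the score reached by the best scoring students, approximated here by the 95th percentile of the cohort. Both numbers are illustrative assumptions for this sketch, not parameters stated in the abstract.

```python
import numpy as np

def compromise_cutoff(scores, reference_percentile=95, proportion=0.60):
    """Cut-off score relative to the best scoring students.

    `scores` holds one cohort's test scores as percentages correct
    (hypothetical data). The reference percentile and proportion are
    illustrative assumptions, not values reported in the abstract.
    """
    reference_score = np.percentile(scores, reference_percentile)
    return proportion * reference_score

# Example: if the 95th-percentile student scores 80% correct,
# the pass mark becomes 0.60 * 80 = 48% rather than a fixed 60%.
```

Because the reference point moves with test difficulty, the cut-off falls on hard tests and rises on easy ones, which is how the compromise model can reduce the variation in failure rates reported above without convening an expert panel.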