
Publications


Featured research published by Felicitas-Maria Lahner.


Medical Teacher | 2017

Factors influencing the educational impact of Mini-CEX and DOPS: A qualitative synthesis

Andrea Carolin Lörwald; Felicitas-Maria Lahner; Robert Greif; Christoph Berendonk; John J. Norcini; Sören Huwendiek

Introduction: The educational impact of Mini-CEX and DOPS varies greatly and can be influenced by several factors. However, there is no comprehensive analysis and synthesis of the described influencing factors.

Methods: To fill this gap, we chose a two-step approach. First, we performed a systematic literature review and selected articles describing influencing factors on the educational impact of Mini-CEX and DOPS. Second, we performed a qualitative synthesis of these factors.

Results: Twelve articles were included, which revealed a model consisting of four themes and nine subthemes as influencing factors. The theme context comprises “time for Mini-CEX/DOPS” and “usability of the tools”, and influences the users. The theme users comprises “supervisors’ knowledge about how to use Mini-CEX/DOPS”, “supervisors’ attitude to Mini-CEX/DOPS”, “trainees’ knowledge about Mini-CEX/DOPS”, and “trainees’ perception of Mini-CEX/DOPS”. These influence the implementation of Mini-CEX and DOPS, including “observation” and “feedback”. The theme implementation directly influences the theme outcome, which, in addition to the educational impact, encompasses “trainees’ appraisal of feedback”.

Conclusions: Our model of influencing factors might help to further improve the use of Mini-CEX and DOPS and serve as a basis for future research.


PLOS ONE | 2018

The educational impact of Mini-Clinical Evaluation Exercise (Mini-CEX) and Direct Observation of Procedural Skills (DOPS) and its association with implementation: A systematic review and meta-analysis

Andrea Carolin Lörwald; Felicitas-Maria Lahner; Zineb Miriam Nouns; Christoph Berendonk; John J. Norcini; Robert Greif; Sören Huwendiek

Introduction: Mini-Clinical Evaluation Exercise (Mini-CEX) and Direct Observation of Procedural Skills (DOPS) are used as formative assessments worldwide. Since an up-to-date comprehensive synthesis of the educational impact of Mini-CEX and DOPS is lacking, we performed a systematic review. Moreover, as the educational impact might be influenced by characteristics of the setting in which Mini-CEX and DOPS take place or their implementation status, we additionally investigated these potential influences.

Methods: We searched Scopus, Web of Science, and Ovid, including All Ovid Journals, Embase, ERIC, Ovid MEDLINE(R), and PsycINFO, for original research articles investigating the educational impact of Mini-CEX and DOPS on undergraduate and postgraduate trainees from all health professions, published in English or German from 1995 to 2016. Educational impact was operationalized and classified using Barr’s adaptation of Kirkpatrick’s four-level model. Where applicable, outcomes were pooled in meta-analyses, separately for Mini-CEX and DOPS. To examine potential influences, we used Fisher’s exact test for count data.

Results: We identified 26 articles demonstrating heterogeneous effects of Mini-CEX and DOPS on learners’ reactions (Kirkpatrick Level 1) and positive effects of Mini-CEX and DOPS on trainees’ performance (Kirkpatrick Level 2b; Mini-CEX: standardized mean difference (SMD) = 0.26, p = 0.014; DOPS: SMD = 3.33, p < 0.001). No studies were found on higher Kirkpatrick levels. Regarding potential influences, we found two implementation characteristics, “quality” and “participant responsiveness”, to be associated with the educational impact.

Conclusions: Despite the limited evidence, the meta-analyses demonstrated positive effects of Mini-CEX and DOPS on trainee performance. Additionally, we revealed implementation characteristics to be associated with the educational impact. Hence, we assume that considering implementation characteristics could increase the educational impact of Mini-CEX and DOPS.
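The two analytic techniques named in the abstract, pooling standardized mean differences and testing count data with Fisher's exact test, can be illustrated with a minimal Python sketch. This is a generic illustration under placeholder numbers, not the authors' analysis code; the study labels, effect sizes, variances, and counts below are hypothetical.

```python
# Illustrative sketch only: generic inverse-variance pooling of standardized mean
# differences (SMDs) and a Fisher's exact test on a 2x2 count table. All numbers
# are hypothetical placeholders, not data from the review.
import math
from scipy.stats import fisher_exact

# Hypothetical per-study SMDs with their variances (e.g. Hedges' g and var(g)).
studies = [
    ("Study A", 0.20, 0.04),
    ("Study B", 0.35, 0.09),
    ("Study C", 0.15, 0.02),
]

# Fixed-effect inverse-variance pooling: weight each study by 1 / variance.
weights = [1.0 / var for _, _, var in studies]
pooled_smd = sum(w * smd for (_, smd, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
print(f"pooled SMD = {pooled_smd:.2f} (SE = {pooled_se:.2f})")

# Fisher's exact test on hypothetical counts, e.g. studies with/without a given
# implementation characteristic versus positive effect / no effect reported.
table = [[6, 2],   # characteristic present: positive effect, no effect
         [3, 7]]   # characteristic absent:  positive effect, no effect
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher's exact test: OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```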


Medical Teacher | 2018

The Authors reply: Factors influencing the educational impact of mini-CEX and DOPS

Andrea Carolin Lörwald; Daniel Bauer; Felicitas-Maria Lahner; Robert Greif; Christoph Berendonk; John J. Norcini; Sören Huwendiek

We thank Evans and Lambrinudi for sharing their thoughts and ideas on our article “Factors influencing the educational impact of Mini-CEX and DOPS: A qualitative synthesis” (Lörwald et al. 2017). One aspect in their letter struck us as particularly noteworthy. The authors observe how in the program in which they are enrolled, students regularly reflect on their clinical encounters from their first year of training, until this reflection seemingly becomes almost second nature to them. They then hypothesize that this “entrenched need to reflect on all experiences” improves their attitudes and perceptions toward completing assessments such as Mini-CEX and DOPS. There are, in fact, hints in the literature that support their hypothesis. Back in 1986, Dweck studied how motivational processes affect learning (Dweck 1986). She found that one’s goal orientation determines one’s learning behavior. She distinguished between two kinds of goal orientation: people with a performance goal orientation aim to receive positive judgments and avoid criticism; people with a learning goal orientation instead aim to increase their competence. This translates to the assumption that students with a learning goal orientation persistently seek new challenges, while students with a performance goal orientation might prefer easy tasks for their workplace-based assessment in order to demonstrate their competency and avoid new challenges. Reflecting upon clinical encounters might foster students’ learning goal orientation, encourage them to seek new challenges and welcome feedback, and increase the impact of Mini-CEX and DOPS (Sargeant et al. 2009; Ramani et al. 2018). We would like to congratulate the authors, their teachers and their program directors for having achieved a program that fosters reflection and nourishes students’ feedback-seeking behavior.


Medical Education | 2018

When predicting item difficulty, is it better to ask authors or reviewers?

Claudia Kiessling; Andreas Winkelmann; Felicitas-Maria Lahner; Daniel Bauer

What questions do we need to ask now in order to move the OES towards the collective vision? What relevant and credible information is needed to answer these questions?

Step 5 pulled together the relevant information and a ‘learning huddle’ was held with members of the core team. The learning huddle had three objectives: (i) to make meaning of the information collected, (ii) to discuss what the core team is noticing or sensing in the system as a result of the work of the OES (emergent outcomes) and (iii) to identify any new or relevant questions that the core team thought important to ask in order to move the OES towards the collective vision. Step 6 involved ‘refreshing’ the evaluation based on the learning huddle discussion and Step 7 was the sharing of the evaluation findings to date with relevant stakeholder groups.

What lessons were learned? The advantages of using this process include gaining a comprehensive understanding of the true value of the OES, making informed strategic decisions with confidence, and growing capacity within the department for learning and inquiry. That said, this evaluation approach should only be used in situations where the following are true: the primary focus of the work is programme growth and improvement (rather than solely accountability); there is flexibility to change or adapt the programme; there is a core team of people who can commit to the process, as it requires dedication over a long period of time; and the culture of the organisation is one that is open to making mistakes and learning from them.


Archive | 2016

Stabile Antwortmuster bei Script Concordance Test Fragen in der Schweizer Facharztprüfung Allgemeine Innere Medizin

Daniel Stricker; Felicitas-Maria Lahner; Raphael Bonvin; Christoph Berendonk

Research question: The Script Concordance Test (SCT) is intended to assess clinical reasoning ability [1], [2]. However, the question format has been criticized, among other reasons, because its measurement reliability is difficult to verify; in particular, figures on test-retest reliability are lacking [3]. The aim of the present study is to examine the stability of response patterns to SCT questions.
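As a generic illustration of the test-retest idea behind this research question (hypothetical numbers, not data from the Swiss examination), stability of response patterns can be summarized as the correlation between scores on the same SCT items across two administrations:

```python
# Minimal sketch, not from the study: test-retest stability estimated as the
# Pearson correlation between item scores from two administrations of the same
# SCT questions. The score vectors are hypothetical placeholders.
import numpy as np

scores_time1 = np.array([0.8, 0.6, 1.0, 0.4, 0.7, 0.9])  # per-item scores, first sitting
scores_time2 = np.array([0.7, 0.6, 0.9, 0.5, 0.8, 0.9])  # same items, second sitting

test_retest_r = np.corrcoef(scores_time1, scores_time2)[0, 1]
print(f"test-retest correlation r = {test_retest_r:.2f}")
```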


Archive | 2015

Developing an alternative response format for the script concordance test

Felicitas-Maria Lahner; Zineb Miriam Nouns; Sören Huwendiek

Introduction: Clinical reasoning is essential for the practice of medicine. Theories of the development of medical expertise hold that clinical reasoning starts from analytical processes, namely the storage of isolated facts and the logical application of the ‘rules’ of diagnosis. Learners then successively develop so-called semantic networks and illness scripts, which are ultimately used in an intuitive, non-analytic fashion [1], [2]. The script concordance test (SCT) is one instrument for assessing clinical reasoning [3]. However, the aggregate scoring of the SCT [3] is recognized as problematic [4]: it leads to logical inconsistencies, it is likely to reflect construct-irrelevant differences in examinees’ response styles, and the expert panel judgments may introduce unintended measurement error [4]. This PhD project addresses the following research questions: 1. What would a format look like that assesses clinical reasoning (similar to the SCT) with multiple true-false questions or other formats with unambiguous correct answers, thereby addressing the above-mentioned pitfalls in the traditional scoring of the SCT? 2. How well does this format fulfil the Ottawa criteria for good assessment, with special regard to educational and catalytic effects [5]?

Methods: In a first study, focus groups or interviews with assessment experts and students will be used to assess whether a new format using multiple true-false items to assess clinical reasoning, similar to the SCT format, can be designed in a theoretically and practically sound fashion. In a second study, using focus groups and psychometric data from a real assessment, Norcini and colleagues’ Criteria for Good Assessment [5] will be evaluated for the new format. Furthermore, the scoring method for this new format will be optimized using real and simulated data.
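The aggregate scoring criticized above conventionally awards each response option partial credit in proportion to how many panel experts selected it, with the modal option earning full credit. The following is a minimal sketch of that conventional scoring idea using hypothetical panel responses; it is not the project's instrument or data.

```python
# Sketch of conventional SCT aggregate scoring (hypothetical panel data): each
# response option earns credit proportional to the number of panel experts who
# chose it, and the modal option earns full credit (1.0).
from collections import Counter

def sct_item_key(panel_answers):
    """Map each response option (e.g. a -2..+2 Likert value) to its partial credit."""
    counts = Counter(panel_answers)
    modal_count = max(counts.values())
    return {option: n / modal_count for option, n in counts.items()}

panel = [1, 1, 1, 0, 2, 1, 0, -1, 1, 1]   # ten hypothetical panel members' answers
key = sct_item_key(panel)
print(key)                  # {1: 1.0, 0: 0.33..., 2: 0.16..., -1: 0.16...}
print(key.get(0, 0.0))      # credit for an examinee who answered 0
```

An examinee who selects an option that no panelist chose receives zero credit, which illustrates why response formats with unambiguous correct answers are attractive as an alternative.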


Medical Teacher | 2018

Influences on the implementation of Mini-CEX and DOPS for postgraduate medical trainees’ learning: A grounded theory study

Andrea Carolin Lörwald; Felicitas-Maria Lahner; Bettina Mooser; Martin Perrig; Matthias Widmer; Robert Greif; Sören Huwendiek


Advances in Health Sciences Education | 2018

Multiple true–false items: a comparison of scoring algorithms

Felicitas-Maria Lahner; Andrea Carolin Lörwald; Daniel Bauer; Zineb Miriam Nouns; René Krebs; Sissel Guttormsen; Martin R. Fischer; Sören Huwendiek
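The specific scoring algorithms compared in this article are not reproduced on this page. As a generic illustration of two widely used rules for scoring a multiple true-false item (hypothetical item, not taken from the paper), consider dichotomous versus partial-credit scoring:

```python
# Generic illustration of two common MTF scoring rules (not necessarily the exact
# algorithms compared in the paper): dichotomous scoring (all statements must be
# marked correctly to earn the item point) versus partial credit (proportion of
# correctly marked statements).
def dichotomous_score(answers, key):
    return 1.0 if answers == key else 0.0

def partial_credit_score(answers, key):
    return sum(a == k for a, k in zip(answers, key)) / len(key)

key     = [True, True, False, True]   # hypothetical item with four statements
answers = [True, True, False, False]  # examinee marks three of four correctly

print(dichotomous_score(answers, key))     # 0.0
print(partial_credit_score(answers, key))  # 0.75
```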


Archive | 2017

Zuverlässigkeit von Bestehens-/Nichtbestehensentscheidungen bei Multiple Choice Prüfungen: konditionale Reliabilität vs Cronbachs Alpha

Felicitas-Maria Lahner; Andrea Carolin Lörwald; Sissel Guttormsen; Martin R. Fischer; Sören Huwendiek
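For context on the comparison named in the title: Cronbach's alpha is a single test-level coefficient computed from item variances and the variance of total scores, whereas a conditional reliability analysis examines measurement error at particular score points such as the pass/fail cut score. A minimal sketch of the alpha computation, on a hypothetical response matrix not taken from the study:

```python
# Minimal sketch of Cronbach's alpha on a hypothetical 0/1 response matrix:
# alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).
import numpy as np

# rows = examinees, columns = items (1 = correct, 0 = incorrect)
responses = np.array([
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 1],
    [1, 1, 1, 0, 1],
])

k = responses.shape[1]
item_variances = responses.var(axis=0, ddof=1)
total_variance = responses.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```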


Archive | 2017

Socio-cultural influences on residents' learning with Mini-CEX and DOPS: a grounded theory study

Andrea Carolin Lörwald; Felicitas-Maria Lahner; Bettina Mooser; Christoph Berendonk; Matthias Widmer; Martin Perrig; Robert Greif; Sören Huwendiek
