Debra Kiegaldie
Monash University
Publications
Featured research published by Debra Kiegaldie.
BMC Medical Education | 2016
Christina Johnson; Jennifer L. Keating; David Boud; Megan Dalton; Debra Kiegaldie; Margaret Hay; Barry P. McGrath; Wendy A. McKenzie; Kichu Nair; Debra Nestel; Claire Palermo; Elizabeth Molloy
Background: Health professions education is characterised by work-based learning and relies on effective verbal feedback. However, the literature reports problems in feedback practice, including a lack of both learner engagement and explicit strategies for improving performance. It is not clear what constitutes high-quality, learner-centred feedback or how educators can promote it. We aimed to enhance feedback in clinical practice by distinguishing the elements of an educator's role in feedback considered to influence learner outcomes, and then developing descriptions of observable educator behaviours that exemplify them.
Methods: An extensive literature review was conducted to identify (i) information substantiating specific components of an educator's role in feedback asserted to have an important influence on learner outcomes, and (ii) verbal feedback instruments in health professions education that may describe important educator activities in effective feedback. This information was used to construct a list of elements thought to be important in effective feedback. Based on these elements, descriptions of observable educator behaviours that represent effective feedback were developed and refined during three rounds of a Delphi process and a face-to-face meeting with experts across the health professions and education.
Results: The review identified more than 170 relevant articles (spanning the health professions, education, psychology and business literature) and ten verbal feedback instruments in health professions education (plus modified versions). Eighteen distinct elements of an educator's role in effective feedback were delineated. Twenty-five descriptions of educator behaviours that align with the elements were ratified by the expert panel.
Conclusions: This research clarifies the distinct elements of an educator's role in feedback considered to enhance learner outcomes. The corresponding set of observable educator behaviours aims to describe how an educator could engage, motivate and enable a learner to improve. This creates the foundation for developing a method to systematically evaluate the impact of verbal feedback on learner performance.
Medical Education | 2017
Jessica Kaplonyi; Kelly-Ann Bowles; Debra Nestel; Debra Kiegaldie; Stephen Maloney; Terry P. Haines; Cylie Williams
Effective communication skills are at the core of good health care. Simulated patients (SPs) are increasingly engaged as an interactive means of teaching, applying and practising communication skills with immediate feedback. There is a large body of research into the use of manikin‐based simulation but a gap exists in the body of research on the effectiveness of SP‐based education to teach communication skills that impact patient outcomes. The aim of this systematic review was to critically analyse the existing research, investigating whether SP‐based communication skills training improves learner–patient communication, how communication skill improvement is measured, and who measures these improvements.
The Clinical Teacher | 2011
Geoff White; Debra Kiegaldie
Background: This article is a response to expressions of concern from a range of sources, including reports of curriculum redesign to accommodate the characteristics of Gen Y, claims made in the press and concerns expressed by educators in the health professions. Are these concerns grounded in research, and, if so, how can educators in the health professions respond?
The Clinical Teacher | 2016
Samantha Lee Sevenhuysen; Terry P. Haines; Debra Kiegaldie; Elizabeth Molloy
Editors’ note: Many different approaches to collaborative and peer-assisted learning (CPAL), in which students and trainees learn together, and may also teach each other, have been described in the health care professional education literature. There is considerable evidence that CPAL can enhance learning, confidence, and interpersonal and teaching skills, but such approaches often require careful planning, resourcing and support to be implemented in a way that maximises these potential benefits, and minimises risk. In this Toolbox article, the authors draw on their experience of implementing CPAL approaches in undergraduate health professional education, and offer advice, suggestions and practical examples for other clinical teachers who want to implement new or enhance existing CPAL initiatives. Using a recognised model of behavioural change, they encourage readers to: assess and develop their capability for CPAL through reflection on learner prior experiences, training and resources; create opportunities for learners to take part in CPAL by selecting the most appropriate format, structure and activities for peer interaction; and motivate learners to engage by ensuring effective orientation, guidance, and continued evaluation and development. We hope that clinical teachers will be able to apply these insights and examples in their own context, and that as a result more students and trainees will learn collaboratively, and will develop their professional collaboration, feedback and teaching skills in this way.
The Clinical Teacher | 2014
Alana Gilbee; Julie Baulch; Michelle Theresa Leech; Michelle Rose Levinson; Debra Kiegaldie; Kerry Lee Hood
Opportunities for interprofessional learning (IPL) and the promotion of interprofessional (IP) communication at the undergraduate level are important goals of health science faculties. IPL activities with shared curriculum validity to promote full student engagement can be challenging to identify. Case presentations that focus on patient‐centred learning are one type of activity that is likely to have clinical relevance to all undergraduate groups. Guiding students and facilitators on this approach using a structured framework is necessary to maximise the desired IPL outcomes.
BMJ Open | 2016
Cylie Williams; Kelly-Ann Bowles; Debra Kiegaldie; Stephen Maloney; Debra Nestel; Jessica Kaplonyi; Terry P. Haines
Introduction: Simulation-based education (SBE) is now commonly used across health professional disciplines to teach a range of skills. The evidence base supporting the effectiveness of this approach for improving patient health outcomes is relatively narrow, focused mainly on the development of procedural skills. However, there are other simulation approaches used to support non-procedure-specific skills that are in need of further investigation. This cluster, cross-over randomised controlled trial with a concurrent economic evaluation (cost per fall prevented) will evaluate the effectiveness, cost-effectiveness and student experience of health professional students undertaking simulation training for the prevention of falls among hospitalised inpatients. This research will target the students within the established undergraduate student placements of Monash University medicine, nursing and allied health across Peninsula Health acute and subacute inpatient wards.
Methods and analysis: The intervention will train the students to provide the Safe Recovery program, the only single-intervention approach demonstrated to reduce falls in hospitals. This will involve redevelopment of the Safe Recovery program into a one-to-many participant SBE program, so that groups of students learn the communication skills and falls prevention knowledge necessary for delivery of the program. The primary outcome of this research will be patient falls across participating inpatient wards, with secondary outcomes including student satisfaction with the SBE and knowledge gain, ward-level practice change, and the cost of acute/rehabilitation care for each patient measured using clinical costing data.
Ethics and dissemination: The Human Research Ethics Committees of Peninsula Health (LRR/15/PH/11) and Monash University (CF15/3523-2015001384) have approved this research. The participant information and consent forms provide information on privacy, storage of results and dissemination. This trial has been registered with the Australian and New Zealand Clinical Trials Registry: ACTRN12615000817549. This study protocol has been prepared according to the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) checklist.
Trial registration number: ACTRN12615000817549; Pre-results.
Journal of Educational Multimedia and Hypermedia | 2006
Debra Kiegaldie; Geoff White
EdMedia: World Conference on Educational Media and Technology | 2004
Debra Kiegaldie; Geoff White
Journal of Interprofessional Education and Practice | 2016
Debra Kiegaldie; Elizabeth Pryor; Stuart Marshall; Dean Everard; Rick Iedema; Simon Craig; Alana Gilbee
EdMedia: World Conference on Educational Media and Technology 2004 | 2004
Thomas Henry Jeavons; Debra Kiegaldie; Ralph Arwas; Wayne Sturrock