
Publication


Featured research published by Duncan David Nulty.


Assessment & Evaluation in Higher Education | 2008

The adequacy of response rates to online and paper surveys: what can be done?

Duncan David Nulty

This article is about differences between, and the adequacy of, response rates to online and paper‐based course and teaching evaluation surveys. Its aim is to provide practical guidance on these matters. The first part of the article gives an overview of online surveying in general, a review of data relating to survey response rates and practical advice to help boost response rates. The second part of the article discusses when a response rate may be considered large enough for the survey data to provide adequate evidence for accountability and improvement purposes. The article ends with suggestions for improving the effectiveness of evaluation strategy. These suggestions are: to seek to obtain the highest response rates possible to all surveys; to take account of probable effects of survey design and methods on the feedback obtained when interpreting that feedback; and to enhance this action by making use of data derived from multiple methods of gathering feedback.


Assessment & Evaluation in Higher Education | 2009

How to use (five) curriculum design principles to align authentic learning environments, assessment, students’ approaches to thinking and learning outcomes

N. M. Meyers; Duncan David Nulty

In this article, we articulate five principles of curriculum design and illustrate their application in a third‐year undergraduate course for environmental and ecological scientists. In this way, we provide a practical framework for others wishing to enhance their students’ learning. To apply the five principles, we created a learning environment consisting of a broad range of learning resources and activities which were structured and sequenced with an integrated assessment strategy. The combined effect of this ensured alignment between the learning environment we created, the thinking approaches students used and the learning outcomes they achieved. More specifically, the assessment activities guided students by requiring them to recognise when their understanding was limited – and then to engage them in thinking approaches that would develop their understanding further. By providing a framework of thoughts, ideas and information, we sought to progressively enhance the sophistication of our learners’ thinking. Thus, the assessment required students to integrate, synthesise and construct their understandings in ways consistent with the discipline and the professional pathways on which they had embarked. We intend that this illustration will act as a guide to other academics to adopt the same principles in their teaching.


Nurse Education Today | 2009

The objective structured clinical examination (OSCE): optimising its value in the undergraduate nursing curriculum.

Marion Mitchell; Amanda Henderson; Michele Groves; Megan Dalton; Duncan David Nulty

This article explores the use of the objective structured clinical examination (OSCE) in undergraduate nursing education. The advantages and limitations of this assessment approach are discussed and various applications of the OSCE are described. Attention is given to the complexities of evaluating some psychosocial competency components. The issues are considered in an endeavour to delineate the competency components, or skill sets, that best lend themselves to assessment by the OSCE. We conclude that OSCEs can be used most effectively in nurse undergraduate curricula to assess safe practice in terms of performance of psychomotor skills, as well as the declarative and schematic knowledge associated with their application. OSCEs should be integrated within a curriculum in conjunction with other relevant student evaluation methods.


Assessment & Evaluation in Higher Education | 2011

Peer and self‐assessment in the first year of university

Duncan David Nulty

This paper reviews the literature about peer and self‐assessment in university courses from the point of view of their use, and the suitability of their use, in the first year of university study. The paper is divided into three parts. The first part argues that although first‐year students are involved in many of the studies that report on the use of peer and self‐assessment in higher education, the proportion of these studies that do so is somewhat less than in other year levels. In addition, relatively little of this work directly and explicitly discusses the suitability of peer and self‐assessment for students and courses at this year level. The second part of the paper provides an introductory exploration of the relationship between peer and self‐assessment, and specific features of first‐year assessment, learning and teaching. Three issues relating directly to the suitability of peer and self‐assessment in the first year are explored. In the third part, the paper briefly discusses the desirability of implementing peer and self‐assessment, in general, before seeking to extend this specifically to the first year. The paper concludes by recommending that greater use can and should be made of peer and self‐assessment in the first year of university study.


Assessment & Evaluation in Higher Education | 2009

Promoting and recognising excellence in the supervision of research students: an evidence‐based framework

Duncan David Nulty; Margaret Kiley; N. M. Meyers

One issue universities face is the need to demonstrate excellence in postgraduate research supervision at the individual, faculty and university level. While poor supervision might become obvious over time, with grievances, withdrawals and poor completion times and rates, this paper focuses specifically on identifying and demonstrating supervisory excellence. Currently, the amount and range of evidence used to support claims of supervisory excellence tends to be limited, leaving supervisors, faculties and institutions in a position where demonstrating excellence remains difficult. This paper proposes two inter‐dependent ideas which, considered together, help to redress this problem. The first is a ‘map’ for the collection and use of evidence of supervisory excellence. The second is a ‘template’ for a ‘supervisory excellence report’. The ‘map’ details the organisational elements, uses of data, and data types which can be considered. The ‘report’ explains one simple and potent way to organise and present these data for multiple purposes. Together they constitute a much‐needed framework for promoting and recognising excellence in the supervision of research students.


Nurse Education Today | 2013

An implementation framework for using OSCEs in nursing curricula

Amanda Henderson; Duncan David Nulty; Marion Mitchell; Carol Jeffrey; Michelle Kelly; Michele Groves; Pauline Glover; Sabina Knight

The implementation framework outlined in this paper has been developed from feedback from a trial across three different nursing and midwifery programmes and is designed to assist educators to incorporate OSCEs within their curricula. There is value in flagging the pedagogical principles embodied in the framework and alerting educators to their importance for more meaningful student learning. For each step, practical advice is provided, contributing to the utility of this approach. Considerations are systematic, ensuring that the use of OSCEs in health care curricula assures judicious use of resources to achieve desired student outcomes.


Journal of Clinical Nursing | 2015

Critical factors about feedback: ‘They told me what I did wrong; but didn't give me any feedback’

Michele Groves; Marion Mitchell; Amanda Henderson; Carol Jeffrey; Michelle Kelly; Duncan David Nulty

Aim
This study reports nursing and midwifery undergraduate and postgraduate students’ perceptions of feedback during their participation in a performance-based learning activity: either an Objective Structured Clinical Examination (OSCE) for patient assessment or a simulation focussed on communication skills.

Background
Providing feedback to students is critical to learning. The definition and process of giving feedback has progressed significantly since its initial conception as simply advising learners whether an answer to a test item was right or wrong (Kulhavy 1977). Feedback is now conceived more broadly and used throughout the learning process. By providing students with a snapshot of their current ability together with advice, feedback helps to define learning goals more clearly, increases achievement and influences learning style (Sadler 1989). Feedback cultivates reflective practice and develops expertise (Albanese 2006). This is especially so in work-based learning, where the provision of immediate feedback on performance can particularly enhance applied learning. The nature of feedback varies widely and includes formative assessment by teachers and peers, and summative assessment required for academic progression. The most effective feedback is constructive: it should focus on the task being assessed, include strengths as well as weaknesses of performance, and suggest strategies for performance improvement. However, its effectiveness also depends on factors such as format, timing and the perceived expertise of the provider (Hattie & Timperley 2007, Murdoch-Eaton & Sargeant 2012). Additionally, receptiveness to, and the type of feedback preferred, varies with the maturity and life experience of the learner; for example, beginning medical students have indicated a preference for positive, reassuring feedback, whereas senior students preferred immediate verbal feedback (Murdoch-Eaton & Sargeant 2012).

Design
Student perceptions of feedback were collected across four educational settings: two undergraduate nursing programmes, one undergraduate midwifery programme and a postgraduate course for rural and remote healthcare nurses, where students’ learning was centred on a practice-based activity, either an OSCE or a simulation session. The OSCE consisted of one scenario that required students to undertake an integrated patient assessment, while the simulation session focussed on communication skills, with students alternately playing the roles of patient, carer, nurse, etc. In all settings, the activities were for formative assessment and students received feedback from teaching staff. Additionally, students were encouraged to organise informal practice sessions and obtain peer feedback.

Method
Data were collected via open-ended questions on student surveys (n = 557) and student participation in focus group discussions (n = 91) within one week of participation in the learning activity. Thematic analysis was conducted on text from surveys and transcripts of focus group discussions.

Results
Overall, students found the feedback they received to be beneficial to their learning regardless of their role in the practice-based activity or whether they received individual or group feedback. However, three specific themes emerged from the data analysis, relating to the value of feedback for learning, students’ perception of the nature of feedback, and the need for consistency in giving feedback (see Table 1):
1 The value of feedback for learning. Students appreciated receiving detail regarding the positive aspects of their practice and areas in which they could improve. However, there was variable appreciation of peer feedback by students, some of whom felt that their colleagues’ lack of expertise limited the opportunity for effective learning.
2 Limited understanding of what constitutes feedback. There was evidence of limited understanding by some students about what actually constitutes feedback. This included the perception that feedback is always positive and different to simply correcting mistakes; another was that quantity was more important than quality. A small minority of the 557 students commented that they could only learn through ‘doing’ rather than ‘observing’, and that feedback given to others in a group setting was, by definition, not applicable to them. Students in year one indicated that they were only informed about what they were doing wrong. They valued positive feedback in the form of reassurance rather than negative comments.
3 Issues to do with consistency in the quality and delivery of feedback. Some of the student comments indicated that there were differences in how staff gave feedback during teaching sessions. Sometimes this resulted in conflicts or contradictions in the performance of techniques. Students highlighted the need for a uniform approach to teaching and giving feedback.

Conclusion
These findings provide important insights into perceptions of feedback, its effectiveness in promoting learning, student perceptions of what feedback is, and their receptiveness to different types of feedback, specifically in clinical practice situations. In particular, they support other recent work identifying that students at the beginning of their course of study understand feedback as positive affirmation. This is in contrast to more experienced and postgraduate students, who value detailed statements about how they can improve (Murdoch-Eaton & Sargeant 2012).

Relevance to clinical practice
• Both staff and students need to have a common understanding of the nature and various forms of feedback. In addition to the well-recognised features of quality feedback (timely, specific, constructive and the like), this study has highlighted the need to address two underpinning issues before embarking on the feedback process, namely: ○ That all teaching staff should be trained to give consistent and effective feedback in a way that will be most useful for students; and ○ That information should be provided to students about what constitutes feedback and how it can best be used to improve learning.
• The negative view of peer feedback by some students suggests that adequate preparation is important and should carefully consider: ○ The purpose of the feedback (e.g. to be used following expert feedback to check technique, or as revision in the lead-up to summative assessment); ○ Where and how it is to be given.

Disclosure
The authors have confirmed that all authors meet the ICMJE criteria for authorship credit (www.icmje.org/ethical_1author.html), as follows: (1) substantial contributions to conception and design of, or acquisition of data or analysis and interpretation of data, (2) drafting the article or revising it critically for important intellectual content, and (3) final approval of the version to be published.

Funding & Ethics
This project was funded by the Australian Government through the Office of Learning and Teaching, Department of Education, Employment and Workplace Relations. Ethical approval was obtained from the Ethics Review Committees of all participating institutions.


Journal of Teaching in Travel & Tourism | 2009

Facilitating deep learning in an information systems course through application of curriculum design principles

Glen Matthew Hornby; Gayle Ruth Jennings; Duncan David Nulty

This article reports on the incremental improvement of assessment, learning, and teaching activities in a large first-year undergraduate course. The changes, made over 3 years, resulted in the implementation of a student-centered (though individual) assessment strategy that included students in developing and applying the assessment criteria themselves. The outcome was a student-centered course design that required students to engage in deep approaches to learning. Using an action research framework, Meyers and Nulty’s (2008) five curriculum design principles for facilitating deep approaches to learning (the development of which was guided by Biggs’s (2003) 3P model) are used to illustrate how the course was incrementally improved to facilitate deep learning approaches. The article provides an illustration of how others may pursue similar curriculum design improvements adapted for their own contexts.


Archive | 2015

Aligning Student Attitudes, Assessment, and Curriculum Design: A Case Study Using the “My Life as a Musician” Vocational Preparation Strand

Diana Tolmie; Duncan David Nulty

Following a review of the Bachelor of Music program, the Queensland Conservatorium Griffith University introduced a vocational preparation strand, My Life as a Musician (MLaaM), in 2011. This is a sequence of compulsory courses offered for one semester each year for the duration of the Bachelor of Music (three or four years) and Bachelor of Music Technology (three years) degrees. It includes a suite of tasks ranging from identifying personal career development and planning, and small-to-medium business enterprise skills, to creative entrepreneurship and new venture management.


Nurse Education Today | 2011

Best Practice Guidelines for use of OSCEs: Maximising value for student learning

Duncan David Nulty; Marion Mitchell; Carol Jeffrey; Amanda Henderson; Michele Groves

Collaboration

Top co-authors of Duncan David Nulty:

Amanda Henderson (Princess Alexandra Hospital)

Michele Groves (University of Queensland)

Marion Mitchell (Princess Alexandra Hospital)

Leonie Marjorie Short (Queensland University of Technology)