Clinton Golding
University of Otago
Publications
Featured research published by Clinton Golding.
Higher Education Research & Development | 2011
Clinton Golding
This paper presents one method for educating for critical thinking in Higher Education. It elaborates Richard Paul’s method of Socratic questioning to show how students can learn to be critical thinkers. This method combines and uses the wider pedagogical and critical thinking literature in a new way: it emphasises a thinking‐encouraging approach where the academic teacher scaffolds students to think for themselves, rather than leading them to understand a body of knowledge, and is based on isolating and articulating critical thinking by ‘reverse engineering’ the questions expert critical thinkers ask. The result of using this method is that students will be immersed in the practice of making critical judgements where they will hone their critical skills, cultivate a critical character and begin to speak, act and think like expert critical thinkers.
Teaching in Higher Education | 2014
Rob Wass; Clinton Golding
Vygotsky's Zone of Proximal Development (ZPD) provides an important understanding of learning, but its implications for teachers are often unclear or limited and could be further explored. We use conceptual analysis to sharpen the ZPD as a teaching tool, illustrated with examples from teaching critical thinking in zoology. Our conclusions are the following: teachers should assign tasks that students cannot do on their own, but which they can do with assistance; they should provide just enough assistance so that students learn to complete the tasks independently; and, finally, they can increase learning gains by providing learning environments that enable students to do harder tasks than would otherwise be possible and by assigning the hardest tasks students can do with assistance. This analysis provides a sharp and useful tool for supporting learning across all curriculum areas.
Educational Philosophy and Theory | 2011
Clinton Golding
Although constructivist discussions in the classroom are often treated as if they were all of the same kind, in this paper I argue that there are subtle but important distinctions that need to be made. An analysis of these distinctions shows that there is a continuum of different constructivist discussions. At one extreme are teacher‐directed discussions where students are led to construct the ‘correct’ understanding of a pre‐decided conclusion; at the other extreme are unstructured discussions where students are free to construct any understanding. While there are many positions on the continuum, the middle ground is occupied by discussions that find a balance between teacher‐control and student‐independence, and between having set‐answers and a free‐for‐all. I argue that the Community of Inquiry is a useful conception of constructivist discussions in this middle ground.
BMC Medical Education | 2014
Clare Delany; Clinton Golding
Background: Clinical reasoning is fundamental to all forms of professional health practice; however, it is also difficult to teach and learn because it is complex, tacit, and effectively invisible to students. In this paper we present an approach for teaching clinical reasoning based on making expert thinking visible and accessible to students. Methods: Twenty-one experienced allied health clinical educators from three tertiary Australian hospitals attended up to seven action research discussion sessions, where they developed a tentative heuristic of their own clinical reasoning, trialled it with students, evaluated whether it helped their students to reason clinically, and then refined it so the heuristic was targeted to developing each student's reasoning skills. Data included participants' written descriptions of the thinking routines they developed and trialled with their students and the transcribed action research discussion sessions. Content analysis was used to summarise this data and categorise themes about teaching and learning clinical reasoning. Results: Two overriding themes emerged from participants' reports about using the 'making thinking visible' approach. The first was a specific focus by participating educators on students' understanding of the reasoning process, and the second was a heightened awareness of personal teaching styles and approaches to teaching clinical reasoning. Conclusions: We suggest that the making thinking visible approach has the potential to help educators become more reflective about how they teach clinical reasoning, and that it acts as a scaffold that helps them articulate their own expert reasoning so that students can access and use it.
Assessment & Evaluation in Higher Education | 2016
Clinton Golding; Lee Adam
Many teachers in higher education use feedback from students to evaluate their teaching, but only some use these evaluations to improve their teaching. One important factor that makes the difference is the teacher's approach to their evaluations. In this article, we identify some useful approaches for improving teaching. We conducted focus groups with award-winning university teachers who use student evaluations to improve their teaching, and we identified how they approach their evaluation data. We found that these teachers take a reflective approach, aiming for constant improvement, and see their evaluation data as formative feedback, useful for improving learning outcomes for their students. We summarise this as the improvement approach, and we offer it for other teachers to emulate. We argue that if teachers take this reflective, formative, student-centred approach, they too can use student evaluations to improve their teaching, and that institutions should foster this approach to encourage more teachers to do so.
Assessment & Evaluation in Higher Education | 2015
Sharon Sharmini; Rachel Spronken-Smith; Clinton Golding; Tony Harland
In this article we explore how examiners assess a thesis that includes published work. An online survey was used to gather data on approaches to assessing publication-based theses (PBTs). The respondents were 62 supervisors who had experience examining PBTs across a range of disciplines at a research-intensive university in New Zealand. Nearly half of the respondents had examined ‘hybrid’ theses with papers inserted as chapters, 41% had examined theses with publications appended and 14% had examined PhDs by publication (i.e. papers alone). Twenty-nine per cent of examiners used their own extended set of criteria to assess PBTs, 48% found them easier to assess but 26% wanted more guidance. Our analysis also indicated that 86% of the examiners were highly influenced by publications in top-ranked journals and international peer-reviewed journals. Examiners' main concerns were the intellectual input of the candidate in any multi-authored publication and the coherence of the thesis. We recommend clearer guidelines for doctoral candidates, supervisors and examiners managing PBTs.
Assessment & Evaluation in Higher Education | 2014
Clinton Golding; Sharon Sharmini; Ayelet Lazarovitch
Although many articles have been written about thesis assessment, none provide a comprehensive, general picture of what examiners do as they assess a thesis. To synthesise this diverse literature, we reviewed 30 articles, triangulated their conclusions and identified 11 examiner practices. Thesis examiners tend to be broadly consistent in their practices and recommendations; they expect and want a thesis to pass, but first impressions are also very important. They read with academic expectations and the expectations of a normal reader. Like any reader, thesis examiners get annoyed and distracted by presentation errors, and they want to read a work that is a coherent whole. As academic readers, examiners favour a thesis with a convincing approach that engages with the literature and the findings, but they require a thesis to be publishable research. Finally, examiners give not only a final evaluation of a thesis, but also instruction and advice to improve the thesis and further publications and research. We hope that these generalisations will demystify the often secret process of assessing a thesis, and reassure, guide and encourage students as they write their theses.
Assessment & Evaluation in Higher Education | 2016
Adon Christian Michael Moskal; Sarah Stein; Clinton Golding
We know that various factors, such as institutional policies or staff beliefs, can influence how teaching staff engage with student evaluation. However, little research has investigated the influence of the technical processes of an evaluation system. In this article, we present a case study of the effects of changing the technical system for administering student evaluations at one New Zealand university. We develop a socio-technical model of the institutional evaluation system, and use this model to examine whether introducing an online system for ordering student feedback questionnaires and reducing processing time influenced academic staff engagement with evaluation. Survey responses, interview comments and data about ordering trends suggest the change did increase staff engagement by: (1) improving staff perceptions of evaluation and (2) increasing engaged behaviour, such as voluntarily ordering more evaluations. The outcomes of this study imply that the ‘practical implementation’ of an evaluation system is an important factor in influencing engagement with evaluation. We conclude that we can increase teacher engagement with evaluation simply by improving the ‘practical implementation’ of the evaluation system.
Higher Education Research & Development | 2013
Clinton Golding
In the field of higher education there are few places reserved for philosophical exploration, and this, I argue, can limit and distort the cartography. Higher education research tends to be framed as the empirical process of collecting and analysing qualitative or quantitative data. But this conception of research leaves little space for the philosophical, which is neither qualitative nor quantitative and neither collects nor analyses data. Even if we are open to philosophy, this conception of higher education research implies that philosophical, non-empirical, armchair research is either bad research or not research at all. So how should we frame the field of higher education? Macfarlane (2012) and Tight (2003) took a useful empirical route and described higher education research, but I want to take the more treacherous philosophical path and examine how it should be. I begin this path with the current state of higher education research (Sections 1–2) then move through how it could be more philosophical (Sections 3–6) and into how it should be more philosophical (Sections 7–9).
International Journal for Academic Development | 2014
Clinton Golding
Last year I collaborated with a group of colleagues to design alternative forms of academic development. We designed some useful alternatives based on whether we go to the staff or they come to us, whether we offer long-term programmes or short-term workshops, and whether the topics came from us or the staff. But our creativity was curtailed, and the range of alternatives restricted, because we thought academic development was restricted to offering opportunities for development to academics. As a result, we limited the methods we could employ, just as teachers limit the teaching methods they employ if they think teaching is restricted to ‘standing in front of students and imparting knowledge.’ Just as a broader conception of teaching can also include designing assessments and selecting readings, a broader conception of academic development could also include changing promotion policies or developing leaders in teaching and learning, not to mention a range of other possibilities we could not even imagine while we were blinkered, seeing ourselves purely as providers of development. The underlying problem, I think, is that our tacit definition of academic development restricted the means we could use to achieve our goals. Like most academic developers, we see professional development as central to what we do (Gosling, 2009). We support academic staff (Harland & Staniforth, 2008, p. 671) and help them to ‘be better teachers,’ to ‘develop’ and to ‘change their practice’ (Fraser, 2001, p. 57). The problem arose because we defined ourselves by what we did. Although it seemed reasonable, this definition hampered us unknowingly. I do not suggest that all academic developers share our particular limited conception. I do, however, suggest that all academic developers, regardless of their title, should be wary of defining themselves in a similar, limiting way. Similar limitations will arise whenever we define academic development narrowly by its means, by the actions we take, because this will restrict the means we can employ. We need to be careful to adopt broader definitions of academic development, based on our ends – for example, improved teachers, or improved learning for students. If we define ourselves in these broader terms, we can develop a larger and more innovative repertoire of means for achieving our goals, and thus will be more likely to achieve them. Gibbs (2013), for instance, lists a range of useful means that academic developers might use to accomplish the goal of educational change.