
Publications


Featured research published by Doug Clow.


Teaching in Higher Education | 2013

An overview of learning analytics

Doug Clow

Learning analytics, the analysis and representation of data about learners in order to improve learning, is a new lens through which teachers can understand education. It is rooted in the dramatic increase in the quantity of data about learners and linked to management approaches that focus on quantitative metrics, which are sometimes antithetical to an educational sense of teaching. However, learning analytics offers new routes for teachers to understand their students and, hence, to make effective use of their limited resources. This paper explores these issues and describes a series of examples of learning analytics to illustrate the potential. It argues that teachers can and should engage with learning analytics as a way of influencing the metrics agenda towards richer conceptions of learning and to improve their teaching.


Learning Analytics and Knowledge | 2015

Examining engagement: analysing learner subpopulations in massive open online courses (MOOCs)

Rebecca Ferguson; Doug Clow

Massive open online courses (MOOCs) are now being used across the world to provide millions of learners with access to education. Many learners complete these courses successfully, or to their own satisfaction, but the high numbers who do not finish remain a subject of concern for platform providers and educators. In 2013, a team from Stanford University analysed engagement patterns on three MOOCs run on the Coursera platform. They found four distinct patterns of engagement that emerged from MOOCs based on videos and assessments. However, not all platforms take this approach to learning design. Courses on the FutureLearn platform are underpinned by a social-constructivist pedagogy, which includes discussion as an important element. In this paper, we analyse engagement patterns on four FutureLearn MOOCs and find that only two clusters identified previously apply in this case. Instead, we see seven distinct patterns of engagement: Samplers, Strong Starters, Returners, Mid-way Dropouts, Nearly There, Late Completers and Keen Completers. This suggests that patterns of engagement in these massive learning environments are influenced by decisions about pedagogy. We also make some observations about approaches to clustering in this context.
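The abstract above does not specify the clustering pipeline; as a purely illustrative sketch of the general approach (the weekly-activity features, the use of k-means, and all numbers below are assumptions, not the paper's actual method), each learner can be represented as a vector of per-week activity counts and grouped into engagement patterns:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means over equal-length activity vectors."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        # Recompute each centroid as the mean of its cluster's points.
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = tuple(sum(xs) / len(cl) for xs in zip(*cl))
    return centroids, clusters

# Hypothetical weekly step-completion counts for a three-week course:
learners = [
    (9, 8, 9), (10, 9, 8),   # steady, high activity throughout
    (8, 2, 0), (9, 1, 0),    # high start, then drop-off
    (2, 0, 0), (1, 0, 0),    # brief sampling, then nothing
]
centroids, clusters = kmeans(learners, k=3)
```

Centroid profiles such as high-and-flat, high-then-falling, or near-zero would correspond loosely to patterns like Keen Completers, Strong Starters and Samplers in the paper's terminology.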


Learning Analytics and Knowledge | 2011

iSpot analysed: participatory learning and reputation

Doug Clow; Elpida Makriyannis

We present an analysis of activity on iSpot, a website supporting participatory learning about wildlife through social networking. A sophisticated and novel reputation system provides feedback on the scientific expertise of users, allowing users to track their own learning and that of others, in an informal learning context. We find steeply unequal long-tail distributions of activity, characteristic of social networks, and evidence of the reputation system functioning to amplify the contribution of accredited experts. We argue that there is considerable potential to apply such a reputation system in other participatory learning contexts.
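The abstract does not detail how iSpot's reputation system actually works; a minimal sketch of the general idea it describes — agreements weighted by the agreeing user's reputation, so that accredited experts amplify an identification — might look like this (all names and weights are invented for illustration):

```python
def identification_score(agreeing_users, reputation):
    """Sum of agreement votes, each weighted by the voter's reputation.

    Unknown users fall back to a baseline weight of 1.0.
    """
    return sum(reputation.get(user, 1.0) for user in agreeing_users)

# Hypothetical reputations: an accredited expert carries a large weight.
reputation = {"expert_a": 10.0, "novice_b": 1.0, "novice_c": 1.2}

# Two competing identifications for the same wildlife observation:
score_sparrow = identification_score(["novice_b", "novice_c"], reputation)
score_dunnock = identification_score(["expert_a"], reputation)
```

Under such a scheme a single expert agreement can outweigh several novice agreements, which is the amplification effect the paper reports.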


Open Learning: The Journal of Open and Distance Learning | 2010

Facing the challenge in evaluating technology use in mobile environments

Patrick McAndrew; Josie Taylor; Doug Clow

The process of developing innovative mobile approaches to informal and formal learning is challenging, not least in needing to satisfy stakeholders with diverse interests in the technology, the pedagogy and the overall system. Some approaches to evaluation may focus on examining the nature and quality of learning that occurs, while other methods may take a user-centred approach to understand interactions with the systems. In this paper we highlight a methodology that attempts to address these two analytical issues in parallel, and to communicate the results to stakeholders. The methodology is grounded in cultural historical activity theory and is compatible with other emerging views that such evaluation can operate at multiple levels. The method applies task analysis to examine the conflicts that emerge when learners are interacting with technological systems in an informal learning setting. Results from a trial involving first-aiders are used to illustrate the techniques as they were applied as part of a European project that developed a collaborative mobile learning environment. The method has been repeated in other studies and is suggested to provide a valuable tool for reflecting on understanding and sharing perspectives on evaluation outcomes.


British Journal of Educational Technology | 2004

The evolutionary design of a Knowledge Network to support knowledge management and sharing for lifelong learning

Patrick McAndrew; Doug Clow; Josie Taylor; James Aczel

Knowledge Management (KM) and knowledge sharing are important factors that support lifelong learning, and enable people to continue developing throughout their careers. The concept of a Community of Practice (Wenger, 2000) is attractive in drawing together people whose work shares similar aspects, and consideration is given here to how technology can be used to develop and support such a community. In this paper, concepts from the Community of Practice literature are used to consider the development of a software environment for people working as a community in the area of lifelong learning. The intention was to design the system in an evolutionary way, using a minimal set of essential elements which would be elaborated according to user feedback. Three key design questions are considered: Who can contribute resources to such a system? What happens to existing practices? How is the community engaged? We conclude that, in lifelong learning, knowledge management supported by a software environment offers a good way to bring together communities, resources and experience, but to achieve these benefits, great care needs to be taken when introducing the system and in maintaining existing work practices.


EC-TEL | 2015

Moving Through MOOCS: Pedagogy, Learning Design and Patterns of Engagement

Rebecca Ferguson; Doug Clow; Russell Beale; Alison J. Cooper; Neil P. Morris; Siân Bayne; Amy Woodgate

Massive open online courses (MOOCs) are part of the lifelong learning experience of people worldwide. Many of these learners participate fully. However, the high levels of dropout on most of these courses are a cause for concern. Previous studies have suggested that there are patterns of engagement within MOOCs that vary according to the pedagogy employed. The current paper builds on this work and examines MOOCs from different providers that have been offered on the FutureLearn platform. A cluster analysis of these MOOCs shows that engagement patterns are related to pedagogy and course duration. Learners did not work through a three-week MOOC in the same ways that they worked through the first three weeks of an eight-week MOOC.


Archive | 2016

Research Evidence on the Use of Learning Analytics: Implications for Education Policy

Rebecca Ferguson; Andrew Brasher; Doug Clow; Adam Cooper; Garron Hillaire; Jenna Mittelmeier; Bart Rienties; Thomas Daniel Ullmann; Riina Vuorikari

Learning analytics is an emergent field of research that is growing fast. It takes advantage of the last decade of e-learning implementations in education and training as well as of research and development work in areas such as educational data mining, web analytics and statistics. In recent years, increasing numbers of digital tools for the education and training sectors have included learning analytics to some extent, and these tools are now in the early stages of adoption. This report reviews early uptake in the field, presenting five case studies and an inventory of tools, policies and practices. It also provides an Action List for policymakers, practitioners, researchers and industry members to guide work in Europe.


Learning Analytics and Knowledge | 2014

Setting learning analytics in context: overcoming the barriers to large-scale adoption

Rebecca Ferguson; Doug Clow; Leah P. Macfadyen; Alfred Essa; Shane Dawson; Shirley Alexander

Once learning analytics have been successfully developed and tested, the next step is to implement them at a larger scale -- across a faculty, an institution or an educational system. This introduces a new set of challenges, because education is a stable system, resistant to change. Implementing learning analytics at scale involves working with the entire technological complex that exists around technology-enhanced learning (TEL). This includes the different groups of people involved -- learners, educators, administrators and support staff -- the practices of those groups, their understandings of how teaching and learning take place, the technologies they use and the specific environments within which they operate. Each element of the TEL Complex requires explicit and careful consideration during the process of implementation, in order to avoid failure and maximise the chances of success. In order for learning analytics to be implemented successfully at scale, it is crucial to provide not only the analytics and their associated tools but also appropriate forms of support, training and community building.


Learning Analytics and Knowledge | 2014

Data wranglers: human interpreters to help close the feedback loop

Doug Clow

Closing the feedback loop to improve learning is at the heart of good learning analytics practice. However, the quantity of data, and the range of different data sources, can make it difficult to take systematic action on that data. Previous work in the literature has emphasised the need for and value of human meaning-making in the process of interpreting data to transform it into actionable intelligence. This paper describes a programme of human Data Wranglers deployed at the Open University, UK, charged with making sense of a range of data sources related to learning, analysing that data in the light of their understanding of practice in individual faculties/departments, and producing reports that summarise the key points and make actionable recommendations. Evaluation of, and experience with, this programme of work strongly support the value of human meaning-makers in the learning analytics process, and suggest that barriers to organisational change in this area can be mitigated by embedding learning analytics work within strategic contexts, and working at an appropriate level and granularity of analysis.


Active Learning in Higher Education | 2000

Critical thinking exercises for chemists: Are they subject-specific?

John Garratt; Tina Overton; Jane Tomlinson; Doug Clow

Important thinking skills for professional chemists include ‘analysing and evaluating arguments’, ‘making judgements’, ‘retrieving information’ and ‘experimenting’. A considerable literature provides evidence that these skills can be learned (and therefore taught). We have devised specific exercises to help students to develop these skills. Our exercises are grounded in chemistry and designed to be addressed by students working in groups in a classroom environment (sometimes in a computer classroom). The type of exercise and the classroom environment promote vigorous discussion which involves critical thinking and leads to effective learning. This article describes the exercises and argues that, while the specific examples are subject-specific, the approach used with all the types of exercise could be adapted to create subject-specific exercises for any discipline.
