Publication


Featured research published by Shane Dawson.


Computers & Education | 2010

Mining LMS data to develop an early warning system for educators: A proof of concept

Leah P. Macfadyen; Shane Dawson

Earlier studies have suggested that higher education institutions could harness the predictive power of Learning Management System (LMS) data to develop reporting tools that identify at-risk students and allow for more timely pedagogical interventions. This paper confirms and extends this proposition by providing data from an international research project investigating which student online activities accurately predict academic achievement. Analysis of LMS tracking data from a Blackboard Vista-supported course identified 15 variables demonstrating a significant simple correlation with student final grade. Regression modelling generated a best-fit predictive model for this course which incorporates key variables such as total number of discussion messages posted, total number of mail messages sent, and total number of assessments completed and which explains more than 30% of the variation in student final grade. Logistic modelling demonstrated the predictive power of this model, which correctly identified 81% of students who achieved a failing grade. Moreover, network analysis of course discussion forums afforded insight into the development of the student learning community by identifying disconnected students, patterns of student-to-student communication, and instructor positioning within the network. This study affirms that pedagogically meaningful information can be extracted from LMS-generated student tracking data, and discusses how these findings are informing the development of a customizable dashboard-like reporting tool for educators that will extract and visualize real-time data on student engagement and likelihood of success.
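
To make the modelling pipeline concrete, here is a minimal sketch, not the authors' code, of fitting a logistic model to LMS activity counts and checking how many failing students it recovers. The three features are the key predictors named above; the data are synthetic and the numbers are illustrative assumptions, not the study's Blackboard Vista dataset.

```python
# Minimal sketch (assumed, synthetic data): logistic regression on LMS counts.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
# Hypothetical per-student counts: discussion messages posted, mail messages
# sent, assessments completed (the key variables named in the abstract).
X = rng.poisson(lam=[20, 10, 5], size=(n, 3)).astype(float)
# Synthetic labels: lower overall activity -> higher probability of failing.
p_fail = 1 / (1 + np.exp(0.15 * X.sum(axis=1) - 4))
y = rng.binomial(1, p_fail)  # 1 = failing grade

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)

# Recall on the failing class parallels the paper's headline figure
# (81% of failing students correctly identified by their model).
print("recall on failing students:", recall_score(y_te, model.predict(X_te)))
```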


Learning Analytics and Knowledge | 2014

Current state and future trends: a citation network analysis of the learning analytics field

Shane Dawson; Dragan Gašević; George Siemens; Srećko Joksimović

This paper provides an evaluation of the current state of the field of learning analytics through analysis of articles and citations occurring in the LAK conferences and identified special issue journals. The emerging field of learning analytics sits at the intersection of numerous academic disciplines, and therefore draws on a diversity of methodologies, theories and underpinning scientific assumptions. Through citation analysis and structured mapping we aimed to identify emerging trends and the disciplinary hierarchies that have influenced the development of the field to date. The results suggest that there is some fragmentation between the major disciplines (computer science and education) regarding conference and journal representation. The analyses also indicate that the most commonly cited papers are of a conceptual rather than empirical nature, reflecting the need for authors to define the learning analytics space. An evaluation of the current state of learning analytics offers numerous benefits for the development of the field, such as directing attention to under-represented areas of research and identifying disciplines that may require more strategic and targeted support and funding opportunities.
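
As a rough illustration of the citation-analysis step, the sketch below builds a directed citation graph and ranks papers by citations received, both directly and via PageRank. The paper identifiers are invented; this is not the authors' dataset or pipeline.

```python
# Minimal sketch of citation network analysis with invented paper IDs.
import networkx as nx

# Edge (a, b) means "paper a cites paper b".
citations = [
    ("lak12_p1", "siemens_2011"), ("lak12_p2", "siemens_2011"),
    ("lak13_p1", "siemens_2011"), ("lak13_p1", "baker_2009"),
    ("lak13_p2", "baker_2009"),
]
G = nx.DiGraph(citations)

# In-degree counts direct citations; PageRank additionally credits papers
# cited by well-cited papers, a common refinement in citation studies.
by_citations = sorted(G.in_degree(), key=lambda kv: kv[1], reverse=True)
by_pagerank = sorted(nx.pagerank(G).items(), key=lambda kv: kv[1], reverse=True)
print(by_citations[:2])
print(by_pagerank[:2])
```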


American Behavioral Scientist | 2013

Informing Pedagogical Action: Aligning Learning Analytics With Learning Design

Lori Lockyer; Elizabeth Heathcote; Shane Dawson

This article considers the developing field of learning analytics and argues that to move from small-scale practice to broad scale applicability, there is a need to establish a contextual framework that helps teachers interpret the information that analytics provides. The article presents learning design as a form of documentation of pedagogical intent that can provide the context for making sense of diverse sets of analytic data. We investigate one example of learning design to explore how broad categories of analytics—which we call checkpoint and process analytics—can inform the interpretation of outcomes from a learning design and facilitate pedagogical action.
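
The checkpoint/process distinction lends itself to a small illustration. The sketch below is a hedged reading of the two categories using invented event records and an invented design expectation, not material from the article's case study.

```python
# Illustrative sketch only: checkpoint vs. process analytics on invented logs.
from datetime import datetime

# (student, action, resource, timestamp) from a hypothetical LMS log.
events = [
    ("s1", "viewed", "week1_reading", datetime(2024, 3, 1)),
    ("s1", "posted", "forum_week1", datetime(2024, 3, 2)),
    ("s2", "posted", "forum_week1", datetime(2024, 3, 2)),
]

# Checkpoint analytics: did each student reach a milestone the design expects?
deadline = datetime(2024, 3, 3)
reached = {s for s, a, r, t in events if r == "week1_reading" and t <= deadline}
print("missed reading checkpoint:", {"s1", "s2"} - reached)

# Process analytics: what does the pattern of activity look like over time?
posts = {}
for s, a, r, t in events:
    if a == "posted":
        posts[s] = posts.get(s, 0) + 1
print("forum posting activity:", posts)
```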


British Journal of Educational Technology | 2010

‘Seeing’ the learning community: An exploration of the development of a resource for monitoring online student networking

Shane Dawson

The trend to adopt more online technologies continues unabated in the higher education sector. This paper elaborates the means by which such technologies can be employed for pedagogical purposes beyond simply providing virtual spaces for bringing learners together. It shows how data about student ‘movement’ within and across a learning community can be captured and analysed for the purposes of making strategic interventions in the learning of ‘at risk’ students in particular, through the application of social network analysis to the engagement data. The study set out in the paper indicates that online technologies bring with them an unprecedented opportunity for educators to visualise changes in student behaviour and learning network composition, including the interventions teachers make in those networks over time. To date, these evaluative opportunities have been beyond the reach of the everyday practitioner; they can now be integrated into every teaching and learning plan.
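
A minimal sketch of the underlying technique, assuming a toy reply network rather than real course data: social network analysis over forum interactions surfaces disconnected students and shows where communication is brokered.

```python
# Sketch with invented names: SNA over a forum reply network.
import networkx as nx

replies = [("amy", "ben"), ("ben", "amy"), ("amy", "cho"), ("cho", "ben")]
enrolled = ["amy", "ben", "cho", "dan", "eve"]  # dan and eve never posted

G = nx.Graph()
G.add_nodes_from(enrolled)
G.add_edges_from(replies)

print("disconnected students:", list(nx.isolates(G)))
print("degree (connectedness):", dict(G.degree()))
# Betweenness highlights who brokers communication, e.g. where an
# instructor sits within the learning network.
print("betweenness:", nx.betweenness_centrality(G))
```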


International Journal of Educational Management | 2010

Harnessing ICT Potential: The Adoption and Analysis of ICT Systems for Enhancing the Student Learning Experience

Shane Dawson; Liz Heathcote; Gary Poole

Purpose – This paper aims to examine how effective higher education institutions have been in harnessing the data capture mechanisms from their student information systems, learning management systems and communication tools for improving the student learning experience and informing practitioners of the achievement of specific learning outcomes. The paper argues that the future of analytics in higher education lies in the development of more comprehensive and integrated systems that add value to the student learning experience. Design/methodology/approach – Literature regarding the trend for greater accountability in higher education is reviewed in terms of its implications for greater “user driven” direction. In addition, IT usage within higher education and contemporary usage of data captured from various higher education systems are examined and compared with common commercial applications to suggest how higher education managers and teachers can gain a greater understanding of the student cohort and personalise and enhance the learning experience, much as commercial entities have done for their client base. A way forward for higher education is proposed. Findings – If the multiple means by which students engage with university systems are considered, it is possible to track individual activity throughout the entire student life cycle: from initial admission, through course progression, to graduation and employment transitions. The combined data captured by these systems build a detailed picture of the activities that students, instructors, service areas and the institution as a whole undertake, and can be used to improve relevance, efficiency and effectiveness in a higher education institution. Originality/value – The paper outlines how academic analytics can be used to better inform institutions about their students' learning support needs, and provides examples of IT automation that may allow student user-information to be translated into a personalised and semi-automated support system for students.


International Journal of Educational Management | 2006

Learning Communities: An Untapped Sustainable Competitive Advantage for Higher Education

Shane Dawson; Bruce M. Burnett; Mark O'Donohue

Purpose – This paper demonstrates the need for the higher education sector to develop and implement scalable, quantitative measures that evaluate community and establish organisational benchmarks in order to guide the development of future practices designed to enhance the student learning experience. Design/methodology/approach – Literature regarding contemporary Australian higher education policy and community development is critiqued to illustrate the need for universities to adopt scalable quantitative measures to evaluate stated strategic imperatives and establish organisational benchmarks. The integration of organisational benchmarks guides the implementation of future practices designed to enhance the student learning experience. A currently active exemplar methodology is discussed to demonstrate applicability to both higher education administrators and teaching staff across the various organisational levels. Findings – While universities are promoting and investing in the concept of community to enhance the student learning experience, there are, as yet, limited scalable evaluative measures and performance indicators to guide practitioners. This paper proposes an effective measurement tool to benchmark current pedagogical performance standards and to monitor the progress and achievement of future practices designed to enhance the sense of community experienced by the student cohort. Originality/value – This paper identifies and addresses the current absence of effective scalable evaluative measures to assess the achievement of stated strategic imperatives implemented as a consequence of reduced government financial support, increasing accountability, and rising student expectations resulting from educational consumerism.


Learning Analytics and Knowledge | 2011

Learning designs and learning analytics

Lori Lockyer; Shane Dawson

Government- and institutionally-driven reforms focused on quality teaching and learning in universities emphasize the importance of developing replicable, scalable teaching approaches that can be evaluated. In this context, learning design and learning analytics are two fields of research that may help university teachers design quality learning experiences for their students, evaluate how students are learning within that intended learning context, and support personalized learning experiences for students. Learning designs are ways of describing an educational experience such that it can be applied across a range of disciplinary contexts. Learning analytics offers new approaches to investigating the data associated with a learner's experience. This paper explores the relationship between learning designs and learning analytics.


Learning Analytics and Knowledge | 2015

Penetrating the black box of time-on-task estimation

Vitomir Kovanović; Dragan Gašević; Shane Dawson; Srećko Joksimović; Ryan S. Baker; Marek Hatala

All forms of learning take time. There is a large body of research suggesting that the amount of time spent on learning can improve its quality, as represented by academic performance. The widespread adoption of learning technologies such as learning management systems (LMSs) has resulted in large amounts of data about student learning being readily accessible to educational researchers. One common use of this data is to measure the time that students have spent on different learning tasks (i.e., time-on-task). Given that an LMS typically captures only the times at which students executed various actions, time-on-task measures must be estimated from the recorded trace data. LMS trace data has been used extensively in learning analytics studies, yet the problem of time-on-task estimation is rarely described in detail and its consequences are not fully examined. This paper presents the results of a study that examined the effects of different time-on-task estimation methods on the results of commonly adopted analytical models. The primary goal of this paper is to raise awareness of the issues of accuracy and appropriateness surrounding time-on-task estimation within the broader learning analytics community, and to initiate a debate about the challenges of this process. The paper also provides an overview of time-on-task estimation methods in educational and related research fields.
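
The core estimation problem can be shown in a few lines. This sketch sums the gaps between consecutive logged actions and discards gaps above a cutoff; the 30-minute cutoff is an illustrative assumption, not a value from the paper, whose point is precisely that such choices change downstream results.

```python
# Sketch of time-on-task estimation from trace-data timestamps (assumed data).
from datetime import datetime, timedelta

# Timestamps of one student's actions in a hypothetical LMS log.
events = [
    datetime(2024, 3, 1, 9, 0),
    datetime(2024, 3, 1, 9, 12),
    datetime(2024, 3, 1, 9, 20),
    datetime(2024, 3, 1, 14, 0),   # long gap: probably not on task
    datetime(2024, 3, 1, 14, 5),
]

CUTOFF = timedelta(minutes=30)  # illustrative session-timeout choice

def time_on_task(timestamps, cutoff):
    """Sum inter-event gaps, discarding gaps longer than the cutoff."""
    gaps = (b - a for a, b in zip(timestamps, timestamps[1:]))
    return sum((g for g in gaps if g <= cutoff), timedelta())

print(time_on_task(events, CUTOFF))  # 0:25:00 under these assumptions
```

Varying the cutoff, or imputing a fixed duration for the final action of a session, yields different estimates from the same log, which is the sensitivity the study examines.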


Learning Analytics and Knowledge | 2016

A conceptual framework linking learning design with learning analytics

Aneesha Bakharia; Linda Corrin; Paula de Barba; Gregor Kennedy; Dragan Gašević; Raoul A. Mulder; David A. Williams; Shane Dawson; Lori Lockyer

In this paper we present a learning analytics conceptual framework that supports enquiry-based evaluation of learning designs. The dimensions of the proposed framework emerged from a review of existing analytics tools, the analysis of interviews with teachers, and user scenarios exploring what types of analytics would be useful in evaluating a learning activity in relation to pedagogical intent. The proposed framework incorporates various types of analytics, with the teacher playing a key role in bringing context to the analysis and making decisions on the feedback provided to students as well as the scaffolding and adaptation of the learning design. The framework consists of five dimensions: temporal analytics, tool-specific analytics, cohort dynamics, comparative analytics and contingency. Specific metrics and visualisations are defined for each dimension of the conceptual framework. Finally, the development of a tool that partially implements the conceptual framework is discussed.
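
As a speculative sketch only, the five dimensions could be encoded as a simple mapping; the example metrics below are our illustrative stand-ins, not the specific metrics and visualisations the paper defines.

```python
# Speculative encoding of the framework's five dimensions (examples assumed).
FRAMEWORK = {
    "temporal analytics": "activity counts per week of the study period",
    "tool-specific analytics": "forum reply depth, quiz attempt counts",
    "cohort dynamics": "interaction network metrics across the cohort",
    "comparative analytics": "this offering's metrics vs. a prior offering",
    "contingency": "teacher-defined triggers for adapting the design",
}
for dimension, example_metric in FRAMEWORK.items():
    print(f"{dimension}: {example_metric}")
```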


Learning Analytics and Knowledge | 2016

Recipe for success: lessons learnt from using xAPI within the connected learning analytics toolkit

Aneesha Bakharia; Kirsty Kitto; Abelardo Pardo; Dragan Gašević; Shane Dawson

An ongoing challenge for Learning Analytics research has been the scalable derivation of user interaction data from multiple technologies. The complexities associated with this challenge are increasing as educators embrace an ever-growing number of social and content-related technologies. The Experience API (xAPI), alongside the development of user-specific record stores, has been touted as a means to address this challenge, but a number of subtle considerations must be made when using xAPI in Learning Analytics. This paper provides a general overview of the complexities and challenges of using xAPI in a general systemic analytics solution, the Connected Learning Analytics (CLA) toolkit. The importance of design is emphasised, as is the notion of common vocabularies and xAPI Recipes. Early decisions about vocabularies and structural relationships between statements can serve to either facilitate or handicap later analytics solutions. The CLA toolkit case study provides a way of examining both the strengths and the weaknesses of the current xAPI specification, and we conclude with a proposal for how xAPI might be improved by using JSON-LD to formalise Recipes in a machine-readable form.
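
For readers unfamiliar with xAPI, the sketch below shows the actor/verb/object statement structure the specification defines, and the kind of recipe constraint the paper discusses. The verb IRI is a registered ADL verb; the object URL and the recipe check are illustrative assumptions, not the CLA toolkit's actual vocabulary.

```python
# Sketch of an xAPI statement and a toy "recipe" check (assumed vocabulary).
import json

statement = {
    "actor": {"mbox": "mailto:student@example.edu", "name": "A Student"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/commented",
        "display": {"en-US": "commented"},
    },
    "object": {
        "id": "http://example.edu/forum/thread/42",
        "definition": {"name": {"en-US": "Week 3 discussion thread"}},
    },
}

# A recipe fixes which verb/object combinations a given activity must use,
# so statements emitted by different tools stay comparable downstream.
FORUM_COMMENT_RECIPE = {"verb": "http://adlnet.gov/expapi/verbs/commented"}

def follows_recipe(stmt, recipe):
    return stmt["verb"]["id"] == recipe["verb"]

print(follows_recipe(statement, FORUM_COMMENT_RECIPE))
print(json.dumps(statement, indent=2))
```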

Collaboration


Dive into Shane Dawson's collaborations.

Top Co-Authors

Negin Mirriahi (University of New South Wales)
Oleksandra Poquet (University of South Australia)
Aneesha Bakharia (Queensland University of Technology)
Erica McWilliam (Queensland University of Technology)