Jeremie Seanosky
Athabasca University
Publications
Featured research published by Jeremie Seanosky.
ICSLE | 2015
David Boulanger; Jeremie Seanosky; Vive Kumar; Kinshuk; Karthikeyan Panneerselvam; Thamarai Selvi Somasundaram
A smart learning environment (SLE) is characterized by the key provision of personalized learning experiences. To approach different degrees of personalization in online learning, this paper introduces a framework called SCALE that tracks fine-grained learning experiences and translates them into opportunities for custom feedback. A prototype version of the SCALE system has been used in a study to track the habits of novice programmers. Growth in the coding competencies of first-year engineering students has been captured in a continuous manner. Students have been provided with customized feedback to optimize their learning path in programming. This paper describes key aspects of our research with the SCALE system and highlights results of the study.
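As a rough illustration of the fine-grained tracking described above (SCALE's internals are not detailed in this abstract), the sketch below uses hypothetical event and tracker types to show how timestamped coding events could be summarized into a per-concept competency signal and turned into feedback.

```python
# Illustrative sketch only: hypothetical event model for fine-grained tracking
# of coding activity, loosely inspired by the SCALE description above.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class CodingEvent:
    student_id: str
    timestamp: datetime
    concept: str          # e.g. "loops", "conditionals"
    success: bool         # did the checked attempt succeed?

@dataclass
class CompetencyTracker:
    events: List[CodingEvent] = field(default_factory=list)

    def record(self, event: CodingEvent) -> None:
        self.events.append(event)

    def competency(self, student_id: str, concept: str) -> float:
        """Fraction of successful attempts on a concept; 0.0 if no attempts."""
        attempts = [e for e in self.events
                    if e.student_id == student_id and e.concept == concept]
        if not attempts:
            return 0.0
        return sum(e.success for e in attempts) / len(attempts)

    def feedback(self, student_id: str, concept: str) -> str:
        score = self.competency(student_id, concept)
        if score < 0.5:
            return f"Review {concept}: recent attempts suggest difficulty."
        return f"Good progress on {concept}; try a harder exercise."

tracker = CompetencyTracker()
tracker.record(CodingEvent("s1", datetime.now(), "loops", False))
tracker.record(CodingEvent("s1", datetime.now(), "loops", True))
print(tracker.feedback("s1", "loops"))
```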
ICSLE | 2015
Jeremie Seanosky; David Boulanger; Vivekanandan Suresh Kumar; Kinshuk
Educational applications, in general, treat disparate study threads as a singular entity, bundle pedagogical intervention and other student support services at a coarse level, and summatively assess the final products of assessments. In this research, we propose an analytics framework that closely monitors individual threads of study habits and assesses them individually to trace the learning processes leading into assessment products. We developed customized interventions to target specific skills and nurture optimal study habits. The framework has been implemented in a system called SCALE (Smart Causal Analytics on LEarning). SCALE enables the tracking of students’ individual study threads towards multiple final study products. The large volume, wide variety, and incessant flow of data place our work in the realm of big data analytics. We conducted a preliminary study using SCALE. The results show the ability of the system to track the evolution of competencies. We propose that explicitly supporting the development of a targeted set of competencies is one of the key tenets of Smart Learning Environments.
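To make the thread-level monitoring idea concrete, here is a minimal sketch with made-up event fields and a hypothetical thread-to-product mapping; it only illustrates grouping observed study events by thread and relating each thread to the assessment products it feeds.

```python
# Minimal sketch (hypothetical structure): grouping observed study events into
# separate "study threads", each feeding one or more final assessment products.
from collections import defaultdict

# Each observed event names the thread it belongs to and a small evidence score.
events = [
    {"student": "s1", "thread": "assignment-1-draft", "evidence": 0.4},
    {"student": "s1", "thread": "assignment-1-draft", "evidence": 0.7},
    {"student": "s1", "thread": "lab-exercises",      "evidence": 0.9},
]

# Which threads contribute to which final products (illustrative mapping).
thread_to_products = {
    "assignment-1-draft": ["assignment-1"],
    "lab-exercises":      ["assignment-1", "final-exam"],
}

def thread_progress(events):
    """Average evidence per (student, thread)."""
    totals, counts = defaultdict(float), defaultdict(int)
    for e in events:
        key = (e["student"], e["thread"])
        totals[key] += e["evidence"]
        counts[key] += 1
    return {k: totals[k] / counts[k] for k in totals}

for (student, thread), score in thread_progress(events).items():
    products = ", ".join(thread_to_products.get(thread, []))
    print(f"{student} / {thread}: {score:.2f} -> feeds {products}")
```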
Archive | 2015
Vivekanandan Suresh Kumar; Kinshuk; Thamarai Selvi Somasundaram; David Boulanger; Jeremie Seanosky; Marcello F. Vilela
Learners’ attainment of academic knowledge in postsecondary institutions is predominantly expressed by summative approaches. Instead, recent advances in educational technology have hinted at a means to continuously measure learning attainment in terms of personalized learner competency, capacity, and effectiveness. Similarly, educational technology also offers guidelines to continuously measure instructional attainment in terms of instructional competency, instructional capacity, and instructional effectiveness. While accurate computational models that embody these attainments, educational and instructional, remain a distant and elusive goal, big data learning analytics approaches this goal by continuously observing study experiences and instructional practices at various levels of granularity, and by continually constructing and using models from these observations. This article offers a new perspective on learning and instructional attainments with big data analytics as the underlying framework, discusses approaches to this framework with evidence from the literature, and offers a case study that illustrates the need to pursue research directions arising from this new perspective.
Archive | 2016
David Boulanger; Jeremie Seanosky; Colin Pinnell; Jason Bell; Vivekanandan Suresh Kumar; Kinshuk
This paper introduces SCALE, a Smart Competence Analytics engine on LEarning, as a framework to implement content analysis in several learning domains and provide mechanisms to define proficiency and confidence metrics. SCALE’s ontological design plays a crucial role in centralizing and homogenizing disparate data from domain-specific parsers and, ultimately, from several learning domains. This paper shows how SCALE has been applied in the programming domain and reveals systematically how a student’s work can be analyzed and converted into evidence to assess his/her proficiency in domain-specific competences, and how SCALE can also analyze the student’s interaction with a learning activity and provide a confidence metric to assess his/her behavior as he/she progresses toward goal achievement.
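SCALE's actual ontology is not reproduced in this abstract; the sketch below only illustrates the idea of normalizing heterogeneous, domain-specific parser output into a single evidence record with proficiency and confidence fields. The Evidence type, the from_java_parser helper, and the parser output format are all assumptions made for illustration.

```python
# Sketch under assumptions: normalizing heterogeneous parser output into one
# common evidence record, as a stand-in for SCALE's centralizing ontology.
from dataclasses import dataclass

@dataclass
class Evidence:
    student_id: str
    domain: str        # e.g. "programming", "writing"
    competence: str    # ontology term, e.g. "iteration"
    proficiency: float # 0.0 - 1.0
    confidence: float  # 0.0 - 1.0, how trustworthy the observation is

def from_java_parser(student_id: str, parser_output: dict) -> Evidence:
    """Map a (hypothetical) domain-specific parser result to the common record."""
    return Evidence(
        student_id=student_id,
        domain="programming",
        competence=parser_output["construct"],      # e.g. "for-loop"
        proficiency=parser_output["correct_ratio"], # fraction of correct uses
        confidence=min(1.0, parser_output["occurrences"] / 10.0),
    )

record = from_java_parser("s42", {"construct": "for-loop",
                                  "correct_ratio": 0.8,
                                  "occurrences": 6})
print(record)
```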
International Conference on Advanced Learning Technologies | 2015
Kannan Govindarajan; David Boulanger; Jeremie Seanosky; Jason Bell; Colin Pinnell; Vivekanandan Suresh Kumar; Kinshuk; Thamarai Selvi Somasundaram
While accurate computational models that embody learning efficiency remain a distant and elusive goal, big data learning analytics approaches this goal by recognizing the competency growth of learners, at various levels of granularity, using a combination of continuous, formative, and summative assessments. Our earlier research employed a conventional Particle Swarm Optimization (PSO) based clustering mechanism to cluster large numbers of learners based on their observed study habits and the consequent growth of subject knowledge competencies. This paper describes a Parallel Particle Swarm Optimization (PPSO) based clustering mechanism to cluster learners. Using a simulation study, cluster-quality measures such as the inter-cluster distance and the intra-cluster distance, along with processing time and acceleration values, are estimated and compared.
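The PPSO algorithm itself is not reproduced here, but the two cluster-quality measures named in the abstract are standard and can be shown directly; the sketch below computes intra-cluster distance (mean distance of points to their own centroid) and inter-cluster distance (mean pairwise distance between centroids) for toy learner-competency vectors.

```python
# Illustrative computation of the two cluster-quality measures named above;
# the PPSO clustering algorithm itself is not reproduced here.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def centroid(points):
    dims = len(points[0])
    return [sum(p[d] for p in points) / len(points) for d in range(dims)]

def intra_cluster_distance(clusters):
    """Mean distance of points to their own cluster centroid (smaller is better)."""
    total, count = 0.0, 0
    for points in clusters:
        c = centroid(points)
        total += sum(euclidean(p, c) for p in points)
        count += len(points)
    return total / count

def inter_cluster_distance(clusters):
    """Mean pairwise distance between cluster centroids (larger is better)."""
    cs = [centroid(points) for points in clusters]
    pairs = [(i, j) for i in range(len(cs)) for j in range(i + 1, len(cs))]
    return sum(euclidean(cs[i], cs[j]) for i, j in pairs) / len(pairs)

# Two toy clusters of learner-competency vectors.
clusters = [[(0.1, 0.2), (0.2, 0.1)], [(0.8, 0.9), (0.9, 0.8)]]
print(intra_cluster_distance(clusters), inter_cluster_distance(clusters))
```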
International Conference on Technology for Education | 2014
David Boulanger; Jeremie Seanosky; Michael Baddeley; Vivekanandan Suresh Kumar; Kinshuk
Several major accidents in the oil and gas industry have been traced to deficient training, resulting in serious injuries and even casualties, along with extremely expensive damage to equipment and decreases in productivity. This paper presents a procedure evaluation/e-training tool called PeT to track the knowledge and confidence of trainees in emergency operating procedures. PeT was tested with two emergency procedures in an oil and gas company in Canada. A text-based knowledge test was implemented for each procedure. Each test consisted of multiple-choice questions. Answers were classified as perfectly correct, incomplete but correct, partially correct, mostly incorrect, and totally incorrect. The paper also describes the six-factor confidence model underlying the confidence computations in PeT: knowledge, reaction time, lingering, number of visits (revision), number of selections, and number of answer switches. Each confidence factor measures a specific aspect of the targeted behaviour in an emergency. The results of two experiments conducted in 2014 in an oil and gas company are also presented to show the types of analysis that PeT enables. A plan to move PeT into an interactive training environment to track the actions of operators in their work environment and translate their interactions into higher-level competences is also briefly introduced.
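The abstract lists the six confidence factors but not how PeT combines them; as one plausible reading, the sketch below takes a weighted average of factor scores that have already been normalized to [0, 1], with weights invented purely for illustration.

```python
# Hypothetical combination of the six confidence factors listed above.
# PeT's actual weighting and normalization are not published in this abstract;
# the weighted average below is an assumption made for illustration.

FACTOR_WEIGHTS = {
    "knowledge": 0.35,       # score on the multiple-choice knowledge test
    "reaction_time": 0.15,   # faster (to a point) suggests more confidence
    "lingering": 0.10,       # time spent hesitating on a question
    "visits": 0.15,          # how often the trainee revisited the question
    "selections": 0.10,      # number of options clicked before submitting
    "answer_switches": 0.15, # number of times the chosen answer changed
}

def confidence(factors: dict) -> float:
    """Weighted average of factor scores, each already normalized to [0, 1],
    where 1.0 means the behaviour indicates high confidence."""
    return sum(FACTOR_WEIGHTS[name] * factors[name] for name in FACTOR_WEIGHTS)

trainee = {"knowledge": 0.9, "reaction_time": 0.7, "lingering": 0.8,
           "visits": 0.6, "selections": 0.9, "answer_switches": 0.5}
print(f"confidence = {confidence(trainee):.2f}")
```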
International Conference on Advanced Learning Technologies | 2017
Jeremie Seanosky; Isabelle Guillot; David Boulanger; Rebecca Guillot; Claudia Guillot; Vivekanandan Suresh Kumar; Shawn N. Fraser; Kinshuk; Nahla Aljojo; Asmaa Munshi
High dropout and failure rates among computer science students in introductory programming courses tend to be the norm for many institutions. Years of evidence indicate that dropouts and failures persist in spite of advancements in pedagogy, technology, and teacher training. Most advancements have relied on summative assessments and, more recently, formative assessments. This research explores assessments computed from real-time measures, based on observational data collected during student engagement with study and remedial activities. An experiment was conducted to measure the impact of real-time code assessment and dashboard-based feedback in the domain of programming. Results indicate better course grades for a small percentage of students, and the need for task-level and meta-level interactions to guarantee significant and persistent academic performance and programming mastery.
Archive | 2016
Jeremie Seanosky; David Boulanger; Colin Pinnell; Jason Bell; Lino Forner; Michael Baddeley; Kinshuk; Vivekanandan Suresh Kumar
Traditionally, the quality of a course offering is measured based on learner feedback at the end of the offering. This chapter offers a method to measure the quality of a course offering continually, formatively, and summatively, using factors such as the quality of resources used, learner motivation, learner capacity, learner competency growth, and instructor competence. These factors are represented in a Bayesian belief network (BBN) in a system called MI-IDEM. MI-IDEM receives streams of data corresponding to these factors as and when they become available, which leads to estimates of the quality of the course offering based on individual factors as well as an overall quality of the offering. Continuous, formative, and summative course quality measurements are imperative to identify weaknesses in the learning process of students and to assist them when they need help. This chapter professes the need for a comprehensive measurement of course quality and ensuing initiatives to personalize and adapt course offerings. It presents two case studies of this novel approach: first, measurement of the quality of a course offering in a blended online learning environment and, second, measurement of the quality of a training course offering in an industry environment.
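MI-IDEM's Bayesian belief network structure and probabilities are not given in this summary; as a stand-in, the sketch below performs a naive-Bayes-style update over a binary "the offering is high quality" variable from the listed factors, with conditional probabilities invented for illustration.

```python
# Toy stand-in for MI-IDEM's Bayesian belief network (the real network and its
# probabilities are not given in the abstract). A naive-Bayes style update over
# a binary "high-quality offering" variable, with invented probabilities.

PRIOR_HIGH = 0.5  # prior belief that the offering is high quality

# P(observation is "good" | quality), invented for illustration.
LIKELIHOODS = {
    "resource_quality":      {"high": 0.85, "low": 0.40},
    "learner_motivation":    {"high": 0.80, "low": 0.45},
    "learner_capacity":      {"high": 0.70, "low": 0.50},
    "competency_growth":     {"high": 0.90, "low": 0.35},
    "instructor_competence": {"high": 0.85, "low": 0.40},
}

def quality_posterior(observations: dict) -> float:
    """P(quality=high | factor observations); observations map factor -> bool
    (True if the incoming data stream reports that factor as 'good')."""
    p_high, p_low = PRIOR_HIGH, 1.0 - PRIOR_HIGH
    for factor, is_good in observations.items():
        lh, ll = LIKELIHOODS[factor]["high"], LIKELIHOODS[factor]["low"]
        p_high *= lh if is_good else (1.0 - lh)
        p_low  *= ll if is_good else (1.0 - ll)
    return p_high / (p_high + p_low)

obs = {"resource_quality": True, "learner_motivation": True,
       "competency_growth": False}
print(f"P(high-quality offering) = {quality_posterior(obs):.2f}")
```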
Archive | 2016
Jeremie Seanosky; Daniel Jacques; Vive Kumar; Kinshuk
In a growing world of big data learning analytics, tremendous quantities of data streams are collected and analyzed by various analytics solutions. These data are crucial in providing the most accurate and reliable analysis results, but at the same time they constitute a risk and a challenge from a security standpoint. As fire needs fuel to burn, so do hacking attacks need data in order to be “successful”. Data is the fuel for hackers, and as we protect wood from fire sources, so must we protect data from hackers. Learning analytics is all about data. This paper discusses a modular, affordable security model that can be implemented in any learning analytics platform to provide total privacy of learners’ data through encryption mechanisms and security policies and principles at the network level.
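The paper's specific security model is not detailed in this abstract; the sketch below only illustrates the basic idea of keeping learner records unreadable without a key, using symmetric encryption from the third-party cryptography package rather than anything particular to the proposed model.

```python
# Minimal sketch of encrypting learner records at rest, using the third-party
# "cryptography" package (pip install cryptography). This illustrates keeping
# analytics data unreadable without the key; it is not the specific security
# model proposed in the paper.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, stored in a key-management service
cipher = Fernet(key)

record = {"student_id": "s1", "event": "code-submission", "errors": 3}
token = cipher.encrypt(json.dumps(record).encode("utf-8"))

# Only holders of the key can recover the learner's data.
restored = json.loads(cipher.decrypt(token).decode("utf-8"))
print(restored == record)  # True
```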
Archive | 2017
David Boulanger; Jeremie Seanosky; Rebecca Guillot; Vivekanandan Suresh Kumar; Kinshuk
This paper presents a learning analytics system that has been extended to address multiple domains (writing and coding) for breadthwise expansion. The system has also been infused with analytics solutions targeting competence, grade prediction, and regulation traits, thus offering deeper insights. Our experiences in extending the breadth and depth of the analytics system are discussed. The discussion includes elaboration on two types of sensors to track the writing and coding experiences of students. The design of a dashboard for teachers to monitor the performance of their classrooms and advocate regulation activities is also included. The discussion leans more on the side of teachers, parents, and administrators than on the side of students.
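The grade-prediction analytics mentioned above are not specified in this abstract; the sketch below is a deliberately simple stand-in, a linear predictor over hypothetical competence and regulation features drawn from the writing and coding sensors, with made-up weights.

```python
# Sketch of the "grade prediction" idea mentioned above, with made-up weights:
# a simple linear predictor over competence and regulation features drawn from
# the writing and coding sensors. The paper's actual model is not given here.

# Hypothetical feature weights (intercept plus per-feature coefficients).
INTERCEPT = 40.0
WEIGHTS = {
    "coding_competence": 25.0,   # 0-1 scale from the coding sensor
    "writing_competence": 20.0,  # 0-1 scale from the writing sensor
    "regulation": 15.0,          # 0-1 self-regulation indicator
}

def predict_grade(features: dict) -> float:
    """Predicted course grade on a 0-100 scale, clipped to that range."""
    grade = INTERCEPT + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return max(0.0, min(100.0, grade))

print(predict_grade({"coding_competence": 0.7,
                     "writing_competence": 0.5,
                     "regulation": 0.8}))
```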