Measuring the Impact of COVID-19 Induced Campus Closure on Student Self-Regulated Learning in Physics Online Learning Modules
TOM ZHANG,
University of Central Florida, USA
MICHELLE TAUB,
University of Central Florida, USA
ZHONGZHOU CHEN,
University of Central Florida, USA
This paper examines the impact of COVID-19 induced campus closure on university students' self-regulated learning behavior by analyzing click-stream data collected from student interactions with 70 online learning modules in a university physics course. To do so, we compared the trend of six types of actions related to the three phases of self-regulated learning before and after campus closure and between two semesters. We found that campus closure changed students' planning and goal setting strategies for completing the assignments, but did not have a detectable impact on the outcome or the time of completion, nor did it change students' self-reflection behavior. The results suggest that most students still managed to complete assignments on time during the pandemic, while the design of online learning modules might have provided the flexibility and support for them to do so.

CCS Concepts: • Applied computing → Distance learning; E-learning; Physics.

Additional Key Words and Phrases: self-regulated learning, online learning environments, click-stream data
ACM Reference Format:
Tom Zhang, Michelle Taub, and Zhongzhou Chen. 2020. Measuring the Impact of COVID-19 Induced Campus Closure on Student Self-Regulated Learning in Physics Online Learning Modules. In
Learning Analytics and Knowledge ’21, April 12–16, 2021.
ACM, New York, NY, USA, 16 pages. https://doi.org/10.1145/nnnnnnn.nnnnnnn
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].
© 2020 Association for Computing Machinery. Manuscript submitted to ACM.

In March of 2020, the majority of higher education institutions across the United States were forced to abruptly close campuses and shift to distance learning for the remainder of the Spring 2020 semester due to the COVID-19 pandemic. As a result, students were suddenly faced with the unusually challenging task of self-regulating their learning activities at home, amidst the disruptions to life brought on by the pandemic. There is widespread concern among instructors and administrators regarding the potential negative impact on student learning [7, 22], but at the time this manuscript was written, there was little in the way of published literature quantitatively measuring the magnitude or nature of this impact [11].

As a result of campus closure, click-stream data from online learning systems has become one of the most reliable sources of data providing information on learning activities and learning outcomes. Several recent studies have analyzed click-stream data to investigate students' self-regulated learning (SRL) processes [14, 15, 19]. The current paper introduces our attempt at measuring the impact of COVID-19 induced campus closure on multiple aspects of students' SRL processes, by analyzing click-stream data collected from students enrolled in a university introductory physics class interacting with 70 mastery-based online learning modules (OLMs) as part of the course assignments throughout the Spring 2020 semester.

We will base our data analysis and results interpretation efforts on the theoretical framework of SRL, which models students' SRL processes in three cyclical phases. In the remainder of this section, we will first briefly introduce the SRL framework, present predictions of the impact of campus closure, explain the design of OLMs and OLM sequences, and establish connections between click-stream data from the OLMs and student actions during all three phases of SRL.
According to theories of SRL [23], a student who is self-regulating plays an active role in their learning as opposed to being a passive recipient of information. According to Zimmerman's Social Cognitive Theory [24], SRL is accomplished by engaging in three cyclical phases during learning: Forethought, Performance, and Self-Reflection. During each of these phases, students use different strategies to monitor and control their learning. The Forethought Phase consists of planning and goal setting, where the student maps out their goals for completing a task and how they are going to achieve them. These decisions are often impacted by students' motivations (e.g., achievement goals). In the Performance Phase, students engage in cognitive learning strategies (e.g., reading content, taking notes) and metacognitive monitoring processes (e.g., time management) to complete tasks. Students are thus enacting their plans and self-monitoring their progress towards those goals. In the Self-Reflection Phase, students evaluate their progress and understanding of the material being studied and assess the factors contributing to their performance (e.g., self-testing). Based on these reflections, students can decide to adapt their behaviors for completing the current task or starting subsequent tasks. These phases are interdependent and thus need not occur in a sequential order, nor do they occur only once during a task. For example, if an online learning module allows multiple attempts at an assessment, a student may choose to adapt how they engage with the content prior to subsequent attempts if self-reflection deemed their initial strategy ineffective.
This implies that a student must be aware of their own cognition and performance to self-regulate efficiently.

COVID-19 induced campus closure can potentially have multiple negative impacts on a student's SRL processes by both providing fewer opportunities for, and placing higher demands on, different types of cognitive, metacognitive, and adaptive processes. During the Forethought Phase, a student needs to consider a variety of extraneous factors, such as computer access in a family home, when planning their study. For the Performance Phase, students face a higher barrier to help seeking [2], while having to more actively monitor the amount of time they spend on each lesson compared to dedicated class hours. In terms of self-reflection, students have lower accessibility to external support, such as exchanging notes with classmates or asking questions after class, but are still required to evaluate their progress and make adjustments.
Each OLM is focused on explaining one or two basic concepts, or on developing the skills to solve one kind of problem, and is designed to be completed in 5 to 30 minutes. Each OLM consists of an assessment component (AC), which tests students' mastery of the module topic in 1-2 questions, and an instructional component (IC), with instructional text and practice problems on the topic (see Figure 1). Upon accessing a module, students are shown the learning objectives of the current module and asked to make an initial attempt on the AC before being allowed to access the IC. Students can make additional attempts on the AC at any time after the first attempt and are not required to access the IC. This design is motivated in part by the "mastery-learning" format [3, 13], which allows students who are already familiar with the content to proceed, and by the concept of "preparation for future learning" [18], which intends to improve students' learning from the IC by exposing them to the questions first.

Fig. 1. Schematic illustration of OLM design, adapted from [6].
A number of OLMs form an OLM sequence about a more general topic (e.g., conservation of mechanical energy), and students are required to pass the AC or use up all attempts before moving on to the next OLM in that sequence. A typical OLM sequence consists of 5-12 modules and is assigned as self-study homework for students to complete over a period of one to two weeks. In Fall 2019, a total of 44 OLMs were assigned as homework for 7 out of the 10 topics in a calculus-based introductory physics course, while in Spring 2020, 9 out of the 10 topics used a total of 70 OLMs as online homework for the same course. In both semesters, students could earn extra credit by completing some of the OLMs 2-6 days prior to the due date.
Based on the design of the OLMs and OLM sequences, we identify six types of student actions that can be detected or inferred from one or more patterns in click-stream data. These actions are related to or indicative of students' behavior during each of the three phases of SRL, as summarized in Table 1.
Table 1. Relating Data to SRL.

Phase           | Behavior     | Action/Decision                                                  | Data Signal                                                                                         | Data Level
Forethought     | Planning     | Skipping, skimming through, or engaging with the initial attempt | Fraction of 1st attempts less than 15 s, between 15 and 35 s, and longer than 35 s                  | Module
Forethought     | Goal Setting | Passing or finishing the OLM with minimal effort                 | Fraction of students who adopt a Late Study or No Study strategy                                    | Module
Forethought     | Goal Setting | Completing the modules early or close to the sequence due date   | Fraction of OLMs completed at least 1 or 3 days prior to the due date                               | Sequence
Performance     | Learning     | Passing the assessment after studying the module, or passing on a Brief Attempt | Fraction of students passing after accessing the IC, and fraction of Brief Passing Attempts (<35 s) | Module
Self-Reflection | Reviewing    | Revisiting an upstream module while working within an OLM sequence | The average number of revisiting events per OLM sequence                                            | Sequence
Self-Reflection | Reviewing    | Revisiting a completed OLM before a midterm exam                 | The number of revisiting events within 3 days of a midterm exam                                     | Sequence
Regarding the Forethought Phase, data from the OLMs can provide information on two types of behavior: planning and goal setting. The mandatory first AC attempt on each OLM requires students to decide whether to engage with the problems or to randomly submit a response without reading them and proceed to the IC. Previous studies [6, 12] suggest that attempts submitted under 15 seconds are likely generated by students who skipped reading the problems in the AC, whereas attempts between 15 and 35 seconds are likely generated by students who read the problems but did not know how to solve them properly. Attempts longer than 35 seconds have a higher probability of being a genuine attempt at solving the AC problems and are more frequently observed among high performing students. The decision to skip the first attempt must be made before the start of the attempt; therefore, the fraction of Short (<15 s) First Attempts on each OLM provides information on students' planning actions for each OLM.

Furthermore, when a student fails the initial attempt on an OLM, they can decide to study the materials in the IC before attempting the AC again, or to make additional attempts immediately. From an SRL perspective, a student with a goal of mastering the content will likely access the IC after 1 or 2 failed attempts on the AC, whereas a student with the goal of completing the module with as little effort as possible is more likely to never access the IC at all (a "No Study" strategy) or to access the IC only after 3 or more failed attempts (a "Late Study" strategy). Preliminary data analysis suggests that students who cram on multiple modules just prior to the due date are more likely to adopt these strategies. The popularity of Late and No Study strategies among students is used as an indicator of students' goal setting behavior upon accessing each individual OLM.

In addition, students' goal setting can take place when considering the completion of an entire OLM sequence as a larger task.
In this context, students may set a goal to complete modules early and earn extra credit, or decide to "cram" and complete all or most of the modules on or close to the due date. A detailed investigation of students' work distribution as a result of extra credit would require extensive analysis beyond the scope of the current paper (see for example [10]). In this paper, we present a quick estimation by measuring the number of modules completed at least 1 or 3 days prior to the due date as indicators of students' goal setting behaviors when an OLM sequence is viewed as a task.

Regarding the Performance Phase, it is difficult to infer cognitive strategies adopted by students via click-stream data alone. However, we can straightforwardly estimate the outcome of learning by measuring the percentage of passing AC attempts either before or after accessing the IC. In an OLM sequence, passing attempts before accessing the IC on a later module can be a measure of learning quality on earlier modules [5]. In a previous study [6], we found that some fraction of students pass the AC on a Brief (<35 s) Attempt, which could suggest that students guessed the answer by chance or obtained the answer from other sources, such as a classmate. As previously explained, Short (<15 s) Attempts are likely generated by students who did not read the problem body, and Brief Attempts are more likely generated by students who did read the problem body. Therefore, we use the fraction of Brief and Short Attempts as an indicator of the quality of students' learning on each OLM.
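The 15 s and 35 s duration thresholds used throughout this section can be summarized in a small helper. This is an illustrative sketch in Python rather than the authors' analysis code (which was written in R); the function names are our own.

```python
def classify_attempt(duration_s):
    """Label an AC attempt by its duration in seconds:
    <15 s  -> "short"   (likely skipped reading the problem)
    15-35 s -> "brief"  (likely read the problem but gave up quickly)
    >=35 s -> "genuine" (more likely a real solution attempt)
    """
    if duration_s < 15:
        return "short"
    if duration_s < 35:
        return "brief"
    return "genuine"


def fraction_short_first_attempts(durations):
    """Fraction of a module's first attempts that are Short (<15 s)."""
    if not durations:
        return 0.0
    return sum(1 for d in durations if classify_attempt(d) == "short") / len(durations)
```

Applied per module, the second function yields the planning indicator in the first row of Table 1.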
Self-reflection is mostly a metacognitive process, which does not often generate direct records in click-stream data. However, we have identified certain types of behaviors that may be indicative of self-reflective processes. Most students interact with each OLM only once and move on after passing the AC, but some will revisit a previously passed module while working on a downstream OLM in the sequence. Therefore, the average number of modules reviewed by one student in a given sequence is chosen as an indicator of the frequency of a self-reflective process. Moreover, self-reflection could take place when students are reviewing for an upcoming exam, and can be estimated by the number of OLMs revisited shortly before an exam day. For the current analysis, we measure the number of modules revisited by a student up to three days before a midterm exam after campus closure as an indicator of reviewing behavior.

It must be emphasized that SRL is an interdependent and iterative process; thus, each student action identified is likely influenced by or results from multiple SRL behaviors in different phases. We associate each action with a single behavior only to provide an organizational framework for presenting the results, as well as a baseline for interpreting those results.
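The revisit indicator (interactions of at least 60 seconds after initial completion) can be counted from interaction windows. This is a hypothetical sketch; the event representation (start/end times per interaction) is our assumption, not the authors' schema.

```python
def count_revisits(interactions, completion_time):
    """Count Module Revisit events: interaction windows of at least 60 s
    that begin after the module was first completed.

    interactions: list of (start, end) timestamps in seconds.
    completion_time: timestamp of the module's first completion.
    """
    return sum(
        1
        for start, end in interactions
        if start > completion_time and end - start >= 60
    )
```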
In this paper we examine the hypothesis that campus closure resulted in a significant reduction of productive SRL behavior in the student population. More specifically, campus closure would result in the following changes in the six data indicators of SRL actions:

(1) An increase in the frequency of Short 1st Attempts, indicating a reduction in planning and self-assessment.
(2) An increase in the fraction of students adopting a Late or No Study strategy, indicating a shift from mastery-oriented goals to performance-oriented goals.
(3) A decrease in the number of modules completed 3 or more days prior to the due date, indicating fewer students setting goals involving completing the modules early.
(4) A decrease in passing rate before or after studying the IC, or an increase in Short Passing Attempts, indicating a reduction in learning outcomes.
(5) A decrease in the number of revisiting events during each sequence or close to a midterm exam, indicating a decrease in the frequency of self-reflection.

We will examine and compare each data indicator before and after campus closure following the data analysis scheme explained in the Methods section (Section 2), where we also present details about the OLMs and their implementation in the physics course, as well as operational definitions of actions (e.g., passing a module). We will present the analysis in the Results section (Section 3), followed by a discussion of the implications and possibilities for future studies.
The OLM modules are created and hosted on the Obojobo learning objects platform [4], an open source online learning platform developed by the Center for Distributed Learning at the University of Central Florida. In the current iteration, the AC of each OLM contains 1-2 multiple choice problems and permits a total of 5 attempts. Each of the first 3 attempts is a set of isomorphic problems assessing the same content knowledge with different surface features or numbers. On the 4th and 5th attempts, students are presented with the same problems as in the 1st and 2nd attempts respectively and are awarded 90% credit. The IC of each module contains a variety of learning resources including text, figures, videos, and practice problems. Access to the IC is locked whenever a student is attempting the AC. Each OLM sequence contains 3-12 OLMs, which students must complete in the order given, with completion defined as either passing the AC or using up all 5 attempts. Readers can access example OLMs via the URL provided in [1].

In the Spring 2020 semester, 70 OLMs in 9 sequences were assigned as online homework in a calculus-based university introductory physics course, which was taught in a traditional lecture format before campus closure. In Fall 2019, 44 of the 70 modules were assigned in the same course. The new OLMs added in Spring 2020 include sequences S1 and S2 (modules 1-16), and modules added to S8 (modules 51-57) and S9 (modules 63-66). Each sequence corresponds to classroom or online instruction for 1-2 weeks, with due dates concurrent with lecture instruction. All OLMs in a sequence are due on the same day. In Spring 2020, the last three sequences, containing 29 modules, were due after campus closure.
In Fall 2019, the last 5 modules were due after Thanksgiving break.

In Fall 2019, the OLM sequences accounted for 18% of total course credit, and online homework from a commercial publisher was used for topics for which no OLM module was available. In Spring 2020, the OLM sequences accounted for 36% of course credit, with no additional homework assignments. In Fall 2019, submissions after the due date received 0 points, while in Spring 2020, late submissions received a 13% daily penalty. In addition, students in both semesters could earn extra credit by completing some OLMs earlier than the due date, as explained in more detail in [10].

In Spring 2020, 276 students were initially enrolled in the class, consisting of 200 males and 76 females. 107 of the students were historically underrepresented minorities, and a total of 263 students passed the course. In Fall 2019, 289 students registered for the course, consisting of 234 males and 54 females. 111 of the students were historically underrepresented minorities, and a total of 247 students passed the course.
We list below the operational definitions of all key terms related to the data indicators in Table 1 and section (1.2.2). Readers interested in more nuanced details of data extraction and cleaning can refer to [6].

• AC Attempt Outcome: A student passes an AC attempt by answering every question on the AC correctly.
• AC Attempt Duration: The time between a student's click on the start attempt button and the submission button for a given AC.
• Brief and Short Attempt: We will refer to an attempt with a duration of less than 15 seconds as a "Short Attempt", and an attempt with a duration between 15 and 35 seconds as a "Brief Attempt".
• Module Pass: A student will be considered to have passed the module if they passed the AC within 3 attempts. The distinction arises from the fact that the 4th attempt and beyond were already seen by the student and are given reduced credit.
• Module Fail: A student will be considered to have failed the module if they fail all of the first 3 attempts at the AC.
• Module Complete: A student either passes the module or uses up all attempts. Time of completion is recorded as the submission time of the first passing attempt or the last failed attempt.
• Late or No Study: A student does not access the IC before the 3rd attempt at the AC. Students in this category may either access the IC after the 3rd attempt or not, in which case they will be considered to have adopted the "Late Study" or "No Study" strategy, respectively.
• Module Revisit: A student interacting with any part of the module for at least 60 seconds after initial completion.
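The pass and study-strategy definitions above translate directly into predicate functions. The sketch below is ours, under the assumption that a student's record can be reduced to ordered attempt outcomes and the number of failed attempts made before the IC was first opened.

```python
def module_pass(attempt_outcomes):
    """Module Pass: the student passed the AC within the first 3 attempts.

    attempt_outcomes: booleans, one per AC attempt, in submission order.
    """
    return any(attempt_outcomes[:3])


def study_strategy(ic_accessed, failed_attempts_before_ic):
    """Classify the study strategy per the Late or No Study definition:
    "No Study"   -> the IC was never opened;
    "Late Study" -> the IC was first opened only after 3+ failed attempts;
    "Study"      -> the IC was opened before the 3rd attempt.
    """
    if not ic_accessed:
        return "No Study"
    if failed_attempts_before_ic >= 3:
        return "Late Study"
    return "Study"
```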
From a data analysis perspective, the six types of actions fall into two distinct categories: module level actions and sequence level actions, as listed in Table 1. Module level actions are actions or decisions made on each module (e.g., planning to skip the first attempt or not). The proportion of module level actions on each module is expected to roughly follow a single linear trend over the semester and to be relatively insensitive to the order of the module in a given sequence. In comparison, sequence level actions are strongly influenced by the location of the module within the sequence (e.g., module completion 3 or more days before the due date) or can only be defined for each sequence (e.g., number of students who revisited at least 2 upstream modules in a sequence). The disparity between the number of modules (70) and the number of sequences (9) led us to employ two analysis schemes.

For module level actions, we first calculate the frequency of a given data indicator on each module, then construct a linear model for each of the two semesters in the form:

𝑦ᵢ = 𝑦₀ + 𝛼𝑛ᵢ + 𝜖ᵢ    (1)

where 𝑦ᵢ is the frequency of observing the data indicator on module 𝑖 and 𝑛ᵢ the order in which students complete each OLM in the semester. 𝑦₀ is the intercept, 𝛼 the slope, and 𝜖ᵢ the noise term, which accounts for all other effects not captured by the linear model. When constructing the linear models, the module numbers for Spring 2020 were used for both years, as the missing OLM sequences and OLMs in Fall 2019 were supplemented with online homework assigned from WebAssign.

In addition, data from 2020 was further divided into two segments: the OLMs that were due before and after campus closure, Segments A and B respectively. For comparison, the same partition was applied to the modules in 2019, even though no campus closure took place.
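The per-segment fit of Eq. (1) is ordinary least squares with a single predictor, which can be written without any dependencies. This is a minimal sketch with synthetic data (the paper fitted the models in R), not the study's data or code.

```python
def fit_trend(module_order, frequencies):
    """Least-squares fit of Eq. (1), y_i = y0 + alpha * n_i + eps_i.

    Returns (y0, alpha): the intercept and slope of the trend line.
    """
    n = len(module_order)
    mean_x = sum(module_order) / n
    mean_y = sum(frequencies) / n
    sxx = sum((x - mean_x) ** 2 for x in module_order)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(module_order, frequencies))
    alpha = sxy / sxx
    y0 = mean_y - alpha * mean_x
    return y0, alpha


# Noiseless synthetic example: indicator frequency rises 0.02 per module.
order = list(range(1, 11))
freqs = [0.10 + 0.02 * n for n in order]
y0, alpha = fit_trend(order, freqs)
```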
Six linear models of the form (1) were constructed, one for each of the data indicators outlined in Table 1.

Table 2. Analysis Acronyms.

Acronym | Type              | Year(s)       | Segments
20A-B   | Within Semester   | 2020          | A vs. B
19A-B   | Within Semester   | 2019          | A vs. B
20A-19A | Between Semesters | 2020 vs. 2019 | A vs. A
20B-19B | Between Semesters | 2020 vs. 2019 | B vs. B
20-19   | Between Semesters | 2020 vs. 2019 | Entire Semester

Next, we compared the slopes of the linear models for Segments A and B within the same semester, listed as 20A-B and 19A-B in rows 1 and 2 of Table 2. We tested the homogeneity of the regression slopes using Analysis of Covariance (ANCOVA) by including the interaction of due date and module number:

𝑦ᵢ = 𝑦₀ + 𝛼𝑛ᵢ + 𝛽𝛿_AB 𝑛ᵢ + 𝜖ᵢ    (2)

where 𝛿_AB = 0 if module 𝑖 is in Segment A, and 𝛿_AB = 1 if the module is in Segment B. If 𝛽 is not significantly different from zero (i.e., the slope is similar for the two segments), we performed a second ANCOVA of the form:

𝑦ᵢ = 𝑦₀ + 𝛼𝑛ᵢ + 𝛾𝛿_AB + 𝜖ᵢ    (3)

If 𝛾 is significantly different from 0, the intercepts of the two segments are significantly different.

If campus closure had a significant impact on a given SRL signal (i.e., either 𝛽 or 𝛾 is significantly different from 0), we isolate the effect by comparing the linear models for Segments A and B between the two semesters, listed as 20A-19A and 20B-19B in rows 3 and 4 of Table 2, using the subset of modules common to both semesters. If the effect is directly detectable, we expect the linear model for Segment B of Spring 2020 to be significantly different from that for Segment B of Fall 2019.
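The interaction coefficient 𝛽 in Eq. (2) is the difference in slope between the two segments. As an illustration only (the paper's ANCOVA fits one joint model in R), with noiseless data generated exactly per Eq. (2) and no intercept shift, 𝛽 can be recovered by fitting each segment separately and differencing the slopes; the data below is synthetic, not the study's.

```python
def fit_trend(xs, ys):
    """Simple least-squares line fit; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope


# Synthetic data following Eq. (2): Segment B's slope is steeper by beta.
y0_true, alpha_true, beta_true = 0.05, 0.01, 0.03
seg_a = [(n, y0_true + alpha_true * n) for n in range(1, 11)]
seg_b = [(n, y0_true + (alpha_true + beta_true) * n) for n in range(11, 21)]

_, slope_a = fit_trend(*zip(*seg_a))
_, slope_b = fit_trend(*zip(*seg_b))
beta_hat = slope_b - slope_a  # estimate of the interaction coefficient
```

The ANCOVA then asks whether this slope difference is distinguishable from noise.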
If no differences were detected for the linear models of Segments A and B, we then proceeded to compare the linear models for the entire semester, row 5 of Table 2. If the slope or intercept was found to be different, it is likely that either the student population or the instructional condition was different between the two semesters, but campus closure did not have a detectable impact on the action analyzed.

Analysis of data on sequence-level actions (i.e., early completion and revisiting) is more straightforward. We first record the observation of the data indicator (e.g., completing a module 3 days before the due date) for every student who accessed all modules in a given sequence. The Friedman test is then performed to detect any differences between the sequences over each semester. Each sequence can be treated as an independent category since they cover different topics and have different due dates. To satisfy the complete block design requirement of the Friedman test, only students who accessed all 9 sequences were retained in the analysis. In the case of revisiting, where we count the number of students who have at least one revisiting event, Cochran's Q test was used in lieu of the Friedman test.

If statistically significant differences are detected between sequences, a post hoc analysis using pairwise exact tests [8] (early completion) or McNemar's tests (revisiting) is conducted to determine the precise differences in frequencies between the sequences. If campus closure had a significant impact on students' SRL behavior related to the observed actions, then the observed frequencies for sequences due after campus closure will be significantly different from those due before campus closure.
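For readers unfamiliar with the Friedman test, the statistic ranks each student's values across the sequences and compares the resulting rank sums. The dependency-free sketch below illustrates the computation (the paper used R's implementation); for brevity it averages ranks over ties but omits the tie-correction factor, so it is a simplified version.

```python
def friedman_statistic(blocks):
    """Friedman chi-square statistic for a complete block design.

    blocks: one list per student (block), each holding one value per
    sequence (treatment). Tied values receive average ranks; the tie
    correction factor is omitted for simplicity.
    """
    n, k = len(blocks), len(blocks[0])
    rank_sums = [0.0] * k
    for row in blocks:
        order = sorted(range(k), key=lambda j: row[j])
        i = 0
        while i < k:
            j = i
            # Extend j over a run of tied values.
            while j + 1 < k and row[order[j + 1]] == row[order[i]]:
                j += 1
            avg_rank = (i + j) / 2 + 1  # ranks are 1-based
            for m in range(i, j + 1):
                rank_sums[order[m]] += avg_rank
            i = j + 1
    return (12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums)
            - 3.0 * n * (k + 1))
```

When every student orders the sequences identically the statistic reaches its maximum; when rank sums are equal it is zero.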
Data from Fall 2019 was also analyzed and presented for revisiting actions, but not for early completion actions, since the first 5 or 6 modules of S8 and S9 were not available, inevitably resulting in significantly fewer early completion events in those sequences.

For the action of revisiting before an exam, we simply recorded the number of modules revisited over a period of three days leading up to an exam by each student and compared the distributions between the semesters via a Wilcoxon test. All statistical procedures were conducted in R [17] with the tidyverse and PMCMRplus packages [16, 21].
For each figure in this section, the black vertical line separates the modules due before campus closure (Segment A) from those due after (Segment B). For figures representing module level data, the blue line visualizes the linear regression models, with the shaded areas representing the 95% confidence interval. For each table in this section, statistically significant differences are shown in bold and appended with asterisk markings *, **, *** representing significance at the 𝛼 = .05, .01, and .001 levels respectively.

The fraction of students accessing each module is shown in Figure 2, with the linear models constructed for Segments A and B for each semester. The ANCOVA of the linear models following the analysis scheme outlined in Table 2 is shown in Table 3.
Table 3. ANCOVA statistics for overall engagement.

The fraction of students accessing each module remained remarkably stable over the entire 2020 semester, between 85% and 100%, with no significant difference detected in the slopes of the linear models.

(a) Spring 2020 (b) Fall 2019. Fig. 2. Fraction of students accessing each module.

The linear model for Segment B (post-campus closure) has a higher intercept than that of Segment A (pre-campus closure), possibly due to modules 29 and 30 having a lower than average access percentage. Similarly, data from Fall 2019 showed no significant difference in the slopes of the linear models between the segments, but had a higher intercept in the latter half. The regression slopes in 2019 were significantly more negative than their 2020 counterparts, despite the absence of campus closure.
In Figure 3, we plot the fraction of 1st AC attempts on each module under 15 seconds as an indicator of students' planning action before each OLM. The results of the comparisons between linear models are listed in Table 4.

(a) Spring 2020 (b) Fall 2019. Fig. 3. Fraction of first attempts under 15 seconds.

Table 4. ANCOVA statistics for Short First Attempts.

In 2020, the proportion of Short First Attempts increased significantly more rapidly in Segment B than in Segment A. This rapid shift in slope was not detected in the data from Fall 2019, for which there was no significant difference observed between the intercepts of the regression lines in each segment. Our analysis also failed to detect significant differences between the linear models for Segment B between the two semesters.

In contrast, the fraction of 1st attempts between 15 and 35 seconds showed no difference in either the slopes or intercepts between Segments A and B of each semester, or in the overall linear models for both semesters; see Figure 4.

(a) Spring 2020 (b) Fall 2019. Fig. 4. Fraction of first attempts between 15 and 35 seconds.
In Figure 5, we plot the fraction of students who adopted either a Late or No Study strategy on each module as an indicator of students' goal setting behavior for each module.

(a) Spring 2020 (b) Fall 2019. Fig. 5. Fraction of students adopting a Late Study or No Study strategy.

In Spring 2020, the number of students adopting a Late or No Study strategy increased much faster in Segment B than in Segment A (𝐹 , = . , 𝑝 < . ), while no difference in trend was detected for Fall 2019. Comparing the linear models for each segment between semesters did not show a statistically significant difference in either the slopes or the intercepts of each model.

To examine students' sequence level goals, we plot the average number of modules completed by a student at least 1 or 3 days before the sequence due date; see Figure 6. Friedman's test detected significant differences in the fractions between sequences in both conditions (𝜒 ( ) = . , 𝑝 < . and 𝜒 ( ) = . , 𝑝 < . ), but post hoc analysis showed that none of the significantly different sequences were due after campus closure.

(a) Spring 2020 (1 Day) (b) Spring 2020 (3 Days). Fig. 6. Average fraction of modules completed in a sequence at least 1 or 3 days before the due date.
The fraction of students passing each module before accessing the IC, plotted in Figure 7, remained largely stable throughout Spring 2020. No significant difference was found between the regression slopes for Segments A and B. However, the intercept of Segment A was found to be higher than that of Segment B (𝐹 , = . , 𝑝 < . ), likely caused by the slightly negative slope of the model in Segment A. In 2019, neither the regression slopes nor the intercepts differed significantly between the two segments. Comparing Segment A between semesters, we found no significant difference in the regression slopes, but the intercept in Spring 2020 was higher than that of Fall 2019 (𝐹 , = . , 𝑝 < . ). No difference was detected in Segment B between the semesters.

(a) Spring 2020 (b) Fall 2019. Fig. 7. Fraction of students who passed before study of the IC.
The fraction of students passing each module after accessing the IC is remarkably stable across both semesters, with no difference detected in any of our comparisons (see Figure 8).

(a) Spring 2020 (b) Fall 2019. Fig. 8. Fraction of students who passed after study of the IC.
In contrast, the percentage of Short Passing Attempts saw a significant upward shift in Segment B of Spring 2020 (𝐹 , = . , 𝑝 < . ), but not in Fall 2019. No difference was found when comparing the linear models for 20B–19B, and no difference in intercepts was found for the comparisons 20A–19A and 20B–19B. When the criterion is loosened to include Brief Attempts, no difference in slope or intercept was found between the models (see Figure 10).

Fig. 9. Fraction of Short Passing Attempts. (a) Spring 2020; (b) Fall 2019.
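The segment comparisons above can be framed as a nested-model F test: fit one common line to both segments (reduced model), then a model that gives each segment its own slope and intercept (full model), and test whether the extra parameters significantly reduce residual error. The sketch below, in Python with synthetic data, illustrates the idea; it is not the paper's actual R analysis, and all numbers are invented.

```python
import numpy as np
from scipy.stats import f as f_dist

def segment_f_test(x, y, segment):
    """F test for whether two segments need separate slopes and intercepts.

    segment: boolean array, True for Segment B observations.
    Returns (F, p) comparing the full model (a separate line per segment)
    against the reduced model (one common line).
    """
    def rss(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return resid @ resid, X.shape[1]

    ones = np.ones_like(x)
    rss_red, k_red = rss(np.column_stack([ones, x]))
    seg = segment.astype(float)
    rss_full, k_full = rss(np.column_stack([ones, x, seg, seg * x]))

    df1 = k_full - k_red              # extra parameters in the full model
    df2 = len(y) - k_full             # residual degrees of freedom
    F = ((rss_red - rss_full) / df1) / (rss_full / df2)
    return F, f_dist.sf(F, df1, df2)

rng = np.random.default_rng(1)
x = np.arange(40.0)                   # module index, in release order
segment = x >= 20                     # Segment B: modules due after closure
y = 0.1 * x + 3.0 * segment + rng.normal(0, 0.5, x.size)  # intercept jump in B
F, p = segment_f_test(x, y, segment)
print(f"F = {F:.2f}, p = {p:.4g}")
```

The same machinery applies to the between-semester comparisons (20A–19A, 20B–19B): the indicator simply marks the semester instead of the segment.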
In Figure 11, we plot the fraction of students with at least one revisiting event, as defined in Section 2.3.1. Cochran's Q test indicated that there were significant differences between the OLM sequences in each semester (𝜒 ( ) = . , 𝑝 < . and 𝜒 ( ) = . , 𝑝 < . ). Pairwise McNemar tests determined that in 2020 the revisiting fraction of S1 and S8 was significantly higher than that of the rest of the sequences, and that of S5 was lower than that of all the other sequences except S4. In 2019, S4 and S5 were significantly lower than the rest of the sequences.

The distribution of the number of modules revisited by each student within the 3-day period leading up to the second midterm exam is plotted in Figure 12. The Wilcoxon test detected no statistically significant difference between the two distributions (Z = 8115.5, p = 0.696).

Fig. 10. Fraction of Brief Passing Attempts. (a) Spring 2020; (b) Fall 2019.

Fig. 11. Fraction of students with at least 1 revisiting event. (a) Spring 2020; (b) Fall 2019.

Fig. 12. Histograms of the number of modules revisited up to three days before the second midterm exam.
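Cochran's Q is the natural omnibus test here because revisiting is a binary outcome (did / did not revisit) measured on the same students across all sequences. A minimal Python sketch of the statistic on synthetic data follows; the paper's actual analysis used R, and the matrix below is fabricated for illustration. A significant Q would then be followed by pairwise McNemar tests, as in the text.

```python
import numpy as np
from scipy.stats import chi2

def cochrans_q(X):
    """Cochran's Q test on a binary subjects-by-conditions matrix X.

    Tests whether the proportion of 1s (here: students with at least one
    revisiting event) differs across conditions (OLM sequences).
    Returns (Q, p) with Q ~ chi-squared on k-1 degrees of freedom.
    """
    X = np.asarray(X)
    n, k = X.shape
    col = X.sum(axis=0)               # successes per sequence
    row = X.sum(axis=1)               # successes per student
    q = (k - 1) * (k * (col ** 2).sum() - col.sum() ** 2) \
        / (k * row.sum() - (row ** 2).sum())
    return q, chi2.sf(q, k - 1)

rng = np.random.default_rng(2)
n_students, n_sequences = 80, 8
# Baseline: each student revisits each sequence with probability 0.3;
# sequence S1 (column 0) is revisited far more often.
revisit = (rng.random((n_students, n_sequences)) < 0.3).astype(int)
revisit[:, 0] = (rng.random(n_students) < 0.8).astype(int)
q, p = cochrans_q(revisit)
print(f"Q = {q:.2f}, p = {p:.4g}")
```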
Results of the current analysis indicate that some SRL actions are impacted much more by COVID-19 induced campus closure than others; overall, the changes in SRL actions that can be attributed to it were smaller than expected.

Most notably, we saw a significant increase in the trend of the fraction of Late or No Study events per module for the OLMs due after campus closure, which was not observed in Fall 2019. Similarly, an increasing trend was observed for the percentage of Short Attempts, indicative of guessing or copying, for the OLMs due after campus closure, which was again absent in 2019. These observations suggest that after campus closure, more students were adopting performance-oriented goals (e.g., completing the module in as little time as possible) over mastery-oriented goals (e.g., internalizing new content) in the Forethought Phase and executing those strategies during the Performance Phase. Notably, there were no sudden shifts in the trend of Brief Attempt submissions, which are more likely to be generated by students who read the assessment problem before deciding to guess. Therefore, the abrupt increase in Short Attempts is more likely a result of a change in student strategies rather than an increase in content difficulty.

It must be mentioned that in both cases we did not find a significant difference comparing data from Segment B between the 2020 and 2019 semesters. This could mean that the impact of campus closure was not strong enough to be detected, but it could also be caused by a lack of statistical power resulting from the smaller number of modules released in Segment B of 2019.

A similar increase in slope after campus closure was observed for the fraction of 1st attempts under 15 seconds for each OLM, which is indicative of students' planning actions during the Forethought Phase. This could suggest that more students planned to skip the self-assessment opportunities before starting each new module.
However, a similar trend was found for the same modules in Fall 2019, albeit to a lesser extent. This may imply that the observed change could in part be due to an increase in content difficulty or a reduction in engagement towards the end of the semester, unrelated to campus closure.

We found no significant differences before and after campus closure in a number of data indicators: the number of modules completed earlier than the due date, the module passing rate before and after study, and the number of module revisiting events. One exception was the fraction of students revisiting sequence S8, which was higher than average in Spring 2020, the opposite of the predicted impact of campus closure.

In summary, our results show that COVID-19 related campus closure and distance learning affected how students completed, or planned to complete, each module, pushing more students towards adopting goals and strategies that minimize the time and effort required to pass each module. On the other hand, we found little or no impact on overall engagement, student performance, or self-reflective processes.

It is worth noting that the fraction of students accessing each module decreased more rapidly in both Segments A and B in Fall 2019 when compared to Spring 2020. Additionally, the fluctuation in module access was much greater in 2019. This difference cannot be caused by the 2020 campus closure and is more likely due to differences in instructional conditions and student populations between the semesters, which could have a non-negligible impact on our analysis.

Despite the significant disruptions caused by the COVID-19 induced campus closure, the impact on students' SRL processes in online learning is smaller than what many have feared.
Even though an increasing number of students adjusted their plans and strategies towards conserving time and resources after campus closure, the fraction of students completing and passing the OLMs on time or early remained remarkably stable, as did the fraction of students who revisited a previously passed module.

This could imply that college students enrolled in a first-year physics course have stronger self-regulatory skills than we previously thought. It could also suggest that online learning may have provided students with the flexibility needed to adjust to unexpected disruptions, which is consistent with findings in [11]. Furthermore, it is possible that the mastery-based design of the OLM sequences facilitated students' SRL processes by providing frequent self-assessment opportunities during learning. Of course, many other factors, such as differences in instructional conditions and student populations, could also have contributed to the lack of observed differences.

The current paper provides a quick estimation of the impact of campus closure on students' SRL behavior in an online learning environment. Many of our decisions regarding data selection and analysis methods prioritized quickly detecting trends in the data over creating a comprehensive model. Those choices, while sufficient for the purposes of the current study, leave much room for improvement in future studies.

First, future studies could include more data, such as the duration of study and the number of practice problems. While containing rich information about students' learning, these are not included in this paper because their relation to the quality of learning is less straightforward.
Furthermore, incorporating data from multiple sources, such as the Achievement Goal Questionnaire-Revised [9], could be used to measure aspects of students' SRL (i.e., motivation) that are not well reflected in click-stream data.

Second, future studies should extend beyond the linear regression models used here to quickly estimate shifts in the data. More sophisticated models, such as linear mixed modeling or the ones used in [6], could account for a number of factors overlooked in the current study, including differences in topical difficulty between OLMs and OLM sequences, and differences in instructional policy choices between courses. Additionally, individual students' shifts in study strategies could be tracked for analysis on a finer scale.

Third, the validity of comparisons between Fall 2019 and Spring 2020 data is less than ideal, in large part due to differences in the number of modules assigned. Future studies involving the latest data from Fall 2020, during which the entire course was taught online, could provide a better baseline for comparison.

Finally, the current paper presents a case study that includes students from one class, studying one topic, using one type of online instructional design. A highly valuable direction for future research is to compare and contrast multiple studies involving different student populations, subject matter, and online instructional designs to obtain generalizable knowledge that will guide the design of future learning environments. Directed by new insights from such analyses, these systems can become not only more resilient to disruptions, but also more flexible in accommodating today's increasingly diverse student population [20].
ACKNOWLEDGMENTS
This work is supported by NSF Award No. DUE-1845436. We would like to thank the UCF Center for Distributed Learning for creating the UCF Open project and the Obojobo platform, in particular Dr. Francisca Yonekura, Ian Turgeon, and Zachary Berry.
REFERENCES
[1] [n.d.]. https://canvas.instructure.com/courses/1726856
[2] Vincent Aleven, Jonathan Sewall, Octav Popescu, Michael Ringenberg, Martin van Velsen, and Sandra Demi. 2016. Embedding Intelligent Tutoring Systems in MOOCs and e-Learning Platforms. In Intelligent Tutoring Systems, Alessandro Micarelli, John Stamper, and Kitty Panourgia (Eds.). Springer International Publishing, Cham, 409–415.
[3] B. Bloom. 1968. Learning for Mastery. Instruction and Curriculum. Regional Education Laboratory for the Carolinas and Virginia, Topical Papers and Reprints, Number 1. Evaluation Comment.
Physics Education Research Conference 2018 (PER Conference). Washington, DC.
[6] Zhongzhou Chen, Mengyu Xu, Geoffrey Garrido, and Matthew W. Guthrie. 2020. Relationship between students' online learning behavior and course performance: What contextual information matters? Physical Review Physics Education Research 16, 1 (2020), 010138. https://doi.org/10.1103/PhysRevPhysEducRes.16.010138
[7] Emma Dorn, Bryan Hancock, Jimmy Sarakatsannis, and Ellen Viruleg. 2020. COVID-19 and student learning in the United States: The hurt could last a lifetime. (2020), 14 pages.
[8] Rob Eisinga, Tom Heskes, Ben Pelzer, and Manfred Te Grotenhuis. 2017. Exact p-values for pairwise comparison of Friedman rank sums, with application to comparing classifiers. BMC Bioinformatics 18, 1 (2017), 1–18. https://doi.org/10.1186/s12859-017-1486-2
[9] Andrew J. Elliot and Kou Murayama. 2008. On the measurement of achievement goals: Critique, illustration, and application. Journal of Educational Psychology.
[10] American Association of Physics Teachers (AAPT), Virtual Conference, 143–148. https://doi.org/10.1119/perc.2020.pr.felker
[11] T. Gonzalez, M. A. de la Rubia, K. P. Hincz, M. Comas-Lopez, Laia Subirats, Santi Fort, and G. M. Sacha. 2020. Influence of COVID-19 confinement on students' performance in higher education. PLoS ONE 15, 10 (2020), e0239490. https://doi.org/10.1371/journal.pone.0239490
[12] Matthew W. Guthrie, Tom Zhang, and Zhongzhou Chen. 2020. A tale of two guessing strategies: interpreting the time students spend solving problems through online log data. In Physics Education Research Conference Proceedings. American Association of Physics Teachers (AAPT), Virtual Conference, 185–190. https://doi.org/10.1119/perc.2020.pr.guthrie
[13] Brianne Gutmann, Gary E. Gladding, Morten Lundsgaard, and Timothy Stelzer. [n.d.]. Mastery-style homework exercises in introductory physics courses: Implementation matters. Physical Review Physics Education Research ([n.d.]).
[14] Qiujie Li, Rachel Baker, and Mark Warschauer. 2020. Using clickstream data to measure, understand, and support self-regulated learning in online courses. Internet and Higher Education 45 (2020), 100727. https://doi.org/10.1016/j.iheduc.2020.100727
[15] Jorge Maldonado-Mahauad, Mar Pérez-Sanagustín, René F. Kizilcec, Nicolás Morales, and Jorge Munoz-Gama. 2018. Mining theory-based patterns from Big data: Identifying self-regulated learning strategies in Massive Open Online Courses. Computers in Human Behavior 80 (2018), 179–196. https://doi.org/10.1016/j.chb.2017.11.011
[16] Thorsten Pohlert. 2020. PMCMRplus: Calculate Pairwise Multiple Comparisons of Mean Rank Sums Extended. R package version 1.6.1. https://CRAN.R-project.org/package=PMCMRplus
[17] R Core Team. 2019. R: A Language and Environment for Statistical Computing.
[18] Daniel L. Schwartz and John D. Bransford. 2005. Efficiency and Innovation in Transfer. In Transfer of Learning from a Modern Multidisciplinary Perspective (Current Perspectives on Cognition, Learning and Instruction), Jose P. Mestre (Ed.). IAP - Information Age Publishing Inc., Charlotte, 1–51.
[19] Michelle Taub, Roger Azevedo, Amanda E. Bradbury, Garrett C. Millar, and James Lester. 2018. Using sequence mining to reveal the efficiency in scientific reasoning during STEM learning with a game-based learning environment. Learning and Instruction 54 (2018), 93–103. https://doi.org/10.1016/j.learninstruc.2017.08.005
[20] U.S. Department of Education and Office of Educational Technology. 2017. Reimagining the Role of Technology in Higher Education. (2017).
[21] Hadley Wickham, Mara Averick, Jennifer Bryan, Winston Chang, Lucy D'Agostino McGowan, Romain François, Garrett Grolemund, Alex Hayes, Lionel Henry, Jim Hester, Max Kuhn, Thomas Lin Pedersen, Evan Miller, Stephan Milton Bache, Kirill Müller, Jeroen Ooms, David Robinson, Dana Paige Seidel, Vitalie Spinu, Kohske Takahashi, Davis Vaughan, Claus Wilke, Kara Woo, and Hiroaki Yutani. 2019. Welcome to the tidyverse. Journal of Open Source Software 4, 43 (2019), 1686. https://doi.org/10.21105/joss.01686
[22] Bethany R. Wilcox and Michael Vignal. 2020. Understanding the student experience with emergency remote teaching. American Association of Physics Teachers (AAPT), Virtual Conference, 581–586. https://doi.org/10.1119/perc.2020.pr.wilcox
[23] Barry J. Zimmerman. 2013. From Cognitive Modeling to Self-Regulation: A Social Cognitive Career Path. Educational Psychologist 48, 3 (2013), 135–147. https://doi.org/10.1080/00461520.2013.794676
[24] Barry J. Zimmerman. 2015.