Online Administration of Research-Based Assessments
Ben Van Dusen, Mollee Shultz, Jayson M. Nissen, Bethany R. Wilcox, N.G. Holmes, Manher Jariwala, Eleanor W. Close, Steven Pollock
School of Education, Iowa State University, Ames, IA 50011, USA
Department of Physics, Texas State University, San Marcos, TX 78666, USA
Nissen Education Research and Design, Corvallis, OR 97333, USA
Department of Physics, University of Colorado Boulder, Boulder, CO 80309, USA
Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY 14850, USA
Department of Physics, Boston University, Boston, MA 02215, USA
The number and use of research-based assessments (RBAs) have grown significantly over the last several decades. Data from RBAs can be compared against national datasets to provide instructors with empirical evidence on the efficacy of their teaching practices. Many physics instructors, however, opt not to use RBAs due to barriers such as having to use class time to administer them. In this article we examine how these barriers can be mitigated through online administration of RBAs, particularly through the use of free online RBA platforms that automate administering, scoring, and analyzing RBAs (e.g., the Learning About STEM Student Outcomes [LASSO] [1], Colorado Learning Attitudes About Science Survey for Experimental Physics [E-CLASS] [2], Physics Lab Inventory of Critical thinking [PLIC] [3], and PhysPort DataExplorer [4] platforms). We also explore the research into common concerns of administering RBAs online and conclude with a practical how-to guide for instructors.
Keywords: Research-Based Assessments, technology, student outcomes
I. INTRODUCTION
Research-based assessments (RBAs) are instruments designed to measure the impact of a course on student outcomes, such as content knowledge, attitudes, and identities. Physics instructors and researchers have used RBAs (e.g., the Force Concept Inventory [5]) to examine the impact of courses and to inform research-based pedagogical practices [6-19]. Unlike other course exams, the common usage of standardized RBAs across institutions uniquely enables instructors to compare their student outcomes over time or against multi-institutional datasets, and it also supports large-scale PER investigations.

Using RBAs began as an activity in which only a few instructors engaged, particularly those interested in PER. Instructors face several logistical barriers to administering RBAs, but, as the number and use of RBAs have expanded, so have resources that support instructors in using them. In particular, online platforms are now available to help instructors throughout the process, from selecting an RBA (e.g., PhysPort [20]) to administering, scoring, and analyzing it (e.g., the Learning About STEM Student Outcomes [LASSO] [1], Colorado Learning Attitudes About Science Survey for Experimental Physics [E-CLASS] [2], and Physics Lab Inventory of Critical thinking [PLIC] [3] platforms). These online platforms remove or lower the barriers to using RBAs, making it easy for even first-time users to measure their students' outcomes systematically. As more courses transition to being offered online, these web-based systems are increasingly useful.

We hope that this paper can serve as a guide for instructors considering administering RBAs online. In what follows, we examine common barriers to using RBAs, how online administration can ameliorate those barriers, and the research into online administration of RBAs. We also include a practical how-to for administering RBAs online, and, in the appendix, we include sample email wording for pretest and posttest administrations.
II. BARRIERS TO USING RBAS ON PAPER, AND ONLINE SOLUTIONS
Below we have listed common reasons instructors give for choosing not to use RBAs during class, along with explanations of how online administration addresses these concerns.
I can't spare 30+ minutes of class time twice in a semester to give an RBA.
Administering RBAs online allows students to complete RBAs either at home or in class. Studies have found that, with sufficient incentives, students' participation and scores are the same whether completed in class or at home (see the discussion in the next section).
I don't have the time or TA power to score an RBA.
Administering the RBA online removes the step of scoring scantrons or paper surveys and automatically generates spreadsheets of student responses that can be quickly and easily analyzed. Online RBA platforms (e.g., LASSO [1], E-CLASS [2], PLIC [3], and PhysPort DataExplorer [4]) can automate the scoring process altogether, providing instructors with full student responses and scored responses.
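As a rough illustration of what that automation replaces, the following Python sketch scores a spreadsheet of responses against an answer key. The file name, column layout (a student_id column plus one column per question), and the key itself are hypothetical, not any platform's actual format.

# Illustrative sketch only: the kind of scoring an online RBA platform
# automates. File name, columns, and answer key are hypothetical.
import csv

ANSWER_KEY = {"Q1": "B", "Q2": "D", "Q3": "A"}  # hypothetical key

def score_responses(path):
    """Return a dict mapping student ID to percent correct."""
    scores = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            correct = sum(
                row.get(q, "").strip().upper() == ans
                for q, ans in ANSWER_KEY.items()
            )
            scores[row["student_id"]] = 100 * correct / len(ANSWER_KEY)
    return scores

if __name__ == "__main__":
    for student, pct in score_responses("posttest_responses.csv").items():
        print(f"{student}: {pct:.0f}%")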
I need an online version of the assessment and can't spare the time to set this up myself.
Online RBA platforms already host and administer a wide array of physics RBAs for free.
I don’t know what my results mean.
Online RBA platforms can automatically generate reports that include visualizations and summary statistics that contextualize student outcome data. This can help instructors make sense of their students' performance and inform concrete changes to their instruction.
I don’t have access to any comparison data.
Online RBA platforms can standardize data formats, making it easy to compare or combine course data. These platforms collect course metadata that can also be used to identify appropriate comparison points for a wide range of courses and institutions. They can also automatically aggregate and anonymize datasets to support PER in performing large-scale, multi-institution investigations.
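As a sketch of what anonymization before aggregation can involve (the platforms' internal pipelines are not described here, so the approach, column names, and salt below are assumptions), one simple method replaces each student identifier with a salted one-way hash:

# Illustrative sketch only: replacing student identifiers with salted
# hashes before aggregating data. Column names and salt are hypothetical.
import csv
import hashlib

SALT = "course-specific-secret"  # hypothetical; keep private, fixed per course

def anonymize(in_path, out_path):
    """Copy a response file, replacing student_id with a one-way hash."""
    with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            digest = hashlib.sha256((SALT + row["student_id"]).encode())
            row["student_id"] = digest.hexdigest()[:12]
            writer.writerow(row)

anonymize("responses.csv", "responses_anonymized.csv")

Because the hash is one-way, a pooled dataset can still link a given student's pretest and posttest records without exposing that student's identity.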
III. BARRIERS TO USING RBAS ONLINE, AND RESEARCH-BASED RESPONSES
Moving an RBA online for at-home completion, particularly one targeting content knowledge as opposed to attitudes and beliefs, brings with it several potential concerns, especially around student engagement, test security, and the use of unauthorized resources. Below, we articulate some of these concerns and summarize research findings that begin to address them. Note that the findings discussed here represent a snapshot of the current state of understanding about student engagement with online RBAs; as the use of online tests becomes more common and norms change, these findings may become less generalizable.
Does giving the test online impact how many and which of my students participate?
Low-stakes RBAs administered online have yielded similar participation rates as the equivalent paper tests administered in class [21]. In an experiment where researchers randomly assigned students (N = 1310) at one institution to take the same RBA online outside of class versus on paper inside class, participation rates were comparable when instructors administered the RBAs using the recommended practices described in Sec. IV (also accessible at [22]). Moreover, the participation rates did not differ between online and in-person administrations based on gender or final course grade [21]. Incentive structures strongly influence participation rates; for example, another study [23] found an increase in online participation rates compared to historical norms, attributed to changes in incentives (explicit credit for participation when administered online).

Does taking the test online impact the score for my course, and can I compare my scores from online versions to my scores from in-person versions or published results?
In the first study described above [21], researchers found that student performance on the online, computer-based tests was equivalent to performance on the same tests administered on paper during class. This result held for both concept inventory tests and attitudinal surveys, suggesting that instructors can compare results from online and in-person administrations. The second study described above [23] found slightly lower online scores relative to historical data sets. The authors attributed this effect to the increased participation rate of lower-performing students in online assessments compared with in-person assessments. This result suggests a reduction in the common sampling bias toward higher-performing students and would make the scores more representative.
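When making such comparisons, matched pretest and posttest averages are commonly summarized with the normalized gain introduced by Hake [17],

\[ \langle g \rangle = \frac{\langle \mathrm{post} \rangle - \langle \mathrm{pre} \rangle}{100\% - \langle \mathrm{pre} \rangle}, \]

so that, for example, class averages of 30% on the pretest and 65% on the posttest give \( \langle g \rangle = (65 - 30)/(100 - 30) = 0.5 \). Because online and paper scores were found to be equivalent, gains computed from online administrations can reasonably be compared against published paper-based values.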
What if my students use the internet to look up the answers to the questions?
In a study examining students' behaviors when taking research-based assessments online [23], researchers found that only ∼10% of students showed direct evidence of copying question text, potentially intending to search the text to find the correct answer online. For tests with solutions readily available online, this behavior correlated with increased performance, while for tests without available solutions, it correlated with lower performance. However, because the proportion of students engaging in these behaviors was small, the impact on the overall average for the course was not significant. These findings align with other findings [21] about the lack of impact on performance associated with administering an assessment online.
What if my students get distracted and don't take the test seriously?
Researchers have used browser focus data (i.e., how often and for how long the assessment tab becomes hidden on the student's screen) to determine how common distraction might be during online RBAs. One such study [23] found that browser focus data indicated that between half and two-thirds of students lost focus on the assessment at least once, though the majority of these events (two-thirds) were less than 1 min in duration. Additionally, neither the number nor the duration of focus-loss events correlated with students' scores. Thus, in that study, there was no apparent negative impact on students' scores due to distraction in the online environment.
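To give a sense of what such an analysis involves, the following Python sketch summarizes the number and longest duration of focus-loss events per student from a hypothetical event log (one row per event, with timestamps in seconds from the start of the test); the actual instrumentation and analysis in [23] may differ.

# Illustrative sketch only: summarizing browser focus-loss events.
# The event-log format is hypothetical.
import csv
from collections import defaultdict

def summarize_focus_loss(path):
    """Return {student_id: (number of events, longest event in seconds)}."""
    events = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            duration = float(row["regained_focus_at"]) - float(row["lost_focus_at"])
            events[row["student_id"]].append(duration)
    return {sid: (len(ds), max(ds)) for sid, ds in events.items()}

for sid, (n, longest) in summarize_focus_loss("focus_events.csv").items():
    print(f"{sid}: {n} events, longest {longest:.0f} s")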
What if my students save the test and post it online?
Security of research-based assessments is an issue that becomes particularly important when administering the assessments online, and, in practice, the nature of these concerns depends on the assessment in question. For example, well-used introductory assessments such as the FMCE or BEMA are already available online on paid sites such as Chegg or CourseHero [23]. Less well-used or newer assessments do not appear to have worked solutions available online to date. In one study, very few students (less than 1-2%) attempted to save the test using print commands during online assessments [23]. However, it is likely inevitable that questions (and solutions) will become increasingly available to students over time. This makes it all the more important that faculty keep these assessments low-stakes and ungraded and provide appropriate instructions to motivate students to take the assessment in the intended spirit, as a learning tool (see the next section).
IV. PRACTICAL HOW-TO
Many of the strategies for implementing RBAs on paper [24] also apply to implementing RBAs online, though there are some unique strategies.
Provide a points-based incentive:
For points-based incentives, the point value should be small enough to keep participation low-stakes, but large enough to meaningfully motivate participation. We recommend on the order of a few percent of the total grade or the equivalent of a small portion of a homework assignment. Only give points for participation, not for correctness. Be sure to emphasize this to students through your class communications and in your syllabus.
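As a concrete illustration (the grade breakdown here is hypothetical): in a course where homework counts for 20% of the grade spread across 12 assignments, credit equal to half of one assignment is

\[ \frac{20\%}{12} \times \frac{1}{2} \approx 0.8\% \]

of the final grade per administration, or roughly 1.7% for a pretest/posttest pair: enough to motivate participation while keeping the stakes low.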
Provide multiple reminders [25]:
If students are completing the surveys in their own time (not during class), we recommend sending multiple email reminders leading up to the deadline and making multiple announcements during class time. The repeat announcements help catch students who may have missed the first notification and indicate to students that you, as an instructor, value their participation. Because participation is often higher on pretests than posttests [21], we suggest sending more notifications and re-emphasizing incentives at posttest.
Use dedicated class time:
Some instructors may be uncomfortable providing points-based incentives, in which case it may be easier to use class time. Research has indicated that participation rates are similar whether students complete the RBAs during or outside of class time (given appropriate incentives). While not necessary, it is still appropriate to use time during a scheduled class session for students to complete the surveys online. To administer instruments online, instructors can share the link during the class session (such as during the first lab or tutorial).
Communicate the goals:
When announcing the RBA to students, explicitly explain that the goal is to obtain important information about the course and the instruction to better serve the students in the class (in the present and the future). Briefly describe the benefit of their participation to you, the instructor (in terms of feedback for the course), and to them, the students (in terms of study opportunities). Explicitly state that the goal is not to evaluate the students individually. Encourage students to answer all the questions, even if they are not confident in their responses, so you can adapt instruction accordingly.
Refer to the surveys through generic names:
When describing the RBA, use a generic name (such as "course survey") rather than the official instrument name so students can less easily search for the instrument online.
Avoid enforcing time limits:
Although many RBAs have recommended time limits, placing time limits on the instruments themselves can increase students' test anxiety and sense of higher stakes. Not placing strict time limits can also mitigate technical difficulties students may face with online administration.
Do not share the solutions, answers, or students' scores:
This helps maintain the security of the RBA so the community can continue to use it [14]. Providing scores can motivate students to want to know the solutions to the problems. The scores themselves are likely difficult for students to interpret. For example, pre-scores are typically quite low (in some cases post-scores as well), which could demoralize students without the appropriate context. Furthermore, RBA scores are only informative at the group level, not for individual students, making individual scores less useful.
V. CONCLUSIONS
RBAs are useful measures of the impact of a course on students' conceptual learning or attitudes/beliefs and have been a major driver of change in physics education. Many physics instructors, however, do not use RBAs for a variety of reasons. Online administration of RBAs can remove many of the barriers to administering, scoring, and analyzing RBA results. Researchers have found that instructors can get similar amounts and quality of RBA data whether they administer them in class or online. Further, research to date has found minimal impact on student scores from the use of unauthorized resources, and little evidence of students compromising assessment security when RBAs are administered online. Instructors can administer RBAs online by setting up their own online version (e.g., in their learning management system) or by using existing online RBA administration platforms (e.g., LASSO [1], E-CLASS [2], and PLIC [3]). These platforms are free and offer several advantages, such as lowering setup time, automatically generating reports, and contributing to large-scale PER investigations.
VI. REFERENCES

[1] Learning Assistant Alliance, "Learning About STEM Student Outcomes (LASSO) platform," (2020).
[2] Lewandowski Group, "E-CLASS: Colorado Learning Attitudes About Science Survey for Experimental Physics," (2020).
[3] Cornell Physics Education Research Lab, "Physics Lab Inventory of Critical thinking," (2020).
[4] PhysPort, "DataExplorer," (2020).
[5] David Hestenes, Malcolm Wells, and Gregg Swackhamer, "Force Concept Inventory," The Physics Teacher, 141-158 (1992).
[6] Eric Mazur, "Peer instruction," (2013).
[7] Ruth W. Chabay and Bruce A. Sherwood, Matter and Interactions (John Wiley & Sons, 2015).
[8] Adrian Madsen, Sarah B. McKagan, Eleanor C. Sayre, and Cassandra A. Paul, "Resource Letter RBAI-2: Research-based assessment instruments: Beyond physics topics," American Journal of Physics, 350-369 (2019).
[9] Adrian Madsen, Sarah B. McKagan, and Eleanor C. Sayre, "Resource Letter RBAI-1: Research-based assessment instruments in physics and astronomy," American Journal of Physics, 245-264 (2017).
[10] Joshua Von Korff, Benjamin Archibeque, K. Alison Gomez, Tyrel Heckendorf, Sarah B. McKagan, Eleanor C. Sayre, Edward W. Schenk, Chase Shepherd, and Lane Sorell, "Secondary analysis of teaching methods in introductory physics: A 50k-student study," American Journal of Physics, 969-974 (2016).
[11] Catherine H. Crouch and Eric Mazur, "Peer instruction: Ten years of experience and results," American Journal of Physics, 970-977 (2001).
[12] Bethany R. Wilcox and Steven J. Pollock, "Coupled multiple-response versus free-response conceptual assessment: An example from upper-division physics," Physical Review Special Topics - Physics Education Research, 020124 (2014).
[13] Ben Van Dusen and Jayson Nissen, "Equity in college physics student learning: A critical quantitative intersectionality investigation," Journal of Research in Science Teaching, 33-57 (2020).
[14] Edward F. Redish, "Teaching physics with the Physics Suite," (2004).
[15] David E. Meltzer and Ronald K. Thornton, "Resource Letter ALIP-1: Active-learning instruction in physics," American Journal of Physics, 478-496 (2012).
[16] National Research Council et al., Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering (National Academies Press, 2012).
[17] Richard R. Hake, "Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses," American Journal of Physics, 64-74 (1998).
[18] Lin Ding, Ruth Chabay, and Bruce Sherwood, "How do students in an innovative principle-based mechanics course understand energy concepts?" Journal of Research in Science Teaching, 722-747 (2013).
[19] Stephanie V. Chasteen, Bethany Wilcox, Marcos D. Caballero, Katherine K. Perkins, Steven J. Pollock, and Carl E. Wieman, "Educational transformation in upper-division physics: The Science Education Initiative model, outcomes, and lessons learned," Physical Review Special Topics - Physics Education Research, 020110 (2015).
[20] PhysPort, "PhysPort," (2020).
[21] Jayson M. Nissen, Manher Jariwala, Eleanor W. Close, and Ben Van Dusen, "Participation and performance on paper- and computer-based low-stakes assessments," International Journal of STEM Education, 020145 (2019).
[24] Adrian Madsen, Sarah B. McKagan, and Eleanor C. Sayre, "Best Practices for Administering Concept Inventories," The Physics Teacher, 530-536 (2017).
[25] We have included a sample script for an in-class or email announcement in the appendix.

VII. APPENDIX: SAMPLE SCRIPT
Suggested script for an in-class or email announcement:
Pre-test:
You will receive emails from me with links to different pre-test surveys (a concepts survey and a nature of physics survey), which are part of your first homework assignment. We ask you to answer these pre-test questions to give us a better idea of your understanding of physics before the class begins. We use your responses to the survey to tailor our instruction in the course - not to evaluate you. If you are unsure about your answers, do not worry - this is useful information for us. Please try your best to answer the questions without help from any textbooks or anyone else. You will receive full credit (equal to one-half of a homework assignment) just for answering all the questions, right or wrong. Please complete the two pre-tests by ...