Impact of evidence-based flipped or active-engagement non-flipped courses on student performance in introductory physics
ARTICLE
Nafis I. Karim, Alexandru Maries, and Chandralekha Singh
Abstract:
We describe the impact of physics education research-based pedagogical techniques in flipped and active-engagement non-flipped courses on student performance on validated conceptual surveys. We compare student performance in courses that make significant use of evidence-based active engagement (EBAE) strategies with courses that primarily use lecture-based (LB) instruction. All courses had large enrollment, often 100–200 students. The analysis of data for validated conceptual surveys presented here includes data from large numbers of students in two-semester sequences of introductory algebra-based and calculus-based physics courses. The conceptual surveys used to assess student learning in the first- and second-semester courses were the Force Concept Inventory and the Conceptual Survey of Electricity and Magnetism, respectively. In the research discussed here, the performance of students in EBAE courses at a particular level is compared with LB courses in two situations: (i) the same instructor taught two courses, one of which was a flipped course involving EBAE methods and the other an LB course, while the homework, recitations, and final exams were kept the same; (ii) student performance in all of the EBAE courses taught by different instructors was averaged and compared with LB courses of the same type, also averaged over different instructors. In all cases, we find that students in courses that make significant use of active-engagement strategies, on average, outperformed students in courses of the same type using primarily LB instruction on conceptual surveys, even though there was no statistically significant difference on the pretest before instruction. We also discuss the correlation between performance on the validated conceptual surveys and the final exam, which typically placed a heavy weight on quantitative problem solving.

Key words: evidence-based, active-engagement, flipped classes, just-in-time teaching, physics education research.
Received 10 March 2017. Accepted 4 August 2017.
N.I. Karim and C. Singh.
Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, PA 15260, USA.
A. Maries.
Department of Physics, University of Cincinnati, Cincinnati, OH 45221, USA.
Corresponding author:
Chandralekha Singh (email: [email protected]). This paper is part of a special issue to honor Dr. Ursula Franklin. Copyright remains with the author(s) or their institution(s). Permission for reuse (free in most cases) can be obtained from RightsLink.
Pagination not final (cite DOI) / Pagination provisoire (citer le DOI). Can. J. Phys. Downloaded from nrcresearchpress.com by UNIVERSITY OF PITTSBURGH. For personal use only.
1. Introduction

In the past two decades, physics education research has identified the challenges that students encounter in learning physics at all levels of instruction [1–15]. Building on these investigations, researchers are developing, implementing, and evaluating evidence-based curricula and pedagogies to reduce these challenges, to help students develop a coherent understanding of physics concepts, and to enhance their problem solving, reasoning, and meta-cognitive skills [16–27]. In evidence-based curricula and pedagogies, the learning goals and objectives, instructional design, and assessment of learning are aligned with each other, and there is a focus on evaluating whether the pedagogical approaches employed have been successful in meeting the goals and enhancing student learning. One highly successful model of learning is the field-tested cognitive apprenticeship model [28]. According to this model, students can learn effectively if the instructional design involves three essential components: "modeling", "coaching and scaffolding", and "weaning". In this approach, "modeling" means that the instructional approaches demonstrate and exemplify the criteria for good performance and the skills that students should learn (e.g., how to solve physics problems systematically). "Coaching and scaffolding" means that students receive appropriate guidance and support as they actively engage in learning the content and skills necessary for good performance. "Weaning" means reducing the support and feedback gradually to help students develop self-reliance [28].
In traditional physics instruction, especially at the college level, there is often a lack of coaching and scaffolding: students come to class where the instructor lectures and does some example problems; then students are left on their own to work through homework with little or no feedback. This is akin to a piano instructor demonstrating for the students how to play the piano and then asking students to go home and practice. This lack of prompt feedback and scaffolding can be detrimental to learning. Some of the commonly used evidence-based active engagement (EBAE) approaches implemented in physics include peer instruction with clickers, popularized by Eric Mazur of Harvard University [29–32], tutorial-based instruction in introductory and advanced courses [33–35], and collaborative group problem solving [36–39] (e.g., using context-rich problems [11–12]). In all of these evidence-based approaches, formative assessment plays a critical role in student learning [40]. Formative assessment tasks are frequent, low-stakes assessment activities that give feedback both to students and to instructors about what students have learned at a given point. Using frequent formative assessments helps make the learning goals of the course concrete to students and provides them with a way to track their progress in the course with respect to these learning goals. When formative assessment tasks, such as concept tests, tutorials, and collaborative group problem solving, are interspersed throughout the course, the distinction between teaching and learning is blurred [40–41]. Moreover, technology is increasingly being exploited for pedagogical purposes to improve student learning. For example, just-in-time teaching (JiTT) is an instructional approach in which instructors receive feedback from students before class and use that feedback to tailor in-class instruction [42–44].
Typically, students complete an electronic pre-lecture assignment in which they give feedback to the instructor regarding any difficulties they have had with the assigned reading material, lecture videos, and (or) other self-paced instructional tools. The instructor then reviews student feedback before class and makes adjustments to the in-class activities. For example, Eric Mazur's Perusall system [45] allows students to read the textbook and ask questions electronically, and the system uses their questions to draft a "confusion report" that distills their questions to the three most common difficulties. Then, during class, students may engage in discussions with the instructor and with their classmates, and the instructor may then adjust the next pre-lecture assignment based on the progress made during class. It has been hypothesized that JiTT may help students learn better because out-of-class activities cause students to engage with and reflect on the parts of the instructional material they find challenging. In particular, when the instructor focuses in lecture on student difficulties that were found via electronic feedback before class, it may create a "time for telling" [46], especially because students may be "primed to learn" better when they come to class if they have struggled with the material during pre-lecture activities.
The JiTT approach is often used with peer discussion and (or) collaborative group problem solving interspersed with lectures in the classroom. In addition, in the last decade, the JiTT pedagogy has been extended a step further with the maturing of technology [47–66], and "flipped" classes with no in-class lectures have become common, with instructors asking students to engage with short lecture videos and concept questions associated with each video outside of class and using the entire class time for active engagement. The effectiveness of flipped classes in enhancing student learning can depend on many factors, including the degree to which evidence-based pedagogies that build on students' prior knowledge and actively engage them in the learning process are used, whether there is sufficient buy-in from students, and the incentives that are used to get students engaged with the learning tools both inside and, equally importantly, outside the classroom. Moreover, research suggests that effective use of peer collaboration can enhance student learning in many instructional settings in physics classes, including in JiTT and flipped environments, and with various types and levels of student populations. Although the details of implementation vary, students can learn from each other in many different environments. Integration of peer interaction with lectures has been popularized in the physics community by Mazur. In Mazur's approach [67], the instructor poses concrete conceptual problems in the form of conceptual multiple-choice clicker questions to students throughout the lecture, and students discuss their responses with their peers.
Heller et al. showed that collaborative problem solving with peers in the context of quantitative "context-rich" problems [11–12] can be valuable both for learning physics and for developing effective problem-solving strategies, while others [68–70] have developed additional instructional strategies designed to help students develop a coherent knowledge structure of physics. Cognitive apprenticeship [28] is one framework that can be used to understand why the EBAE instructional strategies that take advantage of peer discussion and collaboration may be successful in helping students learn. The EBAE pedagogies provide instructors with an opportunity to receive feedback on common student difficulties. The instructors often use this feedback to adjust their in-class activities to effectively build on students' prior knowledge, thus providing students with the necessary coaching and scaffolding to help them learn. Peer discussion also provides students with an opportunity to be coached by their peers, who may be able to discern their difficulties even better than the instructor, and carefully designed targeted feedback from the instructor after the peer discussion can provide appropriate scaffolding.
In this study, we used the Force Concept Inventory (FCI) [71] in the first-semester courses and the Conceptual Survey of Electricity and Magnetism (CSEM) [72] in the second-semester courses to assess student learning. The FCI, CSEM, and other standardized physics surveys [71–78] have been used to assess introductory students' understanding of physics concepts by a variety of educators
and physics education researchers. One reason for their extensive use is that many of the items on the surveys have strong distractor choices that correspond to students' common difficulties, so students are unlikely to answer the survey questions correctly without having good conceptual understanding. In the research discussed here, the performance of students in EBAE courses at a particular level is compared with primarily LB courses in two situations: (i) the same instructor taught two courses, one of which was a flipped course involving EBAE methods and the other an LB course, while the homework and final exams were kept the same; (ii) student performance in all of the EBAE courses taught by different instructors was averaged and compared with primarily LB courses of the same type, also averaged over different instructors. Whenever differences between these two groups were observed (with students in EBAE courses performing better than students in the LB courses), we investigated which students were benefitting most from the EBAE courses (e.g., those who performed well or poorly on the pretest given at the beginning of the course). Finally, we were also interested in the typical correlation between the performance of students on the validated conceptual surveys and their performance on the final exam, which typically places a heavy weight on quantitative physics problems.
We compare introductory physics student performance in EBAE flipped and active-engagement non-flipped courses with LB courses, drawing inspiration from several theoretical frameworks. The overarching framework used for the instructional design of all of the EBAE courses in this study (whether flipped or active-engagement non-flipped) was the cognitive apprenticeship model [28, 79, 80]. This framework focuses on providing opportunities to coach students and scaffold their learning. All of the EBAE classes were designed to give students similar coaching and scaffolding to develop their problem-solving and reasoning skills. The EBAE courses focused on the cognitive approach to instructional design for various learning units and on building on students' prior knowledge to help them learn better. For example, Piaget's framework [81], which emphasizes an "optimal mismatch" between what a student knows and where the instruction should be targeted for desired assimilation and accommodation of knowledge to occur, was helpful in developing the instructional design. A related framework is the theory of conceptual change put forth by Posner et al. [82]. In this framework, conceptual changes or "accommodations" can occur when the existing conceptual understanding of students is not sufficient for, or is inconsistent with, new phenomena they are learning about. These frameworks also suggest that these accommodations can be very difficult for students, particularly when students are firmly committed to their prior understanding, unless the instructional design explicitly accounts for these difficulties. The model suggests that it is important for instructors to be knowledgeable about student ideas (e.g., ideas that students may apply in inappropriate contexts to make incorrect inferences while solving physics problems).
Within this framework, students can be motivated by an anomaly that provides a cognitive conflict and illustrates how their conceptions are inadequate for explaining a newly encountered physical situation, so that they become dissatisfied with their current understanding of concepts and improve their understanding. Taking inspiration from these frameworks, EBAE instructors tried to focus on student conceptions and their difficulties in learning physics to design instruction that produces the desired cognitive conflict and learning.
2. Methodology
The participants in this study were students in 16 different algebra-based and calculus-based introductory physics courses (more than 1500 students in first-semester courses and more than 1200 students in second-semester courses) at a typical large research university in the US (University of Pittsburgh). The courses fall into three categories:

1. An LB course is one in which the primary mode of instruction is via lecture. In addition to the three or four weekly hours of lectures, students attended an hour-long recitation section taught by a graduate TA. In recitation, the TA typically answered student questions (mainly about their homework problems, which were mostly textbook-style quantitative problems), solved problems on the board, and gave students a quiz in the last 10–20 min.

2. A flipped course is one in which the class was broken up into two almost equal-size groups, with each group meeting with the instructor for half the regular class time. For example, for a 200-student class scheduled to meet for four hours each week (on two different days), the instructor met with half the class (100 students) on the first day and the other half on the second day. This was possible in the flipped classes because the total weekly contact hours for each instructor with the students were the same as in the corresponding LB courses. Students watched the lecture videos before coming to class and answered some conceptual questions that were based upon the lecture video content. They uploaded the answers to those conceptual questions to the course website before class and were graded for a small percentage of their grade (typically 4%–8%). Although students had to watch several videos outside of class in preparation for each class, each video was typically 5–10 min long, followed by concept questions.
On average, students in a flipped class had to watch recorded videos that took a little less than half the allotted weekly class time (e.g., for the courses scheduled for four hours each week, students watched on average 1.5 h of videos each week, and in the courses scheduled for three hours each week, students watched around 1 h of videos). These video times do not include the time that students would take to rewind a video, stop and think about the concepts, and answer the concept questions embedded after the videos that counted toward their course grade. In the spirit of JiTT, the instructors of the flipped courses adjusted the in-class activities based upon student responses to online concept questions, which were supposed to be submitted the night before the class. About 90% of the students submitted their answers to the concept questions that followed the videos to the course website before coming to class. The web platforms used for managing, hosting, and sharing these videos and for having asynchronous online discussions with students about them (in which students and the instructor participated) were Classroom Salon or Panopto. In-class time was used for clicker questions involving peer discussion followed by a whole-class discussion of the concept tests; collaborative group problem solving involving quantitative problems in which 2–3 students worked in a group (followed by a clicker question about the order of magnitude of the answer to the quantitative problem on which students worked collaboratively); and lecture demonstrations with preceding clicker questions on the same concepts. In addition to the regular class times, students attended an hour-long recitation section, which was taught the same way as for students in the LB courses. It is important to note that the instructors who taught the flipped courses also taught LB courses at the same time (usually teaching two courses in a particular semester: one flipped
Published by NRC Research Press
and one LB). Students in both flipped and LB courses completed the same homework and took the same final exam. For the calculus-based flipped courses, the students also took the same midterm exams. This was not possible for the algebra-based courses because the exams were scheduled at different times. However, in the algebra-based courses they took the same final exam and had the same homework. Additionally, the instructors attempted to make the actual delivery of content (done via videos in the flipped courses and via in-class lecture in the LB courses) very similar. Essentially, the content of the videos was delivered in class in the LB courses.

3. An EBAE interactive non-flipped course is one in which the instructor combined lectures with research-based pedagogies, including clicker questions with peer discussion, conceptual tutorials, collaborative group problem solving, and lecture demonstrations with preceding clicker questions on the same concepts, similar to the flipped courses. In addition, students attended a reformed recitation, which primarily used context-rich problems to get students to engage in group problem solving, or worked on research-based tutorials while being guided by a TA. The instructor ensured that the problems students solved each week in the recitation activities were closely related to what happened in class. Students also worked on some research-based tutorials during class in small groups, but if they did not finish them in the allotted time, they were asked to complete them as homework.
The materials used in this study are the conceptual FCI and CSEM multiple-choice (five choices for each question) standardized surveys, which were administered in the first week of classes before instruction in relevant concepts (pretest) and then after instruction in relevant concepts (post-test). Apart from the data on these surveys that the researchers collected from all of these courses, each instructor administered their own final exam, which was mostly quantitative (60%–90% of the questions were quantitative, although some instructors had either the entire final exam or part of it in a multiple-choice format with five options for each question to make grading easier). Ten course instructors (who also provided the FCI or CSEM data from their classes) provided their students' final exam scores, and most of them also provided a copy of their final exam.
Our main goal in this investigation was to compare the average performance of students in introductory physics courses that used EBAE pedagogies with the average performance of students in LB courses by using standardized conceptual surveys, the FCI (for physics I) and CSEM (for physics II), as pre- and post-tests. We not only calculated the average gain (post-test − pretest scores) for each group, but also calculated the average normalized gain, which is commonly used to determine how much the students learned from pretest to post-test, taking into account their initial scores on the pretest. It is defined as ⟨g⟩ = (%⟨S_f⟩ − %⟨S_i⟩)/(100 − %⟨S_i⟩), in which ⟨S_f⟩ and ⟨S_i⟩ are the final (post) and initial (pre) class averages, respectively. Then, Normg = 100⟨g⟩ in percent [16]. This normalized gain provides valuable information about how much students have learned by taking into account what they already know based on the pretest. We wanted to investigate whether the normalized gain is higher in one course compared to another. To compare EBAE courses with LB courses, we performed t-tests [83] on FCI or CSEM pre- and post-test data. We also calculated the effect size in the form of Cohen's d, defined as d = (μ₁ − μ₂)/σ_pooled, where μ₁ and μ₂ are the averages of the two groups being compared (e.g., EBAE versus LB) and σ_pooled = √((σ₁² + σ₂²)/2) (here σ₁ and σ₂ are the standard deviations of the two groups being compared). Moreover, although we did not have control over the type of final exam each instructor used in their courses, we wanted to look for correlation between the FCI or CSEM post-test performance and the final exam performance for different instructors in the algebra-based and calculus-based EBAE or LB courses.
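The two statistics just defined can be sketched in a few lines of Python. This is a minimal illustration of the formulas in the text, not the authors' analysis code; the function names and the example numbers are ours.

```python
import math

def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Class-average normalized gain <g> = (%<S_f> - %<S_i>) / (100 - %<S_i>),
    with class averages expressed in percent (0-100)."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

def cohens_d(mean_1: float, mean_2: float, sd_1: float, sd_2: float) -> float:
    """Effect size d = (mean_1 - mean_2) / sigma_pooled, with
    sigma_pooled = sqrt((sd_1^2 + sd_2^2) / 2) as defined in the text."""
    sigma_pooled = math.sqrt((sd_1 ** 2 + sd_2 ** 2) / 2.0)
    return (mean_1 - mean_2) / sigma_pooled

# Illustrative numbers only (not data from the study):
print(normalized_gain(40.0, 70.0))  # -> 0.5, i.e., Normg = 50%
print(round(cohens_d(60.0, 54.0, 20.0, 18.0), 3))  # -> 0.315
```

Note that the normalized gain is computed from class averages (as in the text), not averaged over per-student gains; the two conventions can differ slightly.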
Including both the algebra-based and calculus-based courses, 10 instructors provided the final exam scores for their classes. We used these data to obtain linear regression plots between the post-test and the final exam performance for each instructor and computed the correlation coefficient between the performance of students on the validated conceptual surveys and their performance on the final exam for different instructors. These correlation coefficients between the conceptual surveys and the final exam (with a strong focus on quantitative problem solving) can provide an indication of the strength of the correlation between conceptual and quantitative problem solving in introductory physics courses. Out of all the introductory physics courses (algebra-based or calculus-based physics I or II) included in this study, there were four EBAE courses: two completely flipped classes in algebra-based introductory physics I and one completely flipped and one interactive EBAE class in calculus-based introductory physics II.
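The per-instructor correlation analysis amounts to computing a Pearson correlation coefficient over paired (post-test, final exam) scores. A minimal sketch, using hypothetical scores of our own invention rather than data from the study:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient r between paired scores,
    e.g., conceptual post-test vs. final exam percentages."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical paired (post-test, final exam) scores for one class:
post_test = [45.0, 60.0, 72.0, 55.0, 80.0]
final_exam = [62.0, 70.0, 78.0, 64.0, 88.0]
print(round(pearson_r(post_test, final_exam), 2))  # -> 0.97
```

In practice one would use scipy.stats.pearsonr or numpy.corrcoef, which also handle the p-value and larger arrays; the hand-rolled version above just makes the definition explicit.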
3. Results
Table 1 shows the intra-group pre- and post-test data (pooled data for the same type of courses) on the FCI survey for the calculus-based and algebra-based physics I courses. For the algebra-based courses, some were EBAE courses while others were LB courses, whereas all the calculus-based courses were LB. We find statistically significant improvements from the pretest to the post-test for each group, but the normalized gain (Normg) is largest (30%) for the EBAE courses.

Table 2 shows the intra-group (pooled data for the same type of courses) pre- and post-test data on the CSEM survey for algebra-based and calculus-based introductory physics II courses. We find that there are statistically significant differences between the pre- and post-test scores for each group, but the normalized gain (Normg) is largest (36%) for the EBAE courses.

Table 3 shows the inter-group FCI pre- and post-test score comparison between algebra-based LB and EBAE courses, first holding the instructor fixed (the same instructor taught both the LB and EBAE courses and used the same homework and final exams) and, second, combining all instructors who used similar methods in the same group (only one instructor used EBAE methods, but several who taught LB courses were combined). Table 3 shows that there is no statistically significant difference between the pretest scores of students in the LB and EBAE courses in introductory physics I on the FCI. Table 3 also shows that the effect sizes for comparing FCI post-test performance of students in EBAE courses with students in LB courses are 0.314 (same instructor teaching both courses
Table 1. Intra-group FCI pre- and post-test averages (mean) and standard deviations (SD) for first-semester introductory physics in calculus-based LB courses, and algebra-based flipped and LB courses.

Type of class    FCI test  N    Mean  SD   p-value  Normg
Calculus LB      Pre       461  51%   21%  <0.001   25%
                 Post      350  63%   20%
Algebra flipped  Pre       299  35%   18%  <0.001   30%
                 Post      262  54%   20%
Algebra LB       Pre       837  35%   17%  <0.001   23%
                 Post      738  50%   19%
Note: The number of students in each group, N, is shown. For each group, a p-value obtained using a t-test shows that the difference between the pre- and post-tests is statistically significant, and the normalized gain (Normg) from pre- to post-test shows how much students learned that they did not already know based on the pretest.
in the same semester) and 0.233 when different courses using similar methods are combined (which are considered small effect sizes).

Table 4 shows the inter-group CSEM pre- and post-test score comparison between calculus-based LB and EBAE courses, first holding the instructor fixed (the same instructor taught both the LB and EBAE courses and used the same homework and final exams) and, second, combining all instructors who taught using similar methods in the same group. Table 4 shows that there is no statistically significant difference between the pretest scores of students in the LB and EBAE courses in introductory physics II on the CSEM. Table 4 also shows that the effect sizes for comparing CSEM post-test performance of students in EBAE courses with students in LB courses are 0.357 (same instructor teaching both courses) and 0.494 when different courses using similar methods are combined (which are considered medium effect sizes).

Table 5 shows the average FCI pre- and post-test scores for algebra-based and CSEM pre- and post-test scores for calculus-based courses (Av-pre and Av-post), gain (post − pre), normalized gain (Normg), and final exam scores (Av-fin) for students in the flipped and LB courses taught by the same instructor (with the same homework and final exam), with students divided into three groups based on their pretest scores.
A closer look at the gains and normalized gains for the courses taught by the same instructor shows that students in all three pretest score categories in the flipped courses had higher gains and normalized gains than those in the LB courses taught by the same instructor. Moreover, for algebra-based physics I, the average final exam scores of the students in the flipped course taught by the same instructor in all three pretest categories are somewhat higher than in the LB course.

Table 6 shows the average FCI pre- and post-test scores for algebra-based and calculus-based courses (Av-pre and Av-post), gain (post – pre), and normalized gain (Normg) for students in the flipped and LB courses with students divided into three groups based on their pretest scores. All equivalent (algebra-based or calculus-based physics I) courses that used the same instructional strategy (flipped or LB) were combined and students were divided into three groups based upon their pretest scores. A closer look at the gains and normalized gains for the algebra-based courses (for which there are both flipped and LB groups) shows that students in all three pretest score categories in the flipped courses had higher gains and normalized gains than those in the traditional courses. In the calculus-based LB courses, the highest third of the students had 83% and 82% as their FCI pretest and post-test scores, respectively. In Table 6, we do not list the average final exam performance because instructors used different exams that varied in difficulty.

Table 7 shows the average CSEM pre- and post-test scores for algebra-based and calculus-based courses (Av-pre and Av-post), gain (post – pre), and normalized gain (Normg) for students in the EBAE and LB courses with students divided into three groups based on their pretest scores. All equivalent (algebra-based or calculus-based

Table 2.
Intra-group CSEM pre- and post-test averages (mean) and standard deviations (SD) for second-semester introductory physics in calculus-based LB and EBAE courses (here, EBAE flipped and interactive non-flipped courses are combined) and algebra-based LB courses.

Type of class   CSEM test   N     Mean   SD    p-value   Normg
Calculus LB     Pre         410   38%    14%   <0.001    21%
                Post        346   51%    17%
Calculus EBAE   Pre         346   37%    16%   <0.001    36%
                Post        300   60%    19%
Algebra LB      Pre         514   24%    11%   <0.001    25%
                Post        449   43%    17%
Note: The total number of students in each group, N, is shown. For each group, a p-value obtained using a t-test shows that the difference between the pre- and post-tests is statistically significant, and the normalized gain (Normg) from pretest to post-test shows how much students learned that they did not already know based on the pretest.

Table 3.
Inter-group comparison of the average FCI pre- and post-test scores of algebra-based students in LB courses with EBAE courses when both courses are taught by the same instructor and when different instructors using similar instructional methods are combined.

FCI test   Group   N     Mean   SD    p-value   Effect size
Same instructor
Pre        LB      466   35%    17%   0.831     0.017
           EBAE    262   35%    18%
Post       LB      433   48%    20%   <0.001    0.314
           EBAE    262   54%    20%
Different instructors combined
Pre        LB      837   35%    17%   0.901     0.009
           EBAE    299   35%    18%
Post       LB      738   50%    19%   0.001     0.233
           EBAE    262   54%    20%
Note: The p-values and effect sizes are obtained when comparing the LB and EBAE courses in terms of students' FCI scores.
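The effect sizes in Table 3 are Cohen's d values: the difference of group means divided by a pooled standard deviation. As a sanity check on the rounded summary statistics in the table, a minimal sketch (the helper name `cohens_d` is ours, not from the paper; the rounded table values give ≈0.30, close to the reported 0.314 computed from unrounded means):

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    pooled = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Same-instructor FCI post-test rows from Table 3 (rounded percentages):
# EBAE: mean 54, SD 20, N 262; LB: mean 48, SD 20, N 433
d = cohens_d(54, 20, 262, 48, 20, 433)
print(round(d, 2))  # 0.3
```

By the usual conventions, d near 0.2 is a small effect and d near 0.5 a medium one, which is how the 0.233 and 0.314 values in Table 3 are characterized in the text.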
Table 4.
Inter-group comparison of the average CSEM pre- and post-test scores of calculus-based students in LB courses with EBAE courses when both courses are taught by the same instructor and when different instructors using similar instructional methods are combined.

CSEM test   Group   N     Mean   SD    p-value   Effect size
Same instructor
Pre         LB      178   40%    13%   0.895     0.013
            EBAE    208   40%    15%
Post        LB      154   48%    15%   0.001     0.357
            EBAE    181   54%    19%
Different instructors pooled
Pre         LB      410   38%    14%   0.886     0.011
            EBAE    346   37%    16%
Post        LB      346   51%    17%   <0.001    0.494
            EBAE    300   60%    19%
Note: The p-values and effect sizes are obtained when comparing the LB and EBAE courses in terms of students' CSEM scores.
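The p-values in Tables 3 and 4 come from t-tests on two independent groups. A hedged sketch of how such a statistic can be recovered from the published summary statistics, using Welch's unequal-variance form (the paper does not specify which t-test variant the authors used):

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic for two independent groups from summary statistics."""
    se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
    return (m1 - m2) / se

# Same-instructor CSEM post-test rows from Table 4 (rounded percentages):
# EBAE: mean 54, SD 19, N 181; LB: mean 48, SD 15, N 154
t = welch_t(54, 19, 181, 48, 15, 154)
print(round(t, 2))  # 3.23
```

With roughly 330 degrees of freedom, a t statistic of about 3.2 corresponds to a two-tailed p near 0.001, consistent with the value reported in Table 4.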
Table 5.
Average FCI pre- and post-test scores for algebra-based and CSEM pre- and post-test scores for calculus-based courses (Av-pre and Av-post), gain (post – pre), normalized gain (Normg), and final exam scores (Av-fin) for students in the flipped and LB courses taught by the same instructor (with the same homework and final exam), with students divided into three groups based on their pretest scores as shown.

Group     Pretest split   Av-pre   Av-post   Gain   Normg   Av-fin
FCI algebra (instructor 1)
LB        Bottom 1/3      18       36        18     22      48
          Middle 1/3      32       45        13     20      54
          Top 1/3         54       66        12     27      65
Flipped   Bottom 1/3      17       41        24     29      54
          Middle 1/3      32       49        17     25      54
          Top 1/3         56       74        18     40      65
CSEM calculus (instructor 2)
LB        Bottom 1/3      26       35        9      12      43
          Middle 1/3      39       46        8      12      53
          Top 1/3         53       60        6      14      59
Flipped   Bottom 1/3      25       42        18     24      51
          Middle 1/3      39       49        11     18      56
          Top 1/3         58       70        12     29      69
Note: Students in the LB or flipped courses with the FCI test for the algebra-based course can be compared with each other, and those with the CSEM test for the calculus-based course can be compared with each other.
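The normalized gain columns in Tables 5–7 follow Hake's definition: the raw gain divided by the maximum gain possible given the pretest score. A minimal sketch with scores as percentages:

```python
def normalized_gain(pre, post):
    """Hake's normalized gain: (post - pre) / (100 - pre), expressed in percent."""
    return 100 * (post - pre) / (100 - pre)

# Bottom-third students in the flipped algebra-based course (Table 5):
# pretest 17%, post-test 41%
print(round(normalized_gain(17, 41)))  # 29
```

The result matches the Normg column for that row: those students realized about 29% of the improvement that was available to them.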
Published by NRC Research Press.
physics II) courses that used the same instructional strategy (EBAE or LB) were combined and students were divided into three groups based upon their pretest scores. A closer look at the gains and normalized gains for the calculus-based courses (for which there are both EBAE and LB groups) shows that students in all three pretest score categories in the EBAE courses had higher gains and normalized gains than those in the traditional courses. We should note that differences in normalized gain should be interpreted carefully because we do not have a measure of the variability of normalized gain, and thus, differences of 5% may or may not be significant. We stress that we are not making statements about significant differences between EBAE courses and LB courses based on normalized gain; any statements we have made about significant differences are supported by statistical analyses (e.g., Cohen's d or comparison of post-test results).

Figure 1 shows the CSEM post-test performance along with the final exam performance for three different instructors in flipped and LB calculus-based courses (one instructor taught an EBAE and an LB course, two instructors taught LB courses). Figure 1 shows that the linear regressions [83] for the flipped and LB courses are fairly similar and that there is a moderate correlation between CSEM post-test scores and final exam scores. We also plotted linear regressions for the algebra-based courses, but the data look similar to Fig. 1 and so are not included here. Instead, we include the correlation coefficients (CSEM post-test versus final exam) for all the courses for which we obtained post-test data.
Table 8 summarizes the correlation coefficients between post-CSEM or post-FCI scores and final exam scores for each instructor who provided final exam data.
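The R values in Table 8 are ordinary Pearson correlation coefficients between paired lists of student scores. A self-contained sketch; the five score pairs below are fabricated for illustration and are not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between paired score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Fabricated CSEM post-test and final exam scores for five students
csem = [40, 55, 60, 70, 35]
final = [50, 60, 58, 75, 45]
print(round(pearson_r(csem, final), 2))
```

Values around 0.4–0.8, as in Table 8, indicate a moderate-to-strong positive association between conceptual-survey performance and quantitative final exam performance.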
4. Discussion and summary
In all cases investigated, we find that, on average, introductory physics students in the courses that made significant use of EBAE methods outperformed students in courses primarily taught using LB instruction on standardized conceptual surveys (FCI or CSEM) in the post-test, even though there was no statistically significant difference on the pretest. This was true both in the algebra-based and calculus-based physics I (primarily mechanics) and II (primarily E&M) courses. Also, the differences between EBAE and LB courses were observed both among students who performed well on the FCI or CSEM pretest (given in the first week of classes) and among those who performed poorly, thus indicating that EBAE instructional strategies help students at all levels.

On the other hand, the typical effect size for the differences between equivalent EBAE and traditional courses is between 0.23 and 0.49, which is considered small to medium. Thus, the benefits of these EBAE approaches were not as large as one may expect to observe. Why might that be the case, and how can instructors enhance student learning beyond that observed in this investigation using EBAE instructional strategies?

There are many potential challenges to using EBAE instructional strategies. We list some of the possible challenges and some strategies that may reduce them. Many of these have been described elsewhere [68, 84–96], so we provide only a short summary:

• Lack of student engagement even with well-designed learning tools, which may occur for many different reasons: lack of student motivation, poor self-efficacy or poor time-management skills on the part of the students, lack of effective incentives for students to engage with the self-paced learning tools, etc. Strategies to address some of these difficulties have been described (e.g., providing students with effective strategies to learn [97, 98], using certain communication activities to foster student motivation [84, 85]).
Other strategies to address these potential issues have also been described [87–89, 91–93].

• Lack of student engagement with in-class active learning activities (e.g., group problem solving). Many strategies to help address this issue have been described (e.g., designing in-class activities that foster both individual accountability and positive interdependence [11, 12]). One example of fostering individual accountability is to include a short quiz or clicker questions related to content students were supposed to learn when working in groups; positive interdependence means that the success of each student in a group depends on the success of the others. For other strategies to help foster student engagement, see refs. 84, 94, 95, and references therein.

• Student misconceptions about learning, or resistance to EBAE instructional strategies, which could be addressed at the beginning of the term by framing the instructional design of the class [68, 96] and providing data on the effectiveness of the evidence-based strategies being used (and, conversely, the ineffectiveness of, e.g., instructor explanations [99]).

• Large class sizes can be an impediment; one approach faculty have used in flipped courses is to split the class in two, thus forming smaller class sizes as well as more room for students to form groups and move around the classroom. Undergraduate or graduate teaching assistants can also help in facilitating in-class activities. In group activities, students often work at different rates, and students who finish early can help others.

• Content coverage. There is often a lot of content covered in introductory physics courses, and it may be challenging to cover the same amount of content while also including frequent active learning activities during class. Moving some of the content delivery outside of class (e.g., some pre-lecture reading or
Table 6.
Average FCI pre- and post-test scores (Av-pre and Av-post), gain (post – pre), and normalized gain (Normg) for students in the flipped and LB algebra-based and calculus-based courses.

Group     Pretest split   Av-pre   Av-post   Gain   Normg
FCI calculus
LB        Bottom 1/3      31       46        15     22
          Middle 1/3      55       68        13     28
          Top 1/3         83       82        −1     −7
FCI algebra
Flipped   Bottom 1/3      17       41        24     29
          Middle 1/3      32       49        17     25
          Top 1/3         56       74        18     40
LB        Bottom 1/3      19       35        16     19
          Middle 1/3      33       46        14     20
          Top 1/3         55       68        14     30
Note: All courses in the same group were combined, with students divided into three groups based upon their pretest scores as shown. Students in the LB or flipped algebra-based courses can be compared with each other.
Table 7.
Average CSEM pre- and post-test scores (Av-pre and Av-post), gain (post – pre), and normalized gain (Normg) for calculus-based students in the EBAE and LB courses and algebra-based students in LB courses.

Group   Pretest split   Av-pre   Av-post   Gain   Normg
Calculus CSEM
EBAE    Bottom 1/3      22       51        29     37
        Middle 1/3      35       57        22     34
        Top 1/3         56       70        15     33
LB      Bottom 1/3      23       39        16     20
        Middle 1/3      35       47        12     19
        Top 1/3         51       59        8      16
Algebra CSEM
LB      Bottom 1/3      15       36        22     25
        Middle 1/3      23       44        21     27
        Top 1/3         35       51        16     25
Note: All courses in the same group were combined, with students divided into three groups based upon their pretest scores as shown. Students in the LB or EBAE calculus-based courses can be compared with each other.
videos on certain "easier" concepts, or moving the entire content delivery outside of class as in a flipped course) can help provide additional time for in-class activities.

We note that the instructors who taught the EBAE courses had control of designing the courses themselves, and the researchers only provided them with guidance before and during the semester as the instructors designed the courses. The instructors may not have addressed some of the potential issues mentioned here sufficiently (e.g., by framing the courses at the beginning of the term and providing incentives for students to engage both with in-class and out-of-class activities). However, these issues are challenging to fully address, especially in large classes such as those involved in this investigation (as suggested by the data), and iterative refinement of a course is needed to address them. Lastly, while we provided the instructors with information about active learning materials developed by physics education researchers, discussions indicated that they adapted or created some of their own materials to fit the way they preferred to teach, and the extent to which the materials they adapted or created are conducive to effective learning is unclear.

In addition, Henderson and Dancy [100] found that many instructors try certain EBAE instructional strategies, but some discontinue use after one or two semesters.
The faculty members who persist are usually the ones who get support from their peers (e.g., developing faculty learning communities, working with instructional designers at local teaching and learning centers), because there may be many implementation difficulties specific to a particular university even if a particular EBAE approach has been found to be effective elsewhere. Interacting with others, even from different departments, who have been engaged in evidence-based teaching (e.g., visiting their classes, getting feedback from them about one's own classes, etc.) can be extremely valuable. Often, teaching and learning centers are happy to send someone to observe a class and provide feedback as well as suggestions for future active learning activities.

Furthermore, we note that in this study we found that student performance in a non-flipped EBAE course (which used active learning interspersed with short lectures in class) was comparable to student performance in a flipped course. It is important to point out that the instructor in this EBAE course ensured that the recitations were effectively used to promote active learning and that the activities used in the recitation were closely tied to the course learning goals. Flipping a course can be a time-consuming process, especially if the instructor is developing their own lecture videos for the first time and has not already implemented EBAE strategies in their class. Therefore, it is encouraging to observe that one does not need to flip a course completely, but can introduce EBAE activities in regular class and also in recitation. These active learning activities and materials can be modified and improved after each use by getting feedback from the

Fig. 1.
Linear regression of the CSEM post-test scores (conceptual) and final exam scores (heavy focus on quantitative problems) for four calculus-based introductory physics courses shows correlation coefficients between 0.438 and 0.598. There were no clear trends in the correlation coefficients based upon whether the instructor (Inst) used EBAE strategies or whether the class was LB. [Colour online.]
Table 8.
Correlation coefficients (R) between post-CSEM or post-FCI and final exam scores for each instructor who provided final exam data.

Instructor   Course type   R
Physics I (calculus)
1            LB            0.495
2            LB            0.589
3            LB            0.787
Physics I (algebra)
1            Flipped       0.559
1            LB            0.516
2            LB            0.693
Physics II (calculus)
1            EBAE          0.696
2            Flipped       0.488
2            LB            0.537
3            LB            0.483
Note: Final exam data were not provided by physics II instructors in algebra-based courses.
students (and also getting feedback from the TAs teaching recitation).

As discussed earlier, learning gains in EBAE courses were not as high as one might expect. This should not be taken as discouragement, but rather as an indication that effective teaching is an iterative pursuit: one should learn from each course implementation and try to improve. The expectation that introducing many EBAE instructional strategies that have been found to be effective elsewhere will result in large gains without refining the materials and implementation can deter instructors from continuing to use EBAE instructional strategies when the results are less than expected, especially given the time commitment reformed teaching can take initially. Instead, one should continue to make refinements and remember that any improvement in student learning is worth the effort!

In summary, to enhance student learning in EBAE classes it is important not only to develop effective EBAE learning tools and pedagogies commensurate with students' prior knowledge, but also to investigate how to implement them appropriately and how to motivate and incentivize their usage to get buy-in from students to engage with them as intended. Furthermore, for flipped classes, it is especially important to investigate strategies for having a diverse group of students engage effectively with self-paced learning tools. Investigation of various factors that can deter or incentivize their use is essential to develop a holistic learning environment that helps students with diverse backgrounds benefit from the self-paced learning tools.
Additionally, it will be valuable to examine and compare the effectiveness of self-paced learning tools (e.g., videos and concept questions provided to students in flipped classes) when implemented in a controlled environment in which students must effectively engage with the tool one-on-one in front of a researcher versus an environment in which students are free to use the tool in whatever manner they choose. A framework for understanding and optimizing the factors that can support or hinder effective use of self-paced learning tools (e.g., those students are asked to engage with in flipped courses) would be helpful in developing and implementing self-paced tools conducive to learning.

Acknowledgements
We are grateful to all the faculty and students who helped with the study. We thank the US National Science Foundation for award DUE-1524575.
References
1. A. Noack, T. Antimirova, and M. Milner-Bolotin. Can. J. Phys., 1269 (2009). doi:10.1139/P09-108.
2. H. Eshach. Can. J. Phys., 1 (2014). doi:10.1139/cjp-2013-0369.
3. E.M. Kennedy and J.R. de Bruyn. Can. J. Phys., 1155 (2011). doi:10.1139/p11-113.
4. A.C.K. Leung, A. Terrana, and S. Jerzak. Can. J. Phys., 913 (2016). doi:10.1139/cjp-2016-0358.
5. A. Terrana, A.C.K. Leung, and S. Jerzak. Can. J. Phys., 201 (2017). doi:10.1139/cjp-2016-0779.
6. H. Eshach and I. Kukliansky. Can. J. Phys., 1205 (2016). doi:10.1139/cjp-2016-0308.
7. L.C. McDermott. Am. J. Phys., 301 (1991). doi:10.1119/1.16539.
8. E. Kim and S. Pak. Am. J. Phys., 759 (2002). doi:10.1119/1.1484151.
9. J.M. Fraser, A.L. Timan, K. Miller, J.E. Dowd, L. Tucker, and E. Mazur. Rep. Prog. Phys., 032401 (2014). doi:10.1088/0034-4885/77/3/032401. PMID:24595011.
10. J.L. Docktor and J.P. Mestre. Phys. Rev. ST Phys. Educ. Res., 020119 (2014). doi:10.1103/PhysRevSTPER.10.020119.
11. P. Heller, R. Keith, and S. Anderson. Am. J. Phys., 627 (1992). doi:10.1119/1.17117.
12. P. Heller and M. Hollabaugh. Am. J. Phys., 637 (1992). doi:10.1119/1.17118.
13. L. Deslauriers, E. Schelew, and C. Wieman. Science, 862 (2011). doi:10.1126/science.1201783. PMID:21566198.
14. D. Hammer. Am. J. Phys., S52 (2000). doi:10.1119/1.19520.
15. E.F. Redish, J. Saul, and R.N. Steinberg. Am. J. Phys., 212 (1998). doi:10.1119/1.18847.
16. R.R. Hake. Am. J. Phys., 64 (1998). doi:10.1119/1.18809.
17. C. Singh. Phys. Teach., 568 (2014). doi:10.1119/1.4902211.
18. C. Singh. Am. J. Phys., 1103 (2002). doi:10.1119/1.1512659.
19. F. Reif. Am. J. Phys., 17 (1995). doi:10.1119/1.17764.
20. B. Eylon and F. Reif. Cognition Instruct., 5 (1984). doi:10.1207/s1532690xci0101_2.
21. S.Y. Lin and C. Singh. Phys. Rev. ST Phys. Educ. Res., 020105 (2015). doi:10.1103/PhysRevSTPER.11.020105.
22. S.Y. Lin and C. Singh. Phys. Rev. ST Phys. Educ. Res., 020114 (2013). doi:10.1103/PhysRevSTPER.9.020114.
23. S.Y. Lin and C. Singh. Phys. Rev. ST Phys. Educ. Res., 020104 (2011). doi:10.1103/PhysRevSTPER.7.020104.
24. A.J. Mason and C. Singh. Phys. Rev. ST Phys. Educ. Res., 020110 (2011). doi:10.1103/PhysRevSTPER.7.020110.
25. C. Singh. Am. J. Phys., 73 (2009). doi:10.1119/1.2990668; S.Y. Lin and C. Singh. Eur. J. Phys., 57 (2010). doi:10.1088/0143-0807/31/1/006.
26. C. Singh. Phys. Rev. ST Phys. Educ. Res., 010104 (2008). doi:10.1103/PhysRevSTPER.4.010104.
27. C. Singh. Phys. Rev. ST Phys. Educ. Res., 010105 (2008). doi:10.1103/PhysRevSTPER.4.010105.
28. A. Collins, J. Brown, and S. Newman. In Knowing, learning, and instruction: Essays in honor of Robert Glaser. Edited by L.B. Resnick. Lawrence Erlbaum, Hillsdale, N.J. 1989. p. 453.
29. C.H. Crouch and E. Mazur. Am. J. Phys., 970 (2001). doi:10.1119/1.1374249.
30. N. Lasry, E. Mazur, and J. Watkins. Am. J. Phys., 1066 (2008). doi:10.1119/1.2978182.
31. A. Mason and C. Singh. Am. J. Phys., 748 (2010). doi:10.1119/1.3319652.
32. A. Mason and C. Singh. Phys. Teach., 295 (2016). doi:10.1119/1.4947159.
33. L. McDermott, P. Shaffer, and the Physics Education Group at the University of Washington. Tutorials in introductory physics. Pearson Publishing, Inc. 2003.
34. C. Singh. Am. J. Phys., 400 (2008). doi:10.1119/1.2837812.
35. E. Marshman and C. Singh. Eur. J. Phys., 024001 (2016). doi:10.1088/0143-0807/37/2/024001.
36. C.S. Kalman, M. Milner-Bolotin, and T. Antimirova. Can. J. Phys., 325 (2010). doi:10.1139/P10-024.
37. C. Singh. Am. J. Phys., 446 (2005). doi:10.1119/1.1858450.
38. C. Singh. In Proceedings of the 2002 Phys. Educ. Res. Conf., Boise. 2002. p. 67. doi:10.1119/perc.2002.pr.017.
39. R. Sayer, E. Marshman, and C. Singh. In Proceedings of the 2016 Phys. Educ. Res. Conf., Sacramento, Calif. 2016. p. 304. doi:10.1119/perc.2016.pr.071.
40. P. Black and D. Wiliam. Assessment in Education, 7 (1998). doi:10.1080/0969595980050102.
41. R.F. Moll and M. Milner-Bolotin. Can. J. Phys., 917 (2009). doi:10.1139/P09-048.
42. G. Novak, E.T. Patterson, A. Gavrin, and W. Christian. Just-in-time teaching: Blending active learning with web technology. Prentice Hall, Upper Saddle River, N.J. 1999.
43. R. Sayer, E. Marshman, and C. Singh. Phys. Rev. Phys. Educ. Res., 020133 (2016). doi:10.1103/PhysRevPhysEducRes.12.020133.
44. C.J. Brame. Flipping the classroom. Vanderbilt University Center for Teaching. Available from https://cft.vanderbilt.edu/guides-sub-pages/flipping-the-classroom/.
45. Perusall. Available from https://perusall.com/.
46. D.L. Schwartz and J.D. Bransford. Cognition Instruct., 475 (1998). doi:10.1207/s1532690xci1604_4.
47. R. Mayer. In Multimedia Learning. Cambridge Press. 2001.
48. Z. Chen, T. Stelzer, and G. Gladding. Phys. Rev. ST Phys. Educ. Res., 010108 (2010). doi:10.1103/PhysRevSTPER.6.010108.
49. Z. Chen and G. Gladding. Phys. Rev. ST Phys. Educ. Res., 010111 (2014). doi:10.1103/PhysRevSTPER.10.010111.
50. M.C. Kim and M.J. Hannafin. Comput. Educ., 403 (2011). doi:10.1016/j.compedu.2010.08.024.
51. F. Reif and L.A. Scott. Am. J. Phys., 819 (1999). doi:10.1119/1.19130.
52. N. Schroeder, G. Gladding, B. Gutmann, and T. Stelzer. Phys. Rev. ST Phys. Educ. Res., 010103 (2015). doi:10.1103/PhysRevSTPER.11.010103.
53. R. Azevedo. Educ. Psychol., 193 (2005). doi:10.1207/s15326985ep4004_1.
54. G. Gladding, B. Gutmann, N. Schroeder, and T. Stelzer. Phys. Rev. ST Phys. Educ. Res., 010114 (2015). doi:10.1103/PhysRevSTPER.11.010114.
55. M. Bower, B. Dalgarno, G.E. Kennedy, M.J.W. Lee, and J. Kenney. Comput. Educ., 1 (2015). doi:10.1016/j.compedu.2015.03.006.
56. V. Chandra and J.J. Watters. Comput. Educ., 631 (2012). doi:10.1016/j.compedu.2011.09.010.
57. C.-C. Kulik, J.A. Kulik, and R.L. Bangert-Drowns. Rev. Educ. Res., 265 (1990). doi:10.3102/00346543060002265.
58. C.C. Kulik and J.A. Kulik. Comput. Hum. Behav., 75 (1991). doi:10.1016/0747-5632(91)90030-5.
59. J. Kulik. In Technology assessment in education and training. Edited by E. Baker and H. O'Neil, Jr. Routledge, New York. 1994. p. 9.
60. R. Azevedo, J.T. Guthrie, and D. Seibert. J. Educ. Comput. Res., 87 (2004). doi:10.2190/DVWX-GM1T-6THQ-5WC7.
61. D.C. Moos and R. Azevedo. Instr. Sci., 203 (2008). doi:10.1007/s11251-007-9028-3.
62. J.A. Greene, L. Costa, J. Robertson, Y. Pan, and V.M. Deekens. Comput. Educ., 1027 (2010). doi:10.1016/j.compedu.2010.04.013.
63. C.-M. Chen and C.-H. Wu. Comput. Educ., 108 (2015). doi:10.1016/j.compedu.2014.08.015.
64. C. Singh. In Proceedings of the 2003 Phys. Educ. Res. Conf., Madison, Wis. AIP Publishing, Melville, N.Y. 2003. p. 177. doi:10.1063/1.1807283.
65. C. Singh and D. Haileselassie. J. Coll. Sci. Teaching, 1213 (2011). doi:10.1126/science.1204820. PMID:21636776.
69. L. Breslow, D. Pritchard, J. DeBoer, G. Stump, A. Ho, and D. Seaton. Res. Pract. Assess., 114 (2015). doi:10.1119/1.4905816.
71. D. Hestenes, M. Wells, and G. Swackhamer. Phys. Teach., 141 (1992). doi:10.1119/1.2343497.
72. D.P. Maloney, T.L. O'Kuma, C.J. Hieggelke, and A. Van Heuvelen. Am. J. Phys., S12 (2001). doi:10.1119/1.1371296.
73. C. Singh and D. Rosengrant. Am. J. Phys., 607 (2003). doi:10.1119/1.1571832.
74. L.G. Rimoldini and C. Singh. Phys. Rev. ST Phys. Educ. Res., 010102 (2005). doi:10.1103/PhysRevSTPER.1.010102.
75. L. Ding, R. Chabay, B. Sherwood, and R. Beichner. Phys. Rev. ST Phys. Educ. Res., 010105 (2006). doi:10.1103/PhysRevSTPER.2.010105.
76. C. Singh and D. Rosengrant. Students' conceptual knowledge of energy and momentum. In Proceedings of the 2001 Phys. Educ. Res. Conf., Rochester. 2001. p. 123. doi:10.1119/perc.2001.pr.018.
77. J. Li and C. Singh. Eur. J. Phys., 025702 (2017). doi:10.1088/1361-6404/38/2/025702.
78. C. Singh. Am. J. Phys., 923 (2006). doi:10.1119/1.2238883.
79. A. Schoenfeld. In Handbook for research on mathematics teaching and learning. McMillan, New York. 1992.
80. J.I. Heller and F. Reif. Cognition Instruct., 177 (1984). doi:10.1207/s1532690xci0102_2.
81. H. Ginsberg and S. Opper. Piaget's theory of intellectual development. Prentice Hall, Englewood Cliffs. 1969.
82. G.J. Posner, K.A. Strike, P.W. Hewson, and W.A. Gertzog. Sci. Educ., 211 (1982). doi:10.1002/sce.3730660207.
83. G. Glass and K. Hopkins. Statistical methods in education and psychology. 3rd ed. Pearson. 1996.
84. J. Bergmann and A. Sams. Flip your classroom: Reach every student in every class every day. International Society for Technology in Education, Eugene, Ore. 2012.
85. J. Kerssen-Griep. Com. Educ., 256 (2001). doi:10.1080/03634520109379252.
86. S.B. Seidel and K.D. Tanner. CBE-Life Sciences Education, 586 (2013). doi:10.1187/cbe-13-09-0190. PMID:24297286.
87. K.D. Tanner. CBE-Life Sciences Education, 322 (2013). doi:10.1187/cbe.13-06-0115. PMID:24006379.
88. R. Felder and R. Brent. Teaching and learning STEM: A practical guide. Jossey-Bass, San Francisco, Calif. 2016.
89. M. Boekaerts. In The nature of learning: Using research to inspire practice. pp. 91–111. 2010. doi:10.1787/9789264086487-6-en.
90. D.E. Ellis. Students' responses to innovative instructional methods: exploring learning-centred methods and barriers to change. UWSpace. 2013.
91. P.R. Pintrich. J. Educ. Psych., 59 (2004). doi:10.3102/00346543074001059.
95. J.D. Klein and H.L. Schnackenberg. Contemp. Educ. Psychol., 332 (2000). doi:10.1006/ceps.1999.1013. PMID:10873376.
96. G.A. Smith. National Teaching and Learning Forum, 1 (2008). doi:10.1002/ntlf.10101.
97. B. Oakley. A mind for numbers: How to excel in math and science (even if you flunked algebra). Penguin. 2014.
98. T. Frank. Study less study smart: A 6-minute summary of Marty Lobdell's lecture - College Info Geek. 2015. Available from https://youtu.be/23Xqu0jXlfs.
99. Z. Hrepic, D. Zollman, and S. Rebello. Students' understanding and perceptions of the content of a lecture. AIP Conf. Proc., 189 (2004). doi:10.1063/1.1807286.
100. C. Henderson, M. Dancy, and M. Niewiadomska-Bugaj. Phys. Rev. ST Phys. Educ. Res., 020104 (2012). doi:10.1103/PhysRevSTPER.8.020104.