The role of pedagogical tools in active learning: a case for sense-making
Milo Koretsky, Jessie Keeler, John Ivanovitch (deceased), and Ying Cao

Abstract
Background:
Evidence from the research literature indicates that both audience response systems (ARS) and guided inquiry worksheets (GIW) can lead to greater student engagement, learning, and equity in the STEM classroom. We compare the use of these two tools in large-enrollment STEM courses delivered in different contexts, one in biology and one in engineering. Typically, the research literature contains studies that compare student performance for a group where the given active learning tool is used to a control group where it is not used. While such studies are valuable, they do not necessarily provide thick descriptions that allow instructors to understand how to effectively use the tool in their instructional practice. Investigations of the intended student thinking processes using these tools are largely missing. In the present article, we fill this gap by foregrounding the intended student thinking and sense-making processes of such active learning tools by comparing their enactment in two large-enrollment courses in different contexts.
Results:
The instructors studied utilized each of the active learning tools differently. In the biology course, ARS questions were used mainly to “check in” with students and assess if they were correctly interpreting and understanding worksheet questions. The engineering course presented ARS questions that afforded students the opportunity to apply learned concepts to new scenarios towards improving students’ conceptual understanding. In the biology course, the GIWs were primarily used in stand-alone activities, and most of the information necessary for students to answer the questions was contained within the worksheet in a context that aligned with a disciplinary model. In the engineering course, the instructor intended for students to reference their lecture notes and rely on their conceptual knowledge of fundamental principles from the previous ARS class session in order to successfully answer the GIW questions. However, while their specific implementation structures and practices differed, both instructors used these tools to build towards the same basic disciplinary thinking and sense-making processes of conceptual reasoning, quantitative reasoning, and metacognitive thinking.

Conclusions:
This study led to four specific recommendations for post-secondary instructors seeking to integrate active learning tools into STEM courses.
Keywords:
Active learning, Audience response systems, Guided inquiry, Reasoning, Sense-making
Introduction
Our program recently interviewed faculty candidates for an open position. During the interview process, each candidate was asked to conduct a 20-min teaching demonstration. One candidate, a tenured associate professor from a large, public research university, had regularly taught core courses. He enthusiastically stated that he had incorporated active learning into his courses and asked to use clickers as part of the demonstration. In the first 15 min, the candidate delivered a transmission-oriented PowerPoint presentation on heat transfer. This lecture portion was followed with a multiple-choice clicker question. In the question, the instructor provided a short word problem related to the material and an equation that he had just presented. He asked the audience to select which variable in the equation was the unknown among a list of variables that appeared in the equation.
All the information needed to answer the question was provided in the question stem, and it could clearly be answered simply by variable identification, independently of understanding the material that had been presented earlier. More insidiously, this question reinforced an undesirable schooling practice of many students: searching a source to find an appropriate equation and variable matching. When asked his objective for incorporating clickers into his course, the candidate stated, “I just want to make sure my students are awake.”

Motivated by the overwhelming evidence that demonstrates the effectiveness of active learning over traditional lecture in science, technology, engineering, and mathematics (STEM) courses (e.g., Freeman et al. 2014; Hake 1998; Prince 2004), many instructors are seeking to transform their classroom practice to incorporate active learning (Borrego et al. 2010; Felder and Brent 2010). However, as illustrated by the vignette above, these instructional practices can be taken up in a range of ways, and the instructor’s conception of learning is critical. We believe that the faculty member above chose to employ clicker technology in a way that made sense to him, and that less productive enactments of active learning can be logical interpretations of research studies that predominantly focus on the effectiveness of a practice relative to its absence. In this qualitative, comparative case study, we investigate the ways experienced instructors choreograph such activity in their courses to produce learning, and we thereby seek to provide a complementary lens for instructors to productively implement active learning in their courses.

We call the clicker applied in the vignette above an active learning tool. Tools are used in instruction to place students in an environment where they interact in intentional ways with the content, with other students, and with the instructor to promote learning. Tools can be technology-oriented, like the clicker technology above, or pedagogically oriented, like the guided inquiry worksheets we describe below, and often combine aspects of both orientations. Researchers who study the efficacy of these tools typically compare student performance for a group where the given tool is used to a control group where it is not used. Such research focuses on the tool’s effect (what learning gains does it produce?) and how to use the tool (what do instructors need to learn about to use it?). In many cases, incorporation of tools provides evidence of increased learning outcomes. However, this avenue of research can implicitly lead to undesired translations to practice based solely on considerations of procedure about how to use the tool, as illustrated in the vignette above. Investigations of the intended student thinking processes (not performance gains) using these tools are largely missing. In the present article, we fill this gap by foregrounding the intended thinking and sense-making processes of such technological and pedagogical tools.

We compare the use of active learning tools in two STEM courses delivered in different contexts (one in biology and the other in engineering). Both courses use the same two tools: audience response systems (ARS) and guided inquiry worksheets (GIW). Both are taught by instructors experienced with active learning pedagogies and recognized as high-quality and innovative instructors by their peers and students.
We are interested in how implementation of these tools varied between courses and in identifying threads common to both. We focus on the intended student sense-making and thinking processes as the instructors integrate the tools into their courses. By sense-making, we follow Campbell et al. (2016) to mean that learners are “working on and with ideas—both students’ ideas (including experiences, language, and ways of knowing) and authoritative ideas in texts and other materials—in ways that help generate meaningful connections” (p. 19). Our goal is not to compare learning gains in these two courses in order to claim one instructor’s implementation strategy works better than the other. Rather, through analysis of the similarities and differences in the course design and practices, we seek to provide a lens into how active learning comes to “life,” and to provide instructors productive ways to think about how they can best integrate active learning into their classroom learning environment.

We ask the following research questions. In two large-enrollment undergraduate STEM courses in different disciplinary contexts:
1. What types of thinking and sense-making processes do instructors intend to elicit from students during their use of ARS questions? During their use of GIW questions? What are the similarities and differences between the intended uses of these tools in the two courses studied?

2. In what ways do the intended sense-making processes that are elicited through the use of the ARS and GIW tools align with the instructors’ broader perspectives and beliefs about the instructional system for their courses?

Background
To situate this study, we first provide an overview of the research on ARS and GIW tools. We then describe the thinking and sense-making processes on which we will focus to understand the ways that the instructors in this study use the tools in concert and how they integrate them to achieve outcomes of instruction.
Audience response systems as tools
ARS, such as clickers, have been used increasingly in post-secondary STEM classrooms to allow instructors to shift large classes from a transmission-centered lecture mode into active learning environments (Freeman et al. 2014; Hake 1998; Prince 2004). Typically, the instructor provides the class a multiple-choice conceptual question, and each student in the class responds by selecting an answer on a device. In some cases, students are also asked to provide written explanations justifying their answer selection (Koretsky et al. 2016). Aggregate responses are available for the instructor to display to the class in real time. Often, students are asked to discuss answers in small groups, in a whole-class discussion, or both (Nicol and Boyle 2003).

ARS tools elicit live, anonymous responses from each individual student, allowing students in the class to answer new questions in a safe manner free from the judgment of peers and the instructor (Lantz 2010). In addition, real-time availability of the answer distribution can provide immediate and frequent feedback and allows for adaptable instruction. Based on student responses, instructors can modify class discussion and activity to meet learning needs that are more representative of the entire class rather than just a few vocal students. However, instructors also have concerns about incorporating ARS in classes, including fear about covering less content, less control in the student-centered classroom, and the time and effort needed to learn the technology and develop good questions (Caldwell 2007; Duncan 2005; MacArthur and Jones 2008).

The research literature on ARS use has focused broadly on both student engagement and student learning. Synthesis of individual research studies has shifted from more descriptive review papers (Caldwell 2007; Duncan 2005; Fies and Marshall 2006; Kay and LeSage 2009; MacArthur and Jones 2008) to more systematic meta-analyses (Castillo-Manzano et al. 2016; Chien et al. 2016; Hunsu et al. 2016; Nelson et al. 2012) that use common metrics and statistical methods to relate the characteristics and findings of a set of studies selected from explicit criteria (Glass et al. 1981). In general, researchers report that ARS tools promote student engagement through improved attendance, higher engagement in class, and greater interest and self-efficacy (Caldwell 2007; Kay and LeSage 2009; Hunsu et al. 2016) and also suggest that anonymity increases engagement (Boscardin and Penuel 2012; Lantz 2010).

Research on student learning with ARS tools often takes an interventionist approach, comparing classes or sections where instructors use the ARS to those that only lecture (Chien et al. 2016; Castillo-Manzano et al. 2016) or, occasionally, contrasting ARS technology with the same in-class questions delivered without using ARS technology, such as by raising hands, response cards, or paddles (Chien et al. 2016; Elicker and McConnell 2011; Mayer et al. 2009). Learning gains are often measured with instructor-developed assessments, such as in-class exams (Caldwell 2007), but more robust psychometric instruments such as concept inventories have also been used (Hake 1998). Results generally, but not always, show improved outcomes (Hunsu et al. 2016; Boscardin and Penuel 2012). These reports also acknowledge that the relationship between ARS use and learning is complex (Castillo-Manzano et al. 2016). Many factors have been suggested to influence it, such as the depth of the instructor’s learning objectives (Hunsu et al. 2016), testing effects (Chien et al. 2016; Lantz 2010; Mayer et al. 2009), the extent of cognitive processing (Beatty et al. 2006; Blasco-Arcas et al. 2013; Mayer et al. 2009; Lantz 2010), and social interactions (Blasco-Arcas et al. 2013; Chien et al. 2016; Penuel et al. 2006).

In summary, there is a large and growing body of literature that has examined the use of ARS tools in STEM courses. These studies suggest that they are effective in eliciting student engagement and learning, especially in large classes.

Guided inquiry worksheets as tools
GIW are material tools that guide students through inquiry learning during class. In general, inquiry learning seeks to go beyond content coverage and engage students in the practices of doing science or engineering, e.g., investigating a situation, constructing and revising a model, iteratively solving a problem, or evaluating a solution (National Research Council 1996; de Jong and Van Joolingen 1998). However, inquiry can be challenging for students since it requires a set of science process skills (e.g., posing questions, planning investigations, analyzing and interpreting data, providing explanations, and making predictions) in addition to content knowledge (National Research Council 2011; Zacharia et al. 2015). In guided inquiry, instructional scaffolds provide support to help students effectively engage in scientific practices around inquiry (Keselman 2003; de Jong 2006). Several pedagogies embody inquiry learning, ranging from less guided approaches like problem-based learning (PBL) to more guided approaches like process-oriented guided inquiry learning (POGIL, Eberlein et al. 2008).

Guided inquiry learning activities are pedagogically grounded and guide students through specific preconceived phases of inquiry (Pedaste et al. 2015). For example, both POGIL (Bailey et al. 2012) and peer-led team learning (PLTL, Lewis and Lewis 2005, 2008; Lewis 2011) are designed to guide students through a three-phase learning cycle (Abraham and Renner 1986): (i) the exploration phase, where students search for patterns
and meaning in data/models; (ii) the invention phase, to align thinking around an integrating concept; and (iii) the application phase, to extend the concept to new situations. Similarly, pedagogically grounded inquiry-based learning activities (IBLAs, Laws et al. 1999; Prince et al. 2016) contain three phases intended to produce a cognitive conflict that elicits students to confront core conceptual ideas: (i) the prediction phase, where students make predictions about counter-intuitive situations; (ii) the observation phase, where they observe an experiment or conduct a simulation; and (iii) the reflection phase, which consists of writing about the differences and connecting to theory.

GIW are commonly used as tools to provide carefully crafted key questions that guide students through the conceived phases of inquiry during class (Douglas and Chiu 2009; Eberlein et al. 2008; Farrell et al. 1999; Lewis and Lewis 2008). Lewis (2011) describes GIW as typically containing from six to 12 questions that vary between a conceptual and procedural nature. Questions often progress in complexity (Bailey et al. 2012; Hanson and Wolfskill 2000). First, they might ask students to explore a concept, thereby activating their prior knowledge. Then, they ask students to interact with models and develop relationships, and finally they elicit students to apply the learned concepts to new situations, thereby generalizing their knowledge and understanding. When inquiry is centered on observations of a phenomenon, GIW provide a tool for students to write down both their initial predictions and their observations, thereby producing a written record that they must reconcile (Prince et al. 2016).

Similar to findings on ARS tools, the research literature indicates that guided inquiry pedagogies promote engagement (Abraham and Renner 1986; Bailey et al. 2012), learning (Abraham and Renner 1986; Lewis 2011; Prince et al. 2016; Wilson et al. 2010), and equity (Lewis and Lewis 2008; Lewis 2011; Wilson et al. 2010) in the STEM classroom.
Thinking and sense-making processes
Our study situates the intersection of pedagogical strategies and content delivery in the intended thinking and sense-making processes of students as they engage in active learning tasks. We take a constructivist perspective of learning (National Research Council 2000; Wheatley 1991) that views new knowledge as resulting from students’ restructuring of existing knowledge in response to new experiences and active sense-making. This restructuring process is effectively carried out through interactions with other students in groups (Chi and Wylie 2014; Cobb 1994). From this perspective, a key aspect of instruction then becomes to create and orchestrate these experiences.

Educators can design and implement learning activities in ways that cultivate productive thinking and sense-making processes while delivering course content. As emphasized in STEM 2026, a vision for innovation in STEM education, “[a]lthough the correct or well-reasoned answer or solution remains important, STEM 2026 envisions focus on the process of getting to the answer, as this is critical for developing and measuring student understanding” (Tanenbaum 2016, p. 33).

Conceptual reasoning
In this study, conceptual reasoning refers to the reasoning processes where individuals and groups draw on foundational disciplinary concepts and apply them in new situations (National Research Council 2000, 2013). Elements of conceptual reasoning include (but are not limited to) identifying appropriate concepts when analyzing a new problem or situation, understanding those concepts and their relationship to the context, and applying the concepts to solve problems or explain phenomena (Russ and Odden 2017; Zimmerman 2000). Facility with concepts and principles has been identified as a feature of thinking that distinguishes disciplinary experts from novices (National Research Council 2000).

Researchers have suggested several changes from traditional instructional design that better align with developing students’ conceptual reasoning (e.g., Chari et al. 2017; National Research Council 2000, 2013). First, instruction should shift to more in-depth analysis of fewer topics, allowing focus on and articulation of key, cross-cutting concepts (National Research Council 2013). In doing so, the curriculum must provide a sufficient number of cases to allow students to work with the key concepts in several varied contexts within a discipline (National Research Council 2000). Second, classroom activities should provide students with opportunities to practice conceptual reasoning on a regular basis. Instructors can prompt this practice by asking students questions that require conceptual reasoning. They should also hold students accountable for such reasoning by participating in discussion, modeling thinking, and steering students away from rote procedural operations towards conceptual reasoning (Chari et al. 2017).

Quantitative reasoning
Quantitative reasoning addresses the analysis and interpretation of numerical data and the application of quantitative tools to solve problems (Grawe 2016), as well as mathematical sense-making: the process of seeking coherence between the structure of the mathematical formalism and the relations in the real world (Dreyfus et al. 2017; Kuo et al. 2013). Quantitative reasoning has been recognized as a key learning outcome for twenty-first century college graduates (Association of American Colleges and Universities 2005).
Quantitative reasoning at the college level includes processes such as translating between verbal, graphical, numeric, and symbolic representations; interpreting measured data or mathematical models; and using mathematical methods to numerically solve problems (Engelbrecht et al. 2012; Mathematical Association of America 1994; Zimmerman 2000). The word “reasoning” suggests the synthesis of quantitative concepts into a greater whole (Mathematical Association of America 1994) and emphasizes the process of performing the analysis instead of merely the product that results from it. In the context of upper-division college courses, quantitative reasoning tends to be even more sophisticated, where “the connections between formalism, intuitive conceptual schema, and the physical world become more compound (nested) and indirect” (Dreyfus et al. 2017, p. 020141-1).

Quantitative reasoning reflects the incorporation of mathematical knowledge and skills into disciplinary contexts (Mathematical Association of America 1994). In science and engineering, quantitative reasoning can include making sense of measured data and connecting it to physical phenomena (Bogen and Woodward 1988) or developing mathematical models that predict and generalize (Lehrer 2009; Lesh and Doerr 2003). Thus, the use of mathematics extends beyond the procedures and algorithms that students sometimes take as synonymous with the field. Researchers claim that mathematical sense-making is possible and productive for learning and problem solving in university science and engineering courses (e.g., Dreyfus et al. 2017; Engelbrecht et al. 2012).

In addition, conceptual reasoning and quantitative reasoning are intertwined in disciplinary practice and should be cultivated in tandem. Researchers have identified several ways that conceptual reasoning aids science and engineering problem solving, including conceptualization for finding the equations to mathematically solve the problem, checking and interpreting the result after the equation is solved, and the processes of working through to the solution (Kuo et al. 2013). Zimmerman (2000) points out that domain-specific concepts, i.e., “thinking within the discipline,” and domain-general quantification skills (e.g., evaluating experimental evidence) “bootstrap” each other and, when conducted together, lead to deeper understanding and richer disciplinary knowledge and skills.

In a culture that often focuses on and rewards procedural proficiency, it can be challenging to engage students in quantitative reasoning (Engelbrecht et al. 2012). Active learning strategies can help (Grawe 2016). Strategies include emphasizing accuracy relative to precision, asking students to create visual representations of data or translate between representations, asking students to communicate about their quantitative work, and setting assignments in an explicit, real-world context (Bean 2016; Grawe 2016; MacKay 2016).

Metacognitive thinking
Metacognition often refers to “thinking about thinking” or “second-order thinking”: the action and ability to reflect on one’s thinking (Schoenfeld 1987). Research evidence suggests that metacognition develops gradually and is as dependent on knowledge as on experience (National Research Council 2000). Ford and Yore (2012) argued that critical thinking, metacognition, and reflection converge into metacognitive thinking and can improve the overall level of one’s thinking.

Vos and De Graaff (2004) claimed that active learning tasks, such as working on projects in engineering courses, do not just require metacognitive knowledge and skills but also encourage the development of the learners’ metacognitive thinking. Based on several decades of research literature, Lin (2001) concluded that there are two basic approaches to developing students’ metacognitive skills: training in strategy and designing supportive learning environments.

Veenman (2012) pointed out three principles for the successful instruction of metacognitive thinking: (1) metacognitive instruction should be embedded in the context of the task; (2) learners should be informed about the benefit of applying metacognitive skills; and (3) instruction and training should be repeated over time rather than being a one-time directive. When designing STEM curriculum in an integrated way, a central issue is determining the aspects of metacognition and the context in which those aspects should be taught (Dori et al. 2017).

Methods
To answer our research questions, we use a comparative case study of two STEM courses implementing both an ARS and GIW. Data for this study were collected within a larger institutional change initiative whose goal is to improve instruction of large-enrollment STEM courses across disciplinary departments through implementation of evidence-based instructional practices (Koretsky et al. 2015). Through multiple data sources, we seek to provide a thick description (Geertz 1994) of how and why instructors use these active learning tools in large classes.
Case selection
We selected courses based on the regular use of ARS and GIW tools as part of classroom instruction. In addition, we sought courses in different disciplinary contexts and department cultures, since the instructors would more likely show variation in tool use. Based on these criteria, we identified courses in biology, in engineering, and in a third STEM discipline. Based on the
instructors’ willingness to cooperate, we ended up investigating the biology and the engineering courses in this study. Both are taught at the same public university, have large student enrollments, and are required courses for students majoring in the respective disciplines. Both instructors had experience using these tools for several terms prior to our investigation and were identified by peers and students as excellent educators.

The biology course, Advanced Anatomy and Physiology, is the third course in an upper-division sequence required for biology majors. Prerequisites for the course included the completion of the preceding courses in the year-long sequence and concurrent or previous enrollment in the affiliated lab section. The enrollment was 162 students in the term studied. The engineering course,
Material Balances, is the first course of a required three-course sequence for sophomores majoring in chemical, biological, and environmental engineering. The enrollment was 307 students in the term studied.
Data collection and analysis
Data sources include a series of interviews with each instructor, classroom observations and instructional artifacts, and student response records to ARS questions.
Instructor interview protocol
We conducted four semi-structured interviews with each instructor over the span of three academic years. The first (year 1) and fourth (year 3) interviews were reflective interviews that probed the instructors’ general teaching practices and instructional beliefs. They included questions about department and college duties, interactions with other faculty regarding teaching and learning, perceptions about successful students, and responses to the larger change initiative. The second and third interviews (year 2) focused specifically on the ARS and GIW questions, respectively, applied to a specific delivery of the instructor’s course, and are described in more detail below. All interviews were audio-recorded.

The interviews on the ARS and GIW questions sought to elicit the instructors’ understandings of the questions they assigned and their rationale for assigning them, the reasons and purposes for using each tool (ARS or GIW), and how they used these tools in the greater context of their courses. To investigate the intended types of student thinking processes for each active learning tool, we asked the instructors to write out their solutions to the questions following a think-aloud protocol (Ericsson 2006). For these interviews, each instructor was interviewed by a researcher who had deep domain knowledge in the course content under study. The researcher provided the instructors with hard copies of selected ARS questions (Interview 2) and GIW questions (Interview 3) and asked them to write their responses on them, which were then collected for analysis. To gain insight into the instructors’ perceptions of the questions when they positioned themselves as learners, we began each interview with the following prompt: “I want you to imagine you’re looking at these questions from a student’s perspective, and I want you to talk through how you would answer each one.” The think-aloud portion was followed by a reflective portion. After the instructors worked through all the questions, they were directed through the question set a second time with the prompt: “What was your rationale when assigning this question?”

Selection of ARS questions for think-aloud interview
For the interview with the biology instructor, we decided it was feasible, given the brevity of the biology ARS questions, to have the instructor work through all 31 ARS questions delivered in the Friday POGIL sessions in an hour-long interview. The engineering ARS questions required more time than the biology questions, and we decided it was not feasible to expect the instructor to work through 31 ARS questions in an hour-long interview. We therefore chose a subset of diverse questions that were representative of the whole set. Criteria for choosing questions included the difficulty of the question (determined by the percent correct scores from the students), the topic of each question (based on the topics outlined in the course syllabus), question clarity, and how the percent correct changed when peer instruction was involved. Following these criteria, we selected 15 of the 31 questions to present to the engineering instructor.
Selection of GIW questions for think-aloud interview
During interviews, we asked the instructors to engage with GIW questions from selected weekly worksheets. We selected a single guided inquiry worksheet for the think-aloud interview with the biology instructor. The worksheet focused on parameters of vascular function. We chose this worksheet based on the instructor’s input that it was representative of the type of worksheets students would encounter during a guided inquiry session for her course. For the engineering course, the instructor expressed two approaches to worksheet development: one more conceptually based and one more computational. We chose two worksheets, one for each approach. The first worksheet was selected because it was the first worksheet that applied the concept of a material balance (conservation of mass); this concept was then incorporated into almost all of the subsequent guided inquiry worksheets in the class. The second worksheet was a later-term worksheet that asked students to use Excel to perform calculations and then answer questions involving quantitative and qualitative answers. We first asked the instructors to answer the
GIW questions as if they were students encountering the worksheet for the first time and subsequently asked them to explain their rationale for placing each question on the worksheet.
Interview analyses
We transcribed all the interviews verbatim and analyzed interviewee responses using emergent coding. For the think-aloud interviews, this process was used to identify the general intended thinking processes that occurred as instructors worked and talked through ARS and GIW questions from the perspective of the students in their course. We examined each ARS or GIW question response the instructors gave and identified individual steps. We assigned a code to each step describing what the instructor was doing at that step. We then recognized that sets of individual steps from different questions belonged to more general categories representing the broader types of thinking and sense-making processes described in our theoretical framework (i.e., conceptual reasoning, quantitative reasoning, and metacognitive thinking). In such cases, we grouped them into a more general code category. For example, codes such as “use graphical information to qualitatively explain situation,” “relate information to physical representation,” and “identify relationships between variables” were grouped in the more general code category “conceptual reasoning.” Similarly, codes such as “develop equations to describe phenomena,” “rearrange equation to identify relationships between variables,” and “perform calculations” were grouped together in the more general code category “quantitative reasoning.” The approach of identifying categories from specific thinking processes led to a reliable coding process. By grouping, we were able to develop a general set of codes that connect the data with our theoretical framework. We could then compare thinking processes (i) between the courses representing different disciplines and (ii) between different pedagogical tools within each course. Table 1 provides the final set of code categories for the intended thinking processes during the think-aloud interviews, including a description of each code and a sample interview quote. A sketch of this grouping step appears below.
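To make the grouping step concrete, a minimal sketch is shown below. The specific code strings are paraphrased from the examples in the text; the mapping itself is illustrative and is not the study’s full codebook.

```python
# Illustrative sketch of the grouping step: emergent step-level codes are
# rolled up into the broader framework categories. The code strings are
# paraphrased from the examples in the text; this is not the full codebook.
CODE_CATEGORIES = {
    "use graphical information to qualitatively explain situation": "conceptual reasoning",
    "relate information to physical representation": "conceptual reasoning",
    "identify relationships between variables": "conceptual reasoning",
    "develop equations to describe phenomena": "quantitative reasoning",
    "rearrange equation to identify relationships between variables": "quantitative reasoning",
    "perform calculations": "quantitative reasoning",
}

def categorize(steps):
    """Map each coded step from a question response to its general category."""
    return [CODE_CATEGORIES.get(step, "uncategorized") for step in steps]

# Example: the steps identified in one instructor response.
print(categorize(["perform calculations", "identify relationships between variables"]))
# -> ['quantitative reasoning', 'conceptual reasoning']
```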
Table 1 Code definitions and example responses for ARS questions

Immediate recall: Answering the question from work completed during the immediate class session. Example: “So to answer this, I think I would look to my worksheet, so whatever that model is. So that model with the, oh it’s graphs as I recall, so model with graphs and then I would reference my conversation that I had.”

Recognize concept(s): Explicitly identifying the main concept(s) involved in a question. Example: “Okay, I’m thinking, again, degrees of freedom, but we’ve got multiple phases here and so we need to use Gibbs phase rule …”

Compare available answers for best choice: Reviewing available multiple-choice answers to decide which answer made the most sense. Example: “I think that I would have to say okay, you know, we just went through these four answers from this worksheet, and, you know, we eliminated some of these options basically as being the wrong answer.”

Select information from question: Referring to the question to obtain needed information to work towards the answer. Example: “I’m gonna go back to my problem and look at umm you know, what are the things I’m given umm is there something that makes sense in terms to base this calculation on.”

Conceptual reasoning: After identifying a concept, using fundamentals to reason through to an answer (e.g., using a graphical representation or an equation to think through how the variables relate to one another). Example: “… think about the mass flow rate and use that to say that the mass in and out is gonna be the same. So I’m going to, yeah, focus on that concept of mass conservation here.”

Quantitative reasoning: Developing equations to describe what was happening in the question and possibly also using a numerical calculation. Example: “And in this case I want to do a material balance on C … I’m just gonna write it out, in minus out plus generation minus consumption equals zero …”

Metacognitive thinking: Reflecting on the context in which the question is asked or on the meaning or reasonableness of an answer. Example: “Ultimate aim of the process is to produce dry crystalline sodium bicarbonate so what did I think this process was for, looks like we’re actually trying to make the solid phase as opposed to reducing the concentration in the liquid phase.”

Recall lecture information/prior knowledge: Using information presented in lecture or other course resources to make progress. Example: “Alright so it’s uhh it’s telling me that umm it’s going to be example problem similar to one that we worked in lecture.”

For the reflective interviews, we sought to relate the ways the instructors used the ARS and GIW tools to their priorities about what students would learn and their conceptions about how students learn and the role of teaching in learning (Thompson 1992). Code categories were determined as follows. One researcher coded the interview transcript initially and developed a set of emergent categories based on elements of the instructional system that the instructors mentioned.
The research team then met and reconciled the categories. This process resulted in the following subset of categories that were relevant to this study: instructional scaffolding, constructivism, social interactions, formative assessment, summative assessment, and sense-making.

For both think-aloud and reflective interviews, a single researcher with appropriate expertise performed the coding. A subset of the interview transcripts (20%) was also coded by a second researcher to check interrater reliability (Cohen’s kappa = 0.80 [think-aloud], 0.84 [reflective]).
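For readers unfamiliar with the reliability statistic, below is a minimal sketch of computing Cohen’s kappa for two coders; the code labels and values are invented for illustration, not the study’s data.

```python
# Cohen's kappa for two coders' labels over the same transcript excerpts.
# The labels below are illustrative only.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of excerpts both coders labeled identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

coder_1 = ["conceptual", "quantitative", "recall", "conceptual", "metacognitive"]
coder_2 = ["conceptual", "quantitative", "recall", "quantitative", "metacognitive"]
print(round(cohens_kappa(coder_1, coder_2), 2))  # 0.74 for this toy example
```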
Other measures

We used several other data sources to triangulate our interview data analysis and interpretation. Both classes were observed several times using the Teaching Dimensions Observation Protocol (TDOP, Hora et al. 2013). While the report of TDOP codes for each course is beyond the scope of this article, the observation process allowed researchers to become familiar with the course structure and the context in which the ARS and GIW active learning tools were used. The observations were supported with instructional artifacts, including the course syllabus, all the ARS questions and GIW activities, and the week’s lecture notes for the analyzed GIW activities. Student responses to ARS questions were collected through the web-based platform that each instructor used to assign questions and receive student responses. The response data were used to verify our interpretation of the different intents of the instructors in their use of ARS questions.

Context of tool use
In this section, we describe how each instructor situates the use of ARS and GIW tools within the activity structure of their course. This description is based on analysis of the course observations and course artifacts and is triangulated with the reflective interviews. Table 2 provides a summary of the differences in the context and use of the active teaching tools between the biology and engineering courses. We unpack these enactments when we present the results.
Biology course
The biology course met three times per week, with more traditional lecture periods on Monday and Wednesday and an active learning period of POGIL guided inquiry sessions on Friday. ARS questions were used in most class periods, whereas the GIW tool was used only in the Friday POGIL class periods. Over the 10-week term, students answered a total of 98 ARS questions, 31 of them during the POGIL GIW sessions. They completed nine GIW in total. An average of around 110 out of 162 students attended GIW sessions.

Figure 1 shows an example of a biology ARS question after it was delivered in class. ARS questions were presented to students as PowerPoint slides, and the students answered using clickers. The instructor typically displayed the distribution of students’ choices on the screen shortly thereafter and briefly commented on the correct answer. Students answered between two and five questions per class period. Our think-aloud interview with the biology instructor focused on the ARS questions delivered in the Friday POGIL sessions.

The instructor used a GIW tool to facilitate student activity during the guided inquiry sessions on Fridays. A typical Friday session (50 min) consisted of an introduction (2 min), student group work using the GIW and facilitated by undergraduate learning assistants (15–25 min), ARS questions and group discussion (5–10 min), continued student group work (10–15 min), and wrap-up (2–… min).

Table 2 Alignment across different elements of the two courses

Biology
- Lecture: twice a week, 50 min; 165 students. Content is introduced, and students engage in ARS questions and discussions.
- ARS questions: three times a week; 165 students. “Check in” with students to see if they are correctly interpreting content, or ask students to report their understanding of worksheet questions just completed.
- GIW questions: once a week, 50 min; 165 students. Guide students through biological models, enabling them to engage with the content using disciplinary thinking.

Engineering
- Lecture: twice a week, 50 min; 300 students. Content is introduced, and example problems are solved.
- ARS questions: once a week, 50 min; approx. 150 students/section. Strengthen conceptual understanding by building on topics from lecture through ARS questions about new situations.
- GIW questions: once a week, 50 min; approx. 30 students/section. Scaffolded GIW questions reinforce problem-solving skills/processes instituted in lecture.
… answer questions. Some worksheets also contained “extension questions” that students engaged in outside of class, or in class if they finished the worksheets ahead of other groups. Extension questions typically consisted of two types, classified by the instructor as (i) big picture questions, consisting of open-ended questions to stimulate discussion and look at larger concepts, and (ii) clinical correlation questions, situating concepts in clinical applications.

Figure 2 shows the first part of the GIW tool that we used for the think-aloud interview. The worksheet contained three different models; each model was followed by 2 to 9 questions that students answered based on information from the models. The worksheet ended with six “extension questions” that the students answered outside of class time. Although these extension questions required more critical thinking than the previous questions, they still typically referenced the models contained in the worksheet.

Fig. 1 Example of a biology ARS question as it was delivered in class
Fig. 2 The first part of the biology inquiry-based worksheet used in the think-aloud study
Engineering course
The engineering course had two 50-min lectures, on Monday and Friday, attended by all students. ARS questions were delivered during the 50-min Wednesday sessions, and GIWs were used during the 50-min Thursday sessions. Over the course of the 10-week term, the students answered a total of 31 ARS questions and completed nine GIW in total. An average of 285 out of 307 students responded to questions during the ARS sessions, and almost all students attended the GIW sessions.

In the Wednesday sessions, students were divided into two sections of approximately 150 students, and ARS questions were delivered via the AIChE Concept Warehouse (Koretsky et al. 2014) by the primary instructor to each section separately. Figure 3 shows an example of an engineering ARS question. Students responded to the questions using their laptops, smartphones, or tablets, and the answers were stored in a database. Along with their multiple-choice answer, students were asked to provide a written justification of their answer as well as a confidence score between one and five to indicate how sure they were of their answer. For about half of the ARS questions (15 questions), the engineering instructor used an adaptation of the peer instruction (PI) pedagogy (Mazur 1997), where students first answered a posted question individually, then discussed their answers with their neighbors, and then re-answered the same question (including written justification and confidence). For the 16 non-PI questions, the instructor displayed the responses and discussed the question with the whole class after the students answered individually.

Fig. 3 Example of an engineering ARS question as it was delivered in class

On Thursdays, students attended smaller “studio” sessions of approximately 30–36 students where they completed GIW activities. Most studios were facilitated by a graduate teaching assistant (GTA) who would sometimes briefly (1–… min)
… guiding questions, as opposed to providing students with direct answers to the worksheet questions. Students were given 50 min to work on the worksheets, which were collected at the end of each studio session.

An example of part of a GIW used in the engineering think-aloud interview is shown in Fig. 4. Most worksheets involve a main problem that includes a problem statement with relevant information students need to use to solve the worksheet. The problem is then broken down into different steps that students complete to help them reach an answer. These steps are modeled after problem-solving processes introduced in lecture and may include questions which require students to read and interpret the problem statement, to draw and label a diagram that represents what is happening in the problem, to formulate and solve appropriate equations, and to relate the problem statement to real-world engineering scenarios.
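For reference, the conservation-of-mass statement that anchors these worksheets (and the first GIW discussed above) is the standard textbook relation, not a reproduction of the worksheets themselves:

$$\frac{dm_{\text{sys}}}{dt} \;=\; \dot{m}_{\text{in}} - \dot{m}_{\text{out}} + \dot{m}_{\text{gen}} - \dot{m}_{\text{cons}},$$

where at steady state $dm_{\text{sys}}/dt = 0$, so $\dot{m}_{\text{in}} - \dot{m}_{\text{out}} + \dot{m}_{\text{gen}} - \dot{m}_{\text{cons}} = 0$. This steady-state form is what the instructor quote in Table 1 (“in minus out plus generation minus consumption equals zero”) invokes.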
Results
In the following sections, we answer our two research questions with evidence from our data analysis.
Answer to RQ1: What are the instructors’ intended thinking processes during use of ARS and GIW tools? What are the similarities and differences between instructors?

Our answer to Research Question 1 is based on analysis of the think-aloud interviews for the ARS questions and the GIW questions, triangulated by student responses to ARS questions, researcher in-class observations, and the instructors’ reflective interviews.

Thinking processes elicited from the ARS and GIW questions
Table 3 shows the percentage of questions identified for each of the categories of thinking process during the think-aloud interviews (see Table 1 for category definitions). Results from the ARS questions are shown in the left column for each course, with results from the GIW questions next to them on the right. The majority of ARS questions in the biology course focused on immediate recall (75%). While the biology instructor expected students to be able to recognize a concept in 25% of the questions, there was no evidence that the ARS questions were intended to evoke further conceptual reasoning. In contrast, the engineering instructor rarely sought to elicit immediate recall (7%), but rather to provide students experiences where they needed to select information from questions (47%) and recognize concepts (73%) to prompt conceptual reasoning (73%). In addition, the majority of the questions also included elements of quantitative reasoning (60%).

During her think-aloud interview, the biology instructor showed a wide range of intended scientific thinking processes in responding to the GIW questions, including conceptual reasoning (42%), quantitative reasoning (42%), and metacognitive thinking (32%).
Fig. 4 The first part of one of the engineering inquiry-based worksheets that was used in the think-aloud study
These processes were not sequestered; rather, the instructor integrated each one around thinking about models of the fluid dynamics of vascular function. The worksheet tended to be “stand alone,” with information usually found in the question (95%) rather than intending students to recall information from lecture (21%). The biology worksheet is self-contained in the sense that most of the information students need to complete the worksheet is provided via the models. In contrast, during the think-aloud interview, the engineering instructor intended students to spend the majority of time engaged in quantitative reasoning (60%) with only a small amount of time in conceptual reasoning (5%). There is also less metacognitive thinking in these activities (10%) than in the biology course. In addition, the engineering instructor intended students to reference previous knowledge and information presented in lecture (85%) to a much larger degree than the biology instructor.

While the use of ARS and GIW tools in each course is distinct, inspection of Table 3 shows that in either course, by the time students completed that week’s active learning activity, they were intended to engage significantly around a key disciplinary topic in two of the aspects of thinking and sense-making: conceptual reasoning and quantitative reasoning. Thus, the “coverage” of the topic extends beyond declarative content and patterns of problem solving. Rather, it emphasizes productive ways to think and reason in the discipline. The biology instructor had more explicit intended metacognitive thinking than the engineering instructor (32 vs. 10%). However, for all 31 ARS questions, the engineering instructor had students rate their confidence (see Fig. 3), so while he did not allude to metacognitive thinking as much during the think-aloud interviews, some of this type of thinking was built into the technology tool.

Student ARS performance
We next present student performance data from the ARS questions for each course. These data show differences in the types of questions asked and in implementation, and they reflect the differences in intended thinking processes discussed previously. Figure 5 shows the percentage of students who answered correctly for each question when the ARS questions were delivered in class. Results from the 31 ARS questions delivered during Friday POGIL sessions in the biology course are shown chronologically with red diamonds (labeled BIO). Students performed well in general, averaging 89.3% correct (solid red line) with a standard deviation of 10.5. In the engineering course, students’ initial responses are shown with solid dark blue circles (ENGR pre-PI). They averaged 58.5% correct (solid dark blue line) on these questions with a standard deviation of 20.2. For the 15 questions where the engineering instructor used the PI pedagogy, the post-PI question results are shown by powder blue circles. Their average correct was 80.0% (powder blue line) with a standard deviation of 14.8%. Of the questions where PI was used, scores increased by an average of 18.0%, showing the benefit of peer discussion, although there were two questions where scores significantly decreased (Q12 and Q21).

The nature of the ARS questions is clearly different in these two courses: the engineering questions were more difficult and took more class time, and when peer instruction pedagogy was used, they were asked twice. These differences reflect both the context, where a weekly class period was dedicated to the ARS questions in engineering, and the intended thinking processes identified during the think-aloud interviews with the instructors, shown in Table 3. We next explore the reflective interviews with the instructors to see how these uses align with their conceptions of how these tools fit into their courses to produce learning.
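The summary statistics above are simple percent-correct aggregates. A minimal sketch of that arithmetic is shown below, using invented scores rather than the study’s data.

```python
# Illustrative arithmetic only (invented numbers, not the study's data):
# summarizing ARS percent-correct scores and peer-instruction (PI) gains
# in the manner Fig. 5 reports them.
import statistics

pre_pi  = [52.0, 61.5, 48.0, 70.0, 55.0]   # percent correct before peer discussion
post_pi = [75.0, 83.0, 60.0, 88.0, 72.0]   # percent correct after peer discussion

print(f"pre-PI mean {statistics.mean(pre_pi):.1f}%, SD {statistics.stdev(pre_pi):.1f}")
print(f"post-PI mean {statistics.mean(post_pi):.1f}%, SD {statistics.stdev(post_pi):.1f}")

gains = [post - pre for pre, post in zip(pre_pi, post_pi)]
print(f"average PI gain {statistics.mean(gains):.1f} percentage points")
```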
Table 3 Percent of questions identified for coded thinking processes for ARS questions and GIW questions in the biology (BIO) and engineering (ENGR) courses. Definitions of codes are presented in Table 1

Thinking process | BIO ARS (31 questions) | BIO GIW (19 questions) | ENGR ARS (15 questions) | ENGR GIW (20 questions)
Immediate recall | 75% | 47% | 7% | 45%
Recognize concept(s) | 25% | – | 73% | –
Compare available answers for best choice | 22% | – | 20% | –
Select information from question | – | 95% | 47% | 65%
Conceptual reasoning | – | 42% | 73% | 5%
Quantitative reasoning | – | 42% | 60% | 60%
Metacognitive thinking | – | 32% | – | 10%
Recall lecture information/prior knowledge | – | 21% | – | 85%

Instructor perceptions of the ARS and GIW questions

In this section, we present excerpts and analysis of the four reflective interviews with each instructor: the year 1
general interview (labeled g-pre) and the year 3 general interview (g-post) focused on more general questions about their instructional practices and beliefs, while the year 2 post think-aloud interviews (post think-aloud ARS or GIW) specifically addressed the instructors’ intent in using these tools.

ARS questions
The reflective interviews corroboratethe identified differences between the courses in theways the ARS questions engaged students. In the biologycourse, ARS questions were used mainly to assess if stu-dents were correctly interpreting and understandingworksheet questions or to ask students to recall the ma-terial recently introduced in the worksheet. Each of thebiology ARS questions was applied once. When ques-tioned about her rationale behind designing ARS ques-tions, the biology instructor acknowledged that they canbe a helpful tool to use in large classes.Biology Instructor (post think-aloud ARS): ...and soyou know the value of clicker questions in rooms ofgreater than 100 people is … in a really big class, Ican ’ t see their sheets, and so I don ’ t know whatthey ’ re thinking. And it ’ s really useful to check in withthem in that way . [italic added for emphasis]She alluded to the role of ARS questions as “ conceptchecking, ” and pointed out that she regularly uses themto “ check in ” with students to ensure they are engagedand following along in class.The engineering course presented ARS questions thatafforded students the opportunity to apply learned con-cepts to new scenarios towards improving students ’ con-ceptual understanding. When the engineering instructorwas asked about how he wants students to be engagedwhile solving ARS questions, he explained that he wanted to push students beyond procedural problemsolving:Engineering Instructor (post think-aloud ARS): I guesstrying to get them to start to create that knowledgestructure in their head. That there are certain conven-tions and there are certain cues that are gonna helpthem bend those problems into, you know, help themfind a solution... Trying to provide cues that are similarto things they ’ re gonna see in other parts of the class,homework and exams and so forth, to get them to honein on those specific concepts and then in some casesmanipulate or examine at a level that they ’ re notgonna get just by plugging and chugging into thoseequations. [italic added for emphasis]This line of thinking tied into comments in the year 1general reflective interview where the engineering in-structor referred several times to the ability to conceptu-ally reason by identifying a concept and applying it to anew situation, such as in the following excerpt:Engineer-ing instructor (g-pre): If you can understand thefundamentals, the fundamental concepts that aregoverning a process then, you know, if you start tochange all these other things, if you can rememberthat kind of core concept then that goes a long way tocarrying these through being able to reason through asolution , where if I just know, I just have some equationmemorized...that ’ s gonna fall apart, you know, whenyou get to a situation where that equation doesn ’ texactly apply. [italic added for emphasis]This emphasis on reasoning or sense-making from foun-dational concepts is consistent with the engineering in-structor ’ s choice to devote one class period a week toactivity around ARS questions. Fig. 5
Fig. 5 ARS question student performance data
The interpretation of the different use of the ARS questions by the two instructors is consistent with the analysis of intended thinking processes from the think-aloud interview (Table 2) and the data on the percent of correct responses from students (Fig. 5). The biology instructor used the ARS as a periodic check-in with students, whereas the engineering instructor used ARS questions more extensively as an opportunity for students to develop their understanding and "create that knowledge structure" they needed for adaptive transfer and problem solving.

GIW questions
As we found with the ARS questions, the instructors also utilized guided inquiry worksheets differently. During the interview, we asked the biology instructor why she recommended the specific guided inquiry worksheet shown in Fig. 2 for the think-aloud interview. She explained that she thought it was the epitome of a typical GIW for the course.

Biology Instructor (post think-aloud GIW): And so what I really like about POGIL that ... this worksheet adheres to is you can get everything you need from this, you know, strictly from the models, and your brain, and thinking about things. And maybe if you don't know what these vessels are, yeah, you could look them up, but you probably do, you know, based on where my students are at. And so, like that's what I like. This is very much a standalone.

Here, she expresses how the inquiry-based worksheets in the biology course are designed to be self-contained; there is less emphasis on connecting to the information presented in previous lectures or other places and more emphasis on sense-making or, as she says, "thinking about things."

In the engineering course, the GIWs were used in studio sections, where the larger class was broken into smaller sections of around 30 students to work on the worksheets. During the interview, the engineering instructor described the relationship between the GIWs and other aspects of the course, especially how they are tied closely to information introduced in lecture.

Engineering Instructor (post think-aloud GIW): I view studio as a really scaffolded and supported place for students to have their first experience applying the principles from lecture. So you kind of get all this information and not a lot of chances to engage it in lecture and before you get a blank problem statement from the homework assignment and are left with a blank page [you get a chance] to walk through the steps or the concepts that are gonna have to be applied as we move forward to homework and exams. Having it be a place where they've got classmates they can bounce ideas [off of] ...

Here, he clarifies that he views the guided inquiry worksheets as a useful step for students between being shown ways to solve problems in lecture and applying these problem solving methods on homework assignments and exams. The engineering instructor further elaborated on how he envisions the GIW tool sitting within the instructional processes in the year 3 reflective interview:

Engineering instructor (g-post): In these studios where students basically come in and they're working on a worksheet on a problem that's related to things that we've covered in class, it's pretty scaffolded, but there's some open ended components, but they're kind of working together in groups of three kind of independently with support from a T.A. during that time.

As this excerpt indicates, when considering active learning tools, it is useful to consider other important aspects of the instructional system, as we do next.

Answer to RQ2: In what ways do the intended sense-making processes from the ARS and GIW tools align with the instructors' broader perspectives and beliefs about the instructional system for their courses?

Our answer to Research Question 2 is based on analysis of the year 1 general interview (g-pre) and the year 3 general interview (g-post).
Beliefs about ARS and GIW as active learning tools in instructional systems
In this section, we explore more broadly what the instructors conceive as elements of the instructional system and how the ARS and GIW active learning tools fit within those broader elements. Here, the conceptions of the two instructors generally align.

Table 4 shows category codes for elements of the instructional system and examples of the corresponding instructor beliefs that emerged from analyzing the two reflective interviews with each instructor. The table also provides exemplar excerpts from each instructor. Both instructors expressed that the tools provided Instructional Scaffolding that helped guide students' learning. The excerpt from the biology instructor indicates how she sees scaffolding from the GIW tool as necessary to provide students "a structure to follow," while the engineering instructor describes the role of each tool to progressively provide students "a learning unit" that created a "cohesive, weekly routine."

Both instructors used language that was consistent with a constructivist perspective of learning; the biology instructor often referred to students "constructing their own knowledge," and several times the engineering instructor indicated that he aimed to help students "develop knowledge structures."
In addition, both instructors valued the role of Social Interactions in constructing understanding, expecting students "to interact, not just with the content but with each other to make meaning of the content" (biology instructor) and to be "talking with their group and grappling with the material" (engineering instructor). Both instructors allude to how instructional tools can provide the impetus for students to interact with one another in sense-making processes.

Both instructors also suggested that the data from ARS questions were useful for Formative Assessment, to "see what they're [i.e., the students are] thinking and where the misconception might be" (biology instructor). The ARS tool allows the instructor to have her or his "finger on the pulse of the class" (engineering instructor) and gives students the "opportunity to assess their own learning" (engineering instructor). The engineering instructor also tied this aspect of the instructional system to Social Interactions, stating that ARS questions give "them an opportunity to communicate what they've learned to their peers."

Both instructors recognized the need to develop disciplinary sense-making aligned with their expressed experiences with Summative Assessments. As the biology instructor states, "I started crafting exams and assessments that were, you know, more about how could students predict, could students look at a set of data and then make inferences from it, and I came to realize that they couldn't really do that." This realization motivated her to implement POGIL with GIW in her course to help students develop these skills. Similarly, the engineering instructor recalled a time when he received pushback from students for an exam that was perceived as "unfair." He explained his "rationale" to them as follows: "if you understood this, the concept from this application then...you know, I was looking to see if you could transfer it and use it over here."
Table 4 Instructional system code categories and examples of corresponding beliefs from biology and engineering instructors
Instructional scaffolding
Biology instructor: "... with my students is that I have to give them structure [with GIW activities]. If I don't give them structure to follow, they don't know what to do." (g-pre)
Engineering instructor: "...trying to have a cohesive, kind of weekly routine of content delivery: reinforced conceptual understanding in recitation [with ARS questions], scaffolded application in studio [with GIW questions]. So kind of the idea those two lectures, the recitation, and the studio as being like a learning unit...and then the homework follows that..." (g-post)

Social interactions
Biology instructor: "I expect that they talk to one another and I expect that they synthesize information from whatever we've talked about earlier that morning with what we've done before." (g-pre) "So every day they're expected to interact, not just with the content but with each other and to make meaning of the content." (g-post)
Engineering instructor: "I'll just have [GIW] worksheets where it's just things like sketch what you think this, you know, the relationship between these two variables is, or...you know, just doing stuff where they're talking with their group and grappling with the material as opposed to me." (g-post)

Formative assessment
Biology instructor: "But I'm curious what they do know, then based on that data I will choose to, when we start the next day, amend the plan. If it means that we have to have a clicker question the next time to probe this more thoroughly or if maybe I just got to throw it out there and see what they're thinking and where the misconception might be or why they're answering it the way they're answering it." (g-post)
Engineering instructor: "You get real feedback [from the ARS tool], so do they understand it? 70% of them do or got the right answer, and 30% don't, and so you know you have your finger on the pulse of the class. You know, you're assessing them closer to when you've covered the material, and you're giving them an opportunity to assess their own learning, and so that, and you're giving them an opportunity to communicate what they've learned to their peers." (g-pre)

Summative assessment
Biology instructor: "When I entered into graduate school, I began teaching anatomy and physiology, which I think traditionally can be looked at as a very memorization-based discipline for anatomy, but for physiology it's process. And then I started crafting exams and assessments that were, you know, more about how could students predict, could students look at a set of data and then make inferences from it, and I came to realize that they couldn't really do that." (g-pre)
Engineering instructor: "I think it was an exam question, and you know, some students complaining about it being unfair, you know, we haven't covered this in class or whatever, and then just going through what my rationale was. Like if you understood this, the concept from this application then...you know, I was looking to see if you could transfer it and use it over here." (g-pre)

Sense-making processes
Biology instructor: "We need to not just be looking at content when we do that. We need to be thinking about what does it mean to think like a biologist? You know, what pieces are being gathered or created here, and how are we gonna further them with this course. Content builds, for sure. But what about the thinking like a biologist?" (g-pre) "When I throw my clicker questions out there I say to them, okay, I'm going to need you to defend your choices after these questions come up, I want to hear from you, so I'm prompting them to be reflective about their learning." (g-post)
Engineering instructor: "and [answering ARS questions] they develop, I think, confidence and a sense [of] responsibility that, you know, I'm not just going to be told the answer here; I have to figure out what the answer is and I think by instilling that in them through this class and the classes that follow they develop skills that they wouldn't develop if you were in a straight lecture classroom." (g-pre) "And it's kind of engineering problem solving is also, a big part of [CBEE] 211 is just being able to take a lot of information, break it up into the parts and map it to, again, those concepts that are kind of fundamental, and then use that information to come to a solution. I think those are the big things I hope they take away from it." (g-post)
Importantly, both instructors are holding students accountable for higher-level disciplinary thinking processes when they test students, thus aligning the sense-making processes they seek to develop with the active learning tools to the questions on the exams.

In summary, both instructors value and seek to cultivate sense-making processes. The biology instructor describes these processes as "thinking like a biologist," which includes defending answer choices and prompting students to be reflective. The engineering instructor expects students to "figure out what the answer is" by "being able to take a lot of information, break it up into the parts and map it to, again, those concepts that are kind of fundamental, and then use that information to come to a [numerical] solution."

Discussion
In this study, we investigated how two instructors used ARS and GIW tools to identify and compare the ways that they intended for students to "get to the answer." The data show that while the same active learning tools were used in both courses, the way in which students were being asked to engage in problem solving and sense-making varied. In the biology course, ARS questions were used primarily to "check in" with students to see if they were correctly interpreting the worksheet content (e.g., graphs and models) or to ask students to recall the material recently introduced. In the engineering course, ARS questions asked students to apply the concepts covered in lecture to new scenarios towards improving students' conceptual understanding. These uses reflect the activity structure in each course. The biology course centered on briefly using clickers in almost every class to support instruction (lecture or POGIL). In the engineering course, 1 day and 25% of instruction time was devoted to ARS questions, and the instructor asked students to engage in deeper ways by providing written justifications and confidence ratings.

In the biology course, the GIWs were primarily used in stand-alone activities, and most of the information necessary for students to answer the questions was contained within the worksheet. Typically, the information was presented in a context that aligned with a disciplinary model. In the engineering course, the instructor intended for students to reference their lecture notes and rely on their conceptual knowledge of fundamental principles from the previous ARS class session in order to successfully answer the GIW questions. The biology instructor used the worksheets as an opportunity for integrated development of students' conceptual reasoning, quantitative reasoning, and metacognitive thinking. On the other hand, the engineering instructor focused primarily on cultivating aspects of quantitative reasoning for problem solving.

In our analysis, we position ARS and GIW as tools that are utilized within instructional systems to produce learning. We have shown that the specific intent of the biology instructor when she uses these tools is very different than that of the engineering instructor. However, common threads emerged that can be used as ways to consider instruction with active learning tools. Both instructors use these tools to build towards the same basic disciplinary thinking and sense-making processes of conceptual reasoning, quantitative reasoning, and metacognitive thinking. Conceptual reasoning processes that were identified in the think-aloud interviews included intending students to use graphical information to qualitatively explain a situation, relate information to a physical representation, and identify relationships between variables. Quantitative reasoning processes included developing equations to describe phenomena and manipulating equations to reveal the relationship between variables. Metacognitive thinking included considering alternative possible solution strategies and reflecting on the reasonableness of an answer value in relation to a physical system.

Both instructors also clearly intended students to interweave these thinking and sense-making processes. The engineering course design was more sequential: students engaged in conceptual reasoning processes during ARS sessions and then were expected to recall those foundational concepts as they quantitatively reasoned through the GIW activity in studio the following day.
The biology course design used "POGIL Fridays" to provide a more integrated active learning experience where conceptual reasoning, quantitative reasoning, and metacognitive thinking were more interlocked.

Both instructors clearly alluded to the value of disciplinary thinking processes in each of their general reflective interviews. However, they did not explicitly identify conceptual reasoning, quantitative reasoning, or metacognitive thinking, nor did they appear to make these connections in the post think-aloud interviews when they were asked more specifically about the intent of the ARS and GIW tools. Thus, the incorporation of conceptual reasoning, quantitative reasoning, and metacognitive thinking appears to be tacit, even for these experienced and highly regarded instructors. We suggest that more direct and explicit emphasis on the ways active learning tools elicit these types of thinking would be beneficial as instructors design activities and integrate them into courses.

Causes of difference in tool use
Hypothetically, we might ask, "If we put one of these instructors in the other's classroom, how similar would their use of the ARS and GIW tools appear in that different context?"
There are several legitimate avenues of inquiry that could be pursued to answer this question. We draw from the extant literature to identify these avenues and assert that considering this complex question from several perspectives is productive.

First, we might consider the instructors' beliefs and knowledge. As the set of responses in Table 4 indicates, both instructors demonstrated learner-centered beliefs oriented towards learning facilitation as opposed to teacher-centered beliefs oriented towards knowledge transmission (Prosser and Trigwell 1993). While they shared common orientations, there could be more subtle differences in their beliefs. Speer (2008) suggests a more fine-grained characterization of an instructor's "collection of beliefs" is needed to connect beliefs to specific instructional design choices. Such characterization could provide information about why differences between these instructors' use of the tools emerged. Alternatively, the instructors' designs may be influenced by their knowledge about an educational innovation. Rogers (2003) identifies three types of knowledge needed to implement an innovative tool: awareness knowledge (that the tool exists), how-to knowledge (how to use the tool), and principles knowledge (what purpose the tool serves). In their interviews, each instructor clearly demonstrated awareness and principles knowledge, but differences in how-to knowledge may have led to different enactment strategies. How-to knowledge can be tied to normative use in the department and in the discipline (Norton et al. 2005). For example, there may be more (or different) access to POGIL workshops in biology than in engineering. Further investigation of the degree to which detailed instructor beliefs and how-to knowledge influence the choice and use of active learning tools is warranted.

Second, we might consider the different disciplinary contexts of the courses, i.e., biology vs. engineering. The National Research Council (2012) reports that while there are many common pedagogical approaches across science and engineering, there are also "important differences that reflect differences in their parent disciplines and their histories of development" (p. 3). Schwab (1964) argues that each discipline has a unique "structure" leading to particular ways of thinking. Specifically, he distinguishes between thinking associated with "disciplined knowledge (in biology) over the know-how in the solving of practical problems" (p. 267) in engineering. Ford and Forman (2006) extend this framing to disciplinary practices. Each discipline has a unique set of fundamental and central practices that need to be articulated and incorporated into classroom activity. These sociocultural practices provide access to discipline-specific ways of thinking, knowing, and justifying. They state that a central goal of education is that students develop "a grasp of practice," which includes both disciplined knowledge and "know-how" (p. 27). This line of inquiry suggests that investigations are needed to elucidate the productive ways active learning tools can support disciplinary practices and the ways those uses can differ among STEM disciplines or among courses within a discipline.

Third, we might consider how the active learning tools were situated within each course's schedule and institutional resources. The biology class met only in single large-class sections and used undergraduate learning assistants to support POGIL Fridays.
The engineering course had dedicated smaller studio sections, which were supported by graduate teaching assistants. These different contexts are largely determined by how each department organized classes and support for teaching, and they would likely take sustained effort for an individual instructor to change. Since each course relied upon pedagogically trained student instructors to engage student groups during the use of GIW tools, one of the instructors' roles was to orchestrate and manage an instructional team. In large courses, productive ways to engage the instructional team can become an integral part of incorporating active learning tools (Seymour 2005). In addition, each of the student instructors brings their own knowledge and beliefs about learning to this work (Gardner and Jones 2011). Coordinated activity within the department, college, or university, such as programmatic professional development of student instructors, can become a valuable resource. Research is needed to better understand the ways these greater organizational structures enable or constrain the use of active learning tools.

Limitations
This study only examined the practices of two instructors within the same institution. It would be useful to verify the findings with a larger sample of instructors and courses that fit within the criteria of the study. This study focused on the intent of the instructors through think-aloud and reflective interviews triangulated with other data sources. In both courses, students were regularly doing work where they were interacting in small groups. It would be useful to see to what degree students were taking up the thinking and sense-making processes of conceptual reasoning, quantitative reasoning, and metacognitive thinking. This take-up clearly depends on the social aspects of learning, involving interactions between the students themselves and with the instructor. It would be useful to examine what types of moves by students promote or short-circuit these sense-making processes amongst the group, as well as to identify productive ways for an instructor to intervene to facilitate thinking. Finally, while the same three general intended sense-making processes were identified in both the biology and engineering courses, their manifestation undoubtedly depends on the nature of the specific practices of each discipline.
Articulation of the specific ways that practicing biologists and engineers engage in disciplinary sense-making could inform more productive uses of these active learning tools.
Recommendations
This study has led to the following recommendations for post-secondary instructors seeking to integrate active learning tools into STEM courses:
Recommendation 1: When transitioning to active learning, it is common to think about instructional choices in terms of "pedagogies" like POGIL or Peer Instruction or active learning "technologies" like clickers. We encourage instructors to think about these choices in terms of pedagogically and technologically based active learning "tools." A tool should serve definite educational purposes that are defined prior to use. As with any type of tool, procedural competence is necessary. However, as illustrated in this study, these tools can be used in several ways, and their use can become more sophisticated with time.

Recommendation 2: A tool-based orientation should go beyond procedures and prescriptions for delivery. Active learning tools can cultivate disciplinary thinking and sense-making processes that include conceptual reasoning, quantitative reasoning, and metacognitive thinking. Importantly, these processes can bootstrap one another towards deeper understanding (Veenman 2012; Zimmerman 2000). Thus, in designing activity for students, instructors should consider how to progressively integrate the different types of sense-making processes to support one another towards doing disciplinary work and building disciplinary understanding. Integration can be achieved either through a sequence of activities, as the engineering instructor did (i.e., conceptual reasoning with ARS followed by quantitative reasoning with GIW), or within a single activity, as the biology instructor did (i.e., conceptual reasoning, quantitative reasoning, and metacognitive thinking with POGIL).

Recommendation 3: Active learning tool use needs to account for course structure and context, where deliberate choices support learning goals. The biology instructor enacted POGIL Fridays within a standard MWF lecture schedule. The engineering instructor had a split class on Wednesdays to support use of the ARS tool for conceptual understanding and smaller studio sessions on Thursdays for guided inquiry. Instructors should think about their course structures and, if possible, work with administrators to adapt them for better alignment with the tools that support instructional goals.

Recommendation 4: In using active learning tools to promote disciplinary sense-making, instructors of all levels of experience should take a reflective and iterative view of their instructional practice. For example, both instructors studied here were acknowledged by students and their peers as excellent, a characterization that was supported by the interview data. But, even so, they could reflect on ways to possibly shift their activity with active learning tools to better align with learning goals. The biology instructor might push students towards conceptual reasoning in her delivery of ARS questions, and the engineering instructor might modify his GIWs with more emphasis on conceptual reasoning and metacognitive thinking. Rather than viewing such changes in instruction as inherently a criticism of teaching prowess, instructors should view ongoing adjustments as a characteristic of masterful practice.

Acknowledgements
The authors are grateful to RMC Research, who conducted, audio-recorded, and transcribed the year 3 general reflective interviews; to Jana Bouwma-Gearhart, who provided comments on an early version of the manuscript; and to the two instructors who kindly agreed to allow us insight into their teaching practice.
Funding
This work was conducted with support from the National Science Foundation under grant DUE 1347817. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
Availability of data and materials
The datasets generated and/or analyzed during the current study are not publicly available because this is still an active project and a public release would violate the terms of our IRB approval. Some parts of the data set are available from the corresponding author on reasonable request.
Authors' contributions
All authors made substantial contributions to the article and participated in the drafting of the article. All living authors read and approved the final manuscript.
Ethics approval and consent to participate
This study has been approved by the Institutional Review Board (IRB) at the authors' institute (study …).

Competing interests
The authors declare that they have no competing interests.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Author details
School of Chemical, Biological, and Environmental Engineering, Oregon State University, Corvallis, OR 97331, USA. College of Education, Oregon State University, Corvallis, OR 97331, USA.
Received: 31 December 2017; Accepted: 22 March 2018
References
Abraham, MR, & Renner, JW. (1986). The sequence of learning cycle activities in high school chemistry. Journal of Research in Science Teaching, (2), 121–.
Castillo-Manzano, JI, Castro-Nuño, M, López-Valpuesta, L, Sanz-Díaz, MT, & Yñiguez, R. (2016). Measuring the effect of ARS on academic performance: A global meta-analysis. Computers & Education, , 109–.
arXiv preprint arXiv:1704.05103.
Chi, MT, & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist, (4), 219–.
Cognition, metacognition, and culture in STEM education: Learning, teaching and assessment (vol. 24). Cham, Switzerland: Springer International Publishing AG.
Douglas, EP, & Chiu, CC (2009). Use of guided inquiry as an active learning technique in engineering. In Proceedings of the 2009 Research in Engineering Education Symposium.
Dreyfus, BW, Elby, A, Gupta, A, & Sohr, ER. (2017). Mathematical sense-making in quantum mechanics: An initial peek. Physical Review Physics Education Research, (2), 020141.
Duncan, D (2005). Clickers in the classroom: How to enhance science teaching using classroom response systems. New York: Addison Wesley and Benjamin Cummings.
Eberlein, T, Kampmeier, J, Minderhout, V, Moog, RS, Platt, T, Varma-Nelson, P, & White, HB. (2008). Pedagogies of engagement in science. Biochemistry and Molecular Biology Education, (4), 262–.
Felder, RM, & Brent, R. (2010). The National Effective Teaching Institute: Assessment of impact and implications for faculty development. Journal of Engineering Education, (2), 121–.
Ford, MJ, & Forman, EA. (2006). Redefining disciplinary learning in classroom contexts. Review of Research in Education, (1), 1–.
Freeman, S, Eddy, SL, McDonough, M, Smith, MK, Okoroafor, N, Jordt, H, & Wenderoth, MP. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, (23), 8410–.
Gardner, G, & Jones, MG. (2011). Science Educator, (2), 31.
Geertz, C (1994). Thick description: Toward an interpretive theory of culture. In Readings in the philosophy of social science (pp. 213–).
Meta-analysis in social research. Beverly Hills: Sage.
Grawe, N. (2016). Developing quantitative reasoning. Retrieved 19 December 2017, from https://serc.carleton.edu/sp/library/qr/index.html
Hake, RR. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, (1), 64–.
Hora, MT, Oleson, A, & Ferrare, JJ (2013). Teaching dimensions observation protocol (TDOP) user's manual. Madison: Wisconsin Center for Education Research.
Hunsu, NJ, Adesope, O, & Bayly, DJ. (2016). A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect. Computers & Education, , 102–.
Enhancing STEM Education at Oregon State University. Paper presented at the 2015 ASEE Annual Conference and Exposition, Seattle, Washington. https://doi.org/10.18260/p.24002
Koretsky, MD, Brooks, BJ, & Higgins, AZ. (2016). Written justifications to multiple-choice concept questions during active learning in class. International Journal of Science Education, (11), 1747–.
"Promoting active learning using the results of physics education research." UniServe Science News 13 (1999).
Lehrer, R. (2009). Designing to develop disciplinary dispositions: Modeling natural systems. American Psychologist, (8), 759.
Lesh, RA, & Doerr, HM (2003). Beyond constructivism: Models and modeling perspectives on mathematics problem solving, learning, and teaching. Mahwah, NJ: Lawrence Erlbaum.
Lewis, SE. (2011). Retention and reform: An evaluation of peer-led team learning. Journal of Chemical Education, (6), 703–.
Lewis, SE, & Lewis, JE. (2005). Departing from lectures: An evaluation of a peer-led guided inquiry alternative. Journal of Chemical Education, (1), 135.
Lewis, SE, & Lewis, JE. (2008). Seeking effectiveness and equity in a large college chemistry course: An HLM investigation of peer-led guided inquiry. Journal of Research in Science Teaching, (7), 794–.
… Zhang, H. (2009). Clickers in college classrooms: Fostering learning with questioning methods in large lecture classes. Contemporary Educational Psychology, (1), 51–.
Peer instruction: A user's manual. Upper Saddle River: Prentice Hall.
National Research Council (1996). National science education standards. Washington, DC: National Academies Press.
National Research Council (2000). How people learn: Brain, mind, experience, and school: Expanded edition. Washington, DC: National Academies Press.
National Research Council (2011). Learning science through computer games and simulations. Washington, DC: National Academies Press.
National Research Council (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. Washington, DC: National Academies Press.
National Research Council (2013). Next generation science standards: For states, by states. Washington, DC: National Academies Press.
Nelson, C, Hartling, L, Campbell, S, & Oswald, AE. (2012). The effects of audience response systems on learning outcomes in health professions education. A BEME systematic review: BEME guide no. 21. Medical Teacher, (6), e386–e405.
Nicol, DJ, & Boyle, JT. (2003). Peer instruction versus class-wide discussion in large classes: A comparison of two interaction methods in the wired classroom. Studies in Higher Education, (4), 457–.
Norton, L, Richardson, JTE, Hartley, J, Newstead, S, & Mayes, J. (2005). Teachers' beliefs and intentions concerning teaching in higher education. Higher Education, (4), 537–.
… Tsourlidaki, E. (2015). Phases of inquiry-based learning: Definitions and the inquiry cycle. Educational Research Review, , 47–.
Audience response systems in higher education: Applications and cases (p. 187).
Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, (3), 223–.
Prosser, M, & Trigwell, K. (1993). Research and Development in Higher Education, , 468–.
Rogers, EM (2003). Diffusion of innovations. New York: The Free Press.
Russ, RS, & Odden, TOB. (2017). Intertwining evidence- and model-based reasoning in physics sensemaking: An example from electrostatics. Physical Review Physics Education Research, (2), 020105.
Schoenfeld, AH (1987). What's all the fuss about metacognition? In AH Schoenfeld (Ed.), Cognitive science and mathematics education (pp. 198–).
Schwab, JJ (1964). In The structure of knowledge and the curriculum (pp. 6–).
Seymour, E (2005). Partners in innovation: Teaching assistants in college science courses. Lanham, MD: Rowman & Littlefield.
Speer, NM. (2008). Connecting beliefs and practices: A fine-grained analysis of a college mathematics teacher's collections of beliefs and their relationship to his instructional practices. Cognition and Instruction, (2), 218–.
Veenman, MVJ (2012). In Metacognition in science education (pp. 21–).
… Tsourlidaki, E. (2015). Identifying potential types of guidance for supporting student inquiry when using virtual and remote labs in science: A literature review. Educational Technology Research and Development, (2), 257–.
Zimmerman, C. (2000). The development of scientific reasoning skills. Developmental Review, (1), 99–.