Exploring one aspect of pedagogical content knowledge of teaching assistants using the Conceptual Survey of Electricity and Magnetism
Nafis I. Karim, Alexandru Maries, and Chandralekha Singh
Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, Pennsylvania 15260, USA
Department of Physics, University of Cincinnati, Cincinnati, Ohio 45221, USA

(Received 27 June 2017; published 4 April 2018)

The Conceptual Survey of Electricity and Magnetism (CSEM) has been used to assess student understanding of introductory concepts of electricity and magnetism because many of the items on the CSEM have strong distractor choices which correspond to students' alternate conceptions. Instruction is unlikely to be effective if instructors do not know the common alternate conceptions of introductory physics students and explicitly take into account common student difficulties in their instructional design. Here, we discuss research involving the CSEM to evaluate one aspect of the pedagogical content knowledge of teaching assistants (TAs): knowledge of introductory students' alternate conceptions in electricity and magnetism as revealed by the CSEM. For each item on the CSEM, the TAs were asked to identify the most common incorrect answer choice selected by introductory physics students if they did not know the correct answer after traditional instruction. Then, we used introductory student CSEM post-test data to assess the extent to which TAs were able to identify the most common alternate conception of introductory students in each question on the CSEM. We find that the TAs were thoughtful when attempting to identify common student difficulties, and they enjoyed learning about student difficulties this way. However, they struggled to identify many common difficulties of introductory students that persist after traditional instruction.
We discuss specific alternate conceptions that persist after traditional instruction, the extent to which TAs were able to identify them, and results from think-aloud interviews with TAs, which provided valuable information regarding why the TAs sometimes selected as most common certain alternate conceptions that were instead very rare among introductory students. We also discuss how tasks such as the one used in this study can be used in professional development programs to engender productive discussions about the importance of being knowledgeable about student alternate conceptions in order to help students learn. Interviews with TAs engaged in this task, as well as our experience with such tasks in our professional development programs, suggest that they are beneficial.

DOI: 10.1103/PhysRevPhysEducRes.14.010117
I. INTRODUCTION

A. Graduate teaching assistants
Graduate students in physics across the United States have long played an important role in educating the next generation of students. In particular, in the U.S. it is quite common for physics graduate teaching assistants (TAs) to teach introductory physics recitation or lab sections, which typically have lower enrollments than the "lecture" component of the course (20–40 students compared to 100 or more in a lecture). In addition to the graduate TAs, in the last two decades, undergraduate TAs [sometimes referred to as learning assistants (LAs)] have also played a role in educating students by, e.g., assisting faculty members in teaching large classes. Appropriate professional development of these TAs to help them perform their duties effectively is an important task. Physics education researchers have been involved in research on identifying common beliefs and practices among physics TAs that have implications for effective teaching [1–…]. For example, TAs often think about teaching from an instructor's perspective and think that if they know the material and can explain it to their students in a clear manner, it will be sufficient to help their students learn [1,3] [note that throughout this paper, "student(s)" refers to introductory physics student(s), and "recitations or labs" refers to "introductory physics recitations or labs"]. Also, while graduate TAs are able to recognize useful solution features and articulate why they are important when looking at sample student solutions provided to them, they do not necessarily include those features in their own solutions.

Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.
[…16] followed by a quiz in the recitation, the TAs often have considerable flexibility in how to perform their recitation duties. For example, many instructors meet with the TA only briefly at the beginning of the semester to outline general guidelines (e.g., answer student questions on the homework, solve problems on the board), and the TAs are left to their own devices for the rest of the semester, except for some communication with the course instructor via email or during the grading of the exams. Thus, if TAs are knowledgeable about effective instructional approaches, they can make a significant contribution to students' learning of physics in the recitations because they often have sufficient flexibility to lead the recitation in a manner that they think is conducive to student learning.

To help TAs learn about effective pedagogy, many institutions offer professional development programs, which are sometimes discipline specific [9,17–…]. […] TAs' conceptions regarding students' difficulties [20]. For example, TAs may be largely unaware of certain student alternate conceptions. If professional development instructors preparing TAs discuss students' alternate conceptions and engage the TAs in discussions about how to help them learn, the TAs may be better prepared to conduct their teaching duties. It is even possible that in order to convince the TAs, the professional development instructors may need to have TAs reflect upon quantitative data on student performance, which show that those alternate conceptions are common. This type of activity in TA professional development programs has the potential to enhance TAs' teaching effectiveness as they design, adopt, and adapt activities to help students develop a robust knowledge structure.
Similarly, if TA professional development instructors are aware that TAs already know about certain student alternate conceptions, those need only be discussed briefly. Thus, by focusing on what TAs know and do not know and gradually building and strengthening different aspects of their pedagogical content knowledge (PCK) [21,22] (more about PCK in the next section), they can be guided to learn and implement effective pedagogies. These considerations motivated us to carry out the research study discussed here using the Conceptual Survey of Electricity and Magnetism (CSEM), which is one of the many assessment tools often used to evaluate students' conceptual understanding of introductory concepts [23]. The goal of the present study was to evaluate TAs' knowledge of student alternate conceptions in electricity and magnetism as revealed by the CSEM. For each item on the CSEM, the TAs were asked to identify the most common incorrect answer choice (MCI) selected by students after traditional instruction. This exercise was followed by a class discussion with the TAs related to this task, including the importance of knowing student difficulties and addressing them effectively in order for learning to be meaningful. We have found that this type of activity in a TA professional development course engenders a rich discussion about student difficulties and promotes the importance of thinking about their difficulties from students' perspective in order to bridge the gap between teaching and learning. More information about potential uses of this type of activity in TA professional development is provided in Sec. IV.

B. Pedagogical content knowledge
There are several theoretical frameworks that inspire our research. These theoretical frameworks emphasize the importance of instructors familiarizing themselves with students' common difficulties in order to scaffold their learning with appropriately designed curricula and pedagogies. In the context of this study, they point to the importance of being knowledgeable about student difficulties in order to help students learn better. For example, Piaget [24] emphasized "optimal mismatch" between student ideas and instructional design for cognitive conflict and desired assimilation and accommodation of knowledge, and others have put forth similar ideas [25]. Being knowledgeable about student alternative conceptions related to a particular topic and using them as resources in instructional design is one aspect of what Shulman defined as pedagogical content knowledge (PCK) [21,22]. Shulman defines PCK as the subject matter knowledge for teaching. In other words, PCK is a form of practical knowledge used by experts to guide their pedagogical practices in highly contextualized settings. Shulman writes, "Within the category of pedagogical content knowledge, I include […] the most useful forms of representation of those ideas, the most powerful analogies, illustrations, examples, explanations, and demonstrations — in a word, the ways of representing and formulating the subject that make it comprehensible to others." In addition, according to Shulman, PCK also includes "an understanding of what makes the learning of specific topics difficult," or, in other words, knowledge of the common difficulties that students have in learning particular topics.
Shulman's thinking about PCK was informed by his previous research on the reasoning processes of physicians [26], which he found to be domain specific, contrary to the general assumption that certain physicians possess a general trait of diagnostic acumen which makes them better diagnosticians than others. Shulman generalized this observation to conclude that good teachers not only possess domain specific knowledge, but also possess more practical knowledge about teaching that is domain specific (i.e., PCK). Shulman therefore encouraged research on teachers' PCK and the types of teacher preparation programs that are likely to improve and/or develop teachers' PCK. Since Shulman introduced the concept of PCK, much has been written about it [27–…]. One line of work identifies "four general areas of teacher knowledge [which are] the cornerstones of the emerging work on professional knowledge for teaching: general pedagogical knowledge, subject matter knowledge, pedagogical content knowledge, and knowledge of context" and argues that PCK (as opposed to subject matter knowledge) generally has the greatest impact on teachers' classroom activities. Others have also stressed the importance of PCK in shaping instructional practice and discuss professional development programs which take PCK into account [36,37]. For example, Borko and Putnam [37] describe the Cognitively Guided Instruction Project, a multiyear program of curriculum development, professional development, and research that has shown "powerful evidence that experienced teachers' pedagogical content knowledge and pedagogical content beliefs can be affected by professional development programmes." Given the importance of PCK in shaping instructional practices, it is not surprising that researchers have attempted to document teachers' PCK [31,33,34] and others have attempted to document the development of teachers' PCK [35,38].
However, these tasks are challenging to carry out for multiple reasons, such as the fact that much of the knowledge teachers have of their practice is tacit [39,40], or the fact that although there is a general consensus among researchers on PCK as a construct, its boundaries are not clearly delineated [41]. Also, extended observations are needed in order to recognize when teachers' PCK is instantiated in their practice [31]. To overcome some of these challenges, researchers have often used multimethod approaches to investigate teachers' PCK. For example, observational data are not sufficient because a teacher may use only a small portion of the representations they have at their disposal. In addition, observations do not provide insight into teachers' instructional decisions: we see what they are doing, but do not know why. Partly due to these issues, Loughran et al. [31] used both classroom observations and follow-up interviewing of teachers. The interviews encouraged teachers to articulate their knowledge and explored alternative representations that the teachers did not use during the teaching sessions. This investigative approach is quite time consuming to both carry out and analyze, since both the observations and interviews provide lengthy qualitative data which require coding and analysis. Baxter and Lederman [42] provide a review of methods and techniques for studying PCK and the subject matter knowledge of teachers.

Partly due to all of the difficulties in carrying out an involved investigation of PCK, we developed a relatively straightforward method for delving into one particular aspect of PCK, namely, knowledge of student difficulties with particular topics. This method makes use of standardized conceptual multiple-choice tests developed by physics education researchers and quantitative data from students taking these tests.
Teachers are provided with a copy of a particular test (e.g., the CSEM), and for each item on the test they are asked to select what they expect would be the MCI selected by students after traditional instruction in a relevant topic. Then, quantitative student data are used to quantify the extent to which teachers are knowledgeable about common student difficulties revealed by students' MCIs after traditional instruction. Previous research with K-12 teachers [20] found that on items which have a strong distractor (i.e., a common MCI), there is a large difference in learning gains between students taught by teachers who could identify the alternate conceptions and students taught by teachers who could not. Therefore, it is valuable to explore the extent to which teachers are knowledgeable about student alternate conceptions on items drawn from well-designed standardized tests.

Two prior research studies conducted with TAs using the method described in the preceding paragraph used the Force Concept Inventory (FCI) [43,44] and the Test of Understanding Graphs in Kinematics (TUG-K) [45,46]. The main findings from these studies are as follows:
• TAs were able to identify student MCIs in certain contexts, but struggled to identify them in other contexts. For example, for the FCI, 84% of the TAs identified students' alternate conception related to Newton's third law in the typical context (car colliding with truck), but only 40% of the TAs identified it in a less typical context (car pushing truck and speeding up).
• TAs sometimes expected certain answer choices to be the MCIs, when instead those answer choices were selected by very few students.
• Think-aloud interviews with TAs engaged in the task of determining students' MCIs suggested that the TAs were reflective and often had reasonable thoughts regarding how students may be reasoning about the questions. Interviews also suggested that the TAs were sometimes distracted by certain answer choices that […] knowledge of MCIs can be useful in designing effective professional development programs.

II. METHODOLOGY

A. Participants
The participants in this study were 81 first-year graduate students (three separate cohorts) enrolled in a semester-long mandatory pedagogy-oriented TA professional development course at Pitt, which meets once a week for two hours. The graduate student population at Pitt is consistent with that of a typical research-based state university. The TAs teach recitations and labs, typically in a traditional manner. In the recitations, the TAs primarily answer student questions, solve problems on the board, and give students a quiz in the last 10–20 min. In the labs, the TAs start by demonstrating the procedures needed for that lab and the students closely follow the detailed procedures written in the lab manual.

Since this is the first and last pedagogy-oriented semester-long course most physics graduate TAs at Pitt will ever take, it is designed to help them become more effective teachers in general. During the course, they get a general overview of cognitive research and PER during one two-hour session and discuss with each other and reflect upon their instructional implications. The TAs are also introduced to curricula and pedagogies based on PER which emphasize the importance of being knowledgeable about students' difficulties in order to help them develop expertise in physics. Each week, TAs complete various reflective exercises designed to help them perform their TA duties in a student-centered manner. For example, in one class, they discuss how to write effective problem solutions for physics classes and what features should be included in solutions they hand out to students and why [4–…]. […] students' difficulties directly in their interactions with students.

In addition to the quantitative study, we conducted think-aloud interviews [48] with 11 TAs. Because of the availability of the TAs for individual interviews, four of the interviewed TAs had participated in the quantitative study (they were in the TA professional development course in which the quantitative study was carried out) but seven were not. We also note that for the TAs who participated in the quantitative study, at least one year had passed before they were interviewed. Thus, the questions in the CSEM PCK task carried out in the TA professional development course were not fresh in their mind at the time of the interviews. Each of the 11 TAs had at least one semester of teaching experience in recitations.
We did not find any qualitative differences in the reasoning of the TAs whether or not they had participated in the quantitative study earlier. More details about the interviews are provided in Sec. II C.
The materials used in this study are the CSEM, which was given to the TAs in the TA professional development course as explained below; the post-instruction data from algebra-based students (printed in Ref. [23]), which were collected over a period of four years from an average of 388 students; and the quantitative data and the interview data obtained from the TAs. These data were used to determine students' MCIs on each item on the CSEM, to assess TA knowledge of student alternate conceptions, and to understand the reasoning TAs use when selecting certain incorrect answers as the MCIs.
C. Methods
In the quantitative study, the TAs were provided with the CSEM and, for each item on the CSEM, they were asked to identify the incorrect answer choice most commonly selected by students after traditional instruction if they did not know the correct answer (rather than before instruction), because it was considered more important for TAs to be aware of alternate conceptions that persist after traditional instruction. Also, our previous research suggests that both TAs and instructors [1,44] are often reluctant to contemplate students' conceptions before instruction. We note that it does not make a significant difference whether the question is phrased about students' difficulties with each question in the post-test or pretest, because the MCIs of students rarely changed after traditional instruction [23]. Also, an analysis of the pre- and post-test data in Ref. [23] for each item on the CSEM suggests that the percentage of students who had a certain alternate conception either decreased after instruction or remained roughly the same. Since we asked the TAs to identify the alternate conceptions after traditional instruction, we performed data analysis using the post-test data in Ref. [23].

In years two and three of the study, the researchers also asked TAs to predict the percentage of students who would answer each question on the CSEM correctly in a post-test after traditional instruction in relevant concepts. We investigated TA data from each year separately and found very few differences between the different years. Therefore, all the data were combined (for TAs' predictions of the percentage of students answering each question correctly, only years two and three were combined because this question was not asked during the first year). Each year, after the TAs completed the CSEM-related PCK task, there was a full class discussion about the tasks and why knowledge of common student difficulties is critical for teaching and learning to be effective in general.
The TAs were not prompted to explain their reasoning for their choices in written responses, but in the class discussion, certain items on the CSEM were discussed in detail and TAs discussed their reasoning for why they expected certain incorrect answer choices to be MCIs of students.

In order to obtain an in-depth account of TAs' reasoning (related to why they expected certain answer choices to be the MCIs), think-aloud interviews were conducted with 11 TAs. Certain questions were selected from the CSEM, and the TAs were asked to think aloud and (i) identify the correct answer and (ii) determine the MCI of introductory students for each question. They were not disturbed during this time unless they became quiet for a long time, in which case they were asked to keep talking. After discussing all of the questions selected by the interviewer, if time permitted, the TA was sometimes asked to look back at some of the questions and provide more details about why they expected a particular incorrect answer choice to be the MCI, if their reasoning was not clear enough when thinking aloud without being disturbed. The main goal of the interviews was to identify possible reasons why TAs expected that certain answer choices would be the MCIs when in fact those answer choices were not common. Thus, the quantitative data collected were used to identify questions in which this may be occurring, and the interviews focused on those questions. For example, on Q2 on the CSEM, roughly half the TAs expected that choice D would be the MCI, but this answer choice was selected by only 11% of students (see Table I).

In order to obtain a quantitative measure of TAs' performance at identifying the alternate conceptions of students, scores were assigned to each TA. A TA who selected a particular incorrect answer choice as the MCI in a particular question received a PCK score equal to the fraction of students who selected that particular incorrect answer choice.
If a TA selected the correct answer choice as the MCI (a rare occurrence), their data were removed only for that specific question, because they were explicitly asked to indicate the incorrect answer choice most commonly selected by students if they did not know the correct answer after traditional instruction in relevant concepts. For example, on question 1, the percentages of algebra-based students who selected A, B, C, D, and E are 4%, 63%, 23%, 7%, and 3%, respectively (as shown in Table I). Answer choice B is correct; thus, the PCK scores assigned to TAs for answer choices A, B, C, D, and E, if selected as the MCI, would be 0.04, 0, 0.23, 0.07, and 0.03, respectively. The total PCK score a TA would obtain on the task for the entire CSEM can be obtained by summing over all of the questions (this is referred to as the "CSEM-related PCK score"). More details on how this was done are provided in the Supplemental Material [49].

We note that the approach used to determine the CSEM-related PCK score weighs the responses of TAs by the fraction of students who selected a particular incorrect response. This weighting scheme was chosen because the more prevalent a student difficulty is, the more important it is for a TA to be aware of it and take it into account in their instruction. Furthermore, this approach also provides a reasonable PCK score when there is more than one common alternate conception. Consider, for example, a question with two incorrect answer choices that are commonly selected by students, such as Q29, in which 26% of students selected A and 23% selected B (both incorrect). If all the TAs selected choice A as the MCI, their PCK score would be 100%, but if half the TAs selected A and half selected B, their PCK score would be 92.5%.

It is important to clarify that the PCK score is only one metric of TAs' performance at identifying students' MCIs.
In order to get a clear picture of TAs' performance, one needs to look at the percentages of TAs who selected each incorrect answer choice as well as how common those answer choices were.
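The per-question scoring described above can be sketched in code. This is a minimal reading of the description, not the authors' analysis code: the raw score follows the question 1 example (the fraction of students who chose the TA's predicted distractor), while the normalized variant, which divides by the MCI fraction so that identifying the MCI scores 100%, is our assumption made to be consistent with the Q29 example; the exact bookkeeping is in the Supplemental Material [49]. Function and variable names are ours.

```python
# Hedged sketch of the CSEM-related PCK scoring for a single item.

def raw_pck_score(student_fractions, correct_choice, ta_choice):
    """Fraction of students who selected the TA's predicted distractor.

    student_fractions: dict mapping answer choice -> fraction of students
    correct_choice: the correct answer for this item
    ta_choice: the answer the TA predicted to be the MCI
    Returns None if the TA picked the correct answer (per the paper,
    such responses were excluded for that question).
    """
    if ta_choice == correct_choice:
        return None
    return student_fractions[ta_choice]


def normalized_pck_score(student_fractions, correct_choice, ta_choice):
    """Raw score divided by the MCI fraction (our assumption), so that
    identifying the MCI yields 100% for that item."""
    raw = raw_pck_score(student_fractions, correct_choice, ta_choice)
    if raw is None:
        return None
    mci = max(f for c, f in student_fractions.items() if c != correct_choice)
    return raw / mci


# Question 1 from Table I: A-E selected by 4%, 63%, 23%, 7%, 3%; B is correct.
q1 = {"A": 0.04, "B": 0.63, "C": 0.23, "D": 0.07, "E": 0.03}
print(raw_pck_score(q1, "B", "C"))         # 0.23 (TA identified the MCI)
print(normalized_pck_score(q1, "B", "C"))  # 1.0
```

A TA's total CSEM-related PCK score would then be an aggregate of the per-question scores over all items on the test.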
In the quantitative study, the TAs were provided with theCSEM and, for each item on the CSEM, they were asked toKARIM, MARIES, and SINGH PHYS. REV. PHYS. EDUC. RES.14, after traditional instruction if they did not know thecorrect answer (rather than before instruction), because itwas considered that it is more important for TAs to beaware of alternate conceptions that persist after traditionalinstruction. Also, our previous research suggests that bothTAs and instructors [1,44] are often reluctant to contem-plate students ’ conceptions before instruction. We note thatit does not make a significant difference whether thequestion is phrased about students ’ difficulties with eachquestion in the post-test or pretest because the MCIs ofstudents rarely changed after traditional instruction [23].Also, an analysis of the pre- and post-test data in Ref. [23]for each item on the CSEM suggests that the percentage ofstudents who had a certain alternate conception eitherdecreased after instruction or remained roughly the same.Since we asked the TAs to identify the alternate concep-tions after traditional instruction, we performed dataanalysis using the post-test data in Ref. [23].In years two and three of the study, the researchers alsoasked TAs to predict the percentage of students who wouldanswer each question on the CSEM correctly in a post-testafter traditional instruction in relevant concepts. We inves-tigated TA data from each year separately and found veryfew differences between the different years. Therefore, allthe data were combined (for TAs ’ predictions on thepercentage of students answering each question correctly,only years two and three were combined because thisquestion was not asked during the first year). Each year,after the TAs completed the CSEM-related PCK task, therewas a full class discussion about the tasks and whyknowledge of common student difficulties is critical forteaching and learning to be effective in general. 
The TAswere not prompted to explain their reasoning for theirchoices in written responses, but in the class discussion,certain items on the CSEM were discussed in detail andTAs discussed their reasoning for why they expectedcertain incorrect answer choices to be MCIs of students.In order to obtain an in-depth account of TAs ’ reasoning(related to why they expected certain answer choices to bethe MCIs), think-aloud interviews were conducted with 11TAs. Certain questions were selected from the CSEM theTAs were asked to think aloud and (i) identify the correctanswer, and (ii) determine the MCI of introductory studentsfor each question. They were not disturbed during this timeunless they became quiet for a long time in which casethey were asked to keep talking. After discussing all ofthe questions selected by the interviewer, if time permitted,the TA was sometimes asked to look back at some ofthe questions and provide more details about why they expected a particular incorrect answer choice to be theMCI if their reasoning was not clear enough when thinkingaloud without being disturbed. The main goal of theinterviews was to identify possible reasons why TAsexpected that certain answer choices would be the MCIswhen in fact those answer choices were not common.Thus, the quantitative data collected were used to identifyquestions in which this may be occurring and the interviewsfocused on those questions. For example, on Q2 on theCSEM, roughly half the TAs expected that choice D wouldbe the MCI, but this answer choice was only selected by11% of students (see Table I).In order to obtain a quantitative measure of TAs ’ performance at identifying the alternate conceptions ofstudents, scores were assigned to each TA. A TA whoselected a particular incorrect answer choice as the MCI ina particular question received a PCK score which was equalto the fraction of students who selected that particularincorrect answer choice. 
If a TA selected the correct answerchoice as the MCI (a rare occurrence), their data wereremoved only for that specific question because they wereexplicitly asked to indicate the incorrect answer choicewhich is most commonly selected by students if they didnot know the correct answer after traditional instruction inrelevant concepts. For example, on question 1, the per-centages of algebra-based students who selected A, B, C,D, and E are 4%, 63%, 23%, 7%, and 3%, respectively(as shown in Table I). Answer choice B is correct, thus, thePCK score assigned to TAs for each answer choice if theyselected it as the MCI would be 0.04, 0, 0.23, 0.07, and0.03 (A, B, C, D, and E). The total PCK score a TA wouldobtain on the task for the entire CSEM can be obtained bysumming over all of the questions (this is referred to as “ CSEM-related PCK score ” ). More details on how this wasdone are provided in the Supplemental Material [49].We note that the approach used to determine the CSEM-related PCK score weighs the responses of TAs by thefraction of students who selected a particular incorrectresponse. This weighting scheme was chosen because themore prevalent a student difficulty is, the more important itis for a TA to be aware of it and take it into account intheir instruction. Furthermore, this approach also providesa reasonable PCK score when there is more than onecommon alternate conception. For example, if a questionhas two incorrect answer choices that are commonlyselected by students, e.g., Q29 in which 26% of studentsselected A and 23% selected B (both incorrect). If all theTAs selected choice A as the MCI, their PCK score wouldbe 100%, but if half the TAs selected A and half select B,their PCK score would be 92.5%.It is important to clarify that PCK score is only one metricof TAs ’ performance at identifying students ’ MCIs. 
In order to get a clear picture of TAs' performance, one needs to look at the percentages of TAs who selected each incorrect answer choice as well as how common those answer choices were.

EXPLORING ONE ASPECT OF PEDAGOGICAL … PHYS. REV. PHYS. EDUC. RES. 14,
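The weighting just described is simple enough to sketch in code. The following is our illustration only: the function and variable names are ours, and the normalization behind the percentages reported above is detailed in the Supplemental Material [49].

```python
# Sketch of the PCK weighting described above: a TA who selects
# distractor X as the MCI earns the fraction of students who actually
# selected X; a TA who selects the correct answer is excluded for
# that question (returned as None here).

def pck_score(student_fractions, correct_choice, ta_choice):
    """student_fractions maps each answer choice to the fraction of
    students who selected it on the post-test (as in Table I)."""
    if ta_choice == correct_choice:
        return None  # data removed for this question
    return student_fractions.get(ta_choice, 0.0)

# Question 1 from Table I (choice B is correct):
q1 = {"A": 0.04, "B": 0.63, "C": 0.23, "D": 0.07, "E": 0.03}

print(pck_score(q1, "B", "C"))  # 0.23 -- the most common distractor
print(pck_score(q1, "B", "D"))  # 0.07
```

Summing such per-question scores over all CSEM items gives the CSEM-related PCK score discussed in the text.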
D. Research goal: How well do TAs predict students' responses to the CSEM after instruction?

In order to investigate this research goal, we analyzed data pertaining to the following:
• Alternate conceptions which many TAs expected to be common but which were instead rare among students.
• Alternate conceptions which were common among students but were not identified very well by TAs.
• Alternate conceptions that were common among students which the majority of TAs were able to identify.
• Qualitative results from detailed think-aloud interviews with 11 TAs that focused on what common alternate conceptions TAs expected and why.
TABLE I. Questions on the CSEM, percentages of algebra-based students who answered the questions correctly in a post-test, percentages of students who selected each incorrect answer choice ranked from most to least common, the percentage of TAs who selected each incorrect answer choice as the MCI, and average PCK score. To help make the table easier to interpret, answer choices selected by 20% or more students are written in red font. The same answer choices are also written in red font for the TAs. Note that the data were taken from Ref. [23] and the number of students who answered each question varies from 158 to 444. With the exception of four questions, all questions were answered by more than 350 students in the post-test.
KARIM, MARIES, and SINGH PHYS. REV. PHYS. EDUC. RES.14,
• The extent to which TAs were able to predict the difficulty of the questions.

We note the following about the interviews: in general, during the interviews, the TAs were reflective and sometimes thought back to when they were teaching recitations themselves. In some of the questions, they were able to identify the MCI and had good ideas about the common difficulties of students. However, an important goal of the interviews was to identify the reasoning the TAs commonly use when they select answer choices that were not very common among students. Therefore, the discussion focusing on this aspect in a particular question should not be taken as an indication that the interviewed TAs did a poor job at identifying common alternate conceptions of students on those questions.
III. RESULTS
We note that the MCIs of students are similar for both algebra-based and calculus-based classes (see Ref. [23]). Therefore, the researchers performed the analysis of the CSEM-related PCK performance with the student data from algebra-based classes in Ref. [23], as discussed below.
A. Performance of TAs in identifying students' alternate conceptions related to the CSEM

Many questions on the CSEM reveal a common student alternate conception [23]. Analysis of the CSEM-related PCK score of the TAs was conducted on each of these questions and the results are displayed in Table I. Table I shows all CSEM items, the percentages of students who answered each question correctly, the percentages of students who selected each incorrect answer choice ranked from most to least common, the incorrect answer choices most commonly selected by TAs (as the MCIs), and the percentages of TAs who selected these answer choices. Correct answers are indicated by the green shading in Table I, and incorrect answer choices selected by 20% or more students are indicated by the red font. Table I also shows the average CSEM-related PCK scores of the TAs.
B. Results relevant to the research goal
In this section, we group questions together based on the concepts involved. We discuss questions in which few TAs identified an MCI as well as questions in which the TAs performed quite well in identifying the MCI.
1. Charge distribution on conductors and insulators (Q1, Q2)
Q1 and Q2 ask about what happens to an excess charge placed at some point P on a conducting (Q1) or insulating (Q2) hollow sphere. For Q1, students' MCI was that the charge distributes everywhere on the inside and outside of the metal sphere (choice C, roughly one-fourth of the students). This implies that students may be thinking that the positive charges spread as far from each other as possible [23]. More than half of the TAs identified this choice, showing that they are aware that students may struggle with the fact that charges on a metallic sphere are distributed only on the outer surface in equilibrium. On Q2, students had two alternate conceptions: that the charge distributes itself everywhere on the outside of the sphere, i.e., not distinguishing between insulating and conducting (choice B), and that there will be no charge left (choice E); roughly one-fifth of the students selected each. On both Q1 and Q2, many TAs expected that the MCI is choice D, namely, that most of the charge is at point P but some of it will spread over the sphere. On Q1, some of the TAs reasoned that choice D would be the MCI because students would expect that the charges would move, but that there is not enough force to move all the charges everywhere around the sphere, or that it takes more than a few seconds for the charge to spread everywhere and therefore some will remain at point P. For example, one interviewed TA stated: "They [students] don't expect that for a metal [there is] enough push in order to move all the charges from that point [P]." Another interviewed TA motivated his selection of choice D as the MCI by stating: "Most people probably think it's D […] because they might not recognize that it has to be an instantaneous distribution of charge. So, they recognize that the charge will have to spread over the surface, and since we know it's metal, I'm assuming they understand a conductor won't have charge on the inside. It [charge] is all gonna be on the surface, but they might assume that the majority of the charge hasn't fully distributed yet." On Q2, TAs' most common reasoning for selecting choice D was that it was the incorrect answer choice most similar to the correct one and that students may have some understanding that an insulating sphere is different from a conducting sphere, but would not fully understand it. For example, one TA said: "If they understand this is insulating material [i.e., they do not miss this information when reading the question], they will choose D […] because they know something about insulating that it is not like the conducting, but they [may not know] that the charge will stay at the position [P]." This seems reasonable; however, it appears that few students selected this answer choice.
2. Coulomb's force law (Q3, Q4, Q5, Q6, Q7, Q8)

Q3, Q4, and Q5 are related and shown in Fig. 1. On Q3, three-fourths of the students realize that the force on the +Q charge should increase by a factor of 4. On Q3 and Q6, the largest percentage of students who selected an incorrect answer choice was 13%, so it does not seem like there are persistent alternate conceptions on these questions. On Q4, the students' MCI (choice C) was that when one of the charges is changed from +Q to +4Q, the magnitude of the force on it remains the same, F (instead of increasing by a factor of 4 to 4F). This suggests that they might have the alternate conception that the electric force on a charge is only proportional to the charge that is applying the force [23]. Students may also not recognize that Newton's third law applies (i.e., the electric force exerted on the +4Q charge by the +Q charge has the same magnitude as the electric force exerted on the +Q charge by the +4Q charge). This alternate conception was selected by 57% of the TAs.

Q5 asks students what happens to the magnitude of the force when the charges are moved to be 3 times as far apart. One-fifth of the students who selected choice C on Q4 thought that the force will now decrease by a factor of 3 and selected choice B on Q5, while a smaller percentage (14%) thought that the force will decrease by a factor of 9 (correct thinking, but incorrect conclusion because the force on the +4Q charge is initially 4F, not F). In other words, the MCI is that when the two charges are moved three times as far apart, the force on them decreases by a factor of 3.
If the TAs are aware that this is the MCI, then among the TAs who selected choice C on Q4, many of them should select choice B on Q5. However, while 57% of the TAs selected choice C on Q4, only one-sixth of the TAs identified choice B as the MCI on Q5, and one-third selected choice A, possibly because choice A is a combination of a correct idea (force decreases by a factor of 9) and an incorrect one (the force on the +4Q charge before increasing the distance between the two charges is F).

On Q5, roughly one-fourth to one-fifth of the students selected option D and option B, each. The students who selected either of these two options are likely to think that the electric force is inversely proportional to the distance (instead of the distance squared), so that when the separation between two charges is tripled, the force between them decreases by a factor of 3. So, if a student answers 4F/3 on Q5, they probably thought that the force decreased by a factor of 3 and that the original force was 4F (the answer to Q4). If instead a student answers F/3 on Q5, that student probably thought that the original force was F. Half of the TAs identified option D as the MCI, whereas less than one-sixth of the TAs selected option B. On Q4 and Q5, the TAs identified the MCIs quite well.

On Q7, there is one strong distractor (choice C, more than 40% of students); another choice (A) is selected by one-fifth of the students. Nearly identical percentages of TAs (between 40% and 45%) selected choices C and A, respectively, as the MCI. However, choice C is much more common than choice A among students. Q8 provides students with the two situations depicted in Fig. 2 and states that, in the configuration on the left, charges q1 and q2 are positive and that the net force acting on q3 as the result of its interaction with the two charges points in the positive x direction (to the right). The question asks what happens to the force acting on q3 when another positive charge (+Q) is placed at the location shown in the configuration on the right.
The MCI (choice D), selected by roughly one-fifth of students, is that the force will increase and its direction may change due to the interaction between Q and charges q1 and q2. While 43% of the TAs selected this as the MCI, more than one-fourth of the TAs selected choice E, which states that the answer cannot be determined without knowing the magnitude of q3 and/or Q. However, this answer choice was selected by less than 10% of students. In interviews, some of the TAs also selected choice E as the MCI. One interviewed TA, for example, motivated selecting choice E by stating: "I think most of them [students] will go with E […] because they might think that F is kq1q2 divided by r [squared] and then they think, 'ok, nothing is [given], q is not [given], r is not [given]', then they cannot decide [what happens to] the force." It appears that some of the TAs think that students may remember the equation for the electric force acting between two charges, but since none of the information is explicitly given (i.e., by providing values for the charges and distances), the electric force cannot be calculated, so the question cannot be answered. However, it appears that very few students may be reasoning this way, since less than 10% selected this answer choice.
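The scalings at issue in Q3-Q5 can be summarized with Coulomb's law. This is our illustration, taking the two charges as +Q and +4Q separated by a distance d, consistent with the factor-of-4 statements above:

```latex
F = \frac{kQ^2}{d^2},
\qquad
F' = \frac{k(4Q)(Q)}{d^2} = 4F \;\;\text{(one charge increased fourfold)},
\qquad
F'' = \frac{k(4Q)(Q)}{(3d)^2} = \frac{4F}{9} \;\;\text{(separation tripled)}.
```

The common incorrect answers on Q5 amount to replacing $r^2$ by $r$ in the denominator, which turns the correct factor of $1/9$ into $1/3$.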
FIG. 1. Questions 3, 4, and 5 on the CSEM.

FIG. 2. Figure provided for Q8 on the CSEM.
3. Connection between electric field and electric force (Q10, Q12, Q15)
Q10 states that a positive charge is released from rest in a uniform electric field and asks about its subsequent motion. The two MCIs are that the charge remains at rest (choice E, selected by one-fourth of the students and less than 10% of TAs) and that it will move at constant velocity (choice B, selected by one-fifth of the students). In the interviews, TAs were explicitly asked whether they expected that students would harbor the alternate conception of choosing choice E. Nearly all the interviewed TAs said that it is unlikely that students do not know that charges placed in an electric field would move, and thus, the interviews highlighted how challenging it is for TAs to identify this alternate conception.

Answer choice A is similar to choice B except that it says that the charge moves at a constant speed instead of constant velocity, and only 6% of the students selected this answer choice. However, one-fourth of TAs selected it. During the interviews, some of the TAs who selected this answer choice did not seem to consider B very carefully. For example, one TA stated: "They might think it will go at constant speed because the field is uniform so the effect is constant throughout the path." However, a more common occurrence in the interviews was for TAs to consider both choices A and B as the MCIs and either say they were not sure which one is more common or that students would select among these two answer choices equally. It is possible that in the quantitative study conducted in the TA professional development course, TAs had similar considerations and some TAs opted for choice A while others opted for choice B as the MCI. However, as shown in Table I, far fewer students selected choice A compared to choice B.

There are two other questions in this grouping: Q12, which does not appear to have any common alternate conceptions (the largest percentage of students who selected an incorrect answer choice is 13%), and Q15 (shown in Fig. 3), on which the TAs performed well at identifying the alternate conceptions (91% of the TAs selected one of the two alternate conceptions on this question). However, on Q15, one-fourth of students expected that the electric force points directly towards the positive charge from which all the field lines originate (choice B), but only one-sixth of the TAs identified this as the MCI. This is likely due to another alternate conception common amongst more students (roughly one-third), namely, that the electric force points to the right, neglecting to incorporate the sign of the charge. The vast majority of the TAs (roughly three-fourths) identified this more common alternate conception.
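A short way to see why constant acceleration is the correct behavior on Q10 (our sketch of the standard argument):

```latex
\vec{F} = q\vec{E} = \text{constant}
\;\;\Rightarrow\;\;
\vec{a} = \frac{q\vec{E}}{m} = \text{constant},
```

so a positive charge released from rest speeds up uniformly along the field direction; it neither remains at rest (choice E) nor moves at constant speed or constant velocity (choices A and B).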
4. Induced charge and electric field or force (Q13, Q14)
Q13 and Q14 provide students with the diagrams shown in Fig. 4. In Q13, the sphere is hollow and conducting and has an excess positive charge on its surface. The question asks for the direction of the electric field at the center of the sphere. In Q14, the sphere is also hollow and conducting, but it has no excess charge, and the question asks about the forces acting on the two charges. On both of these questions, the students' MCI is to not recognize that the conducting sphere alters the electric field or forces. Thus, on Q13, roughly one-fourth of them selected choice A, on which the electric field is to the left, which does not incorporate the effect of the metal sphere on the electric field (as though the sphere is not present). Roughly half of the TAs selected option A as the MCI, thus suggesting that they are aware that students have difficulty recognizing how conducting objects respond to the external electric field (i.e., free charge moves in order to make the electric field inside the conductor zero).

FIG. 3. Q15 on the CSEM.

FIG. 4. Diagrams provided for Q13 (a) and Q14 (b) on the CSEM.

On Q14, roughly half of the TAs selected choice A, which says that the forces the two charges feel are the same (once again, as though the sphere does not affect the forces). On this question, roughly half the TAs selected other answer choices (B, C, and E), which combined were selected by only one-fourth of students. In the interviews, TAs who selected choice A as the MCI on Q13 usually did so because they expected that students would only think about the electric field caused by the +Q charge and ignore the metal sphere. For example, one TA who selected A said: "Maybe someone would say leftward because they think of the positive being the source so they think of it making a [field] line and the [field] line is going outward from the charge, and they think it's just going to go straight through the sphere." On the other hand, on Q14, this same TA said that students' MCI would be choice E because they may think that the charge distribution on the sphere affects the forces that the two charges are experiencing: "They might think that little q at the center of the sphere […] is feeling
… Q here might feel force from this guy [q] and all the surface charges [on the sphere]." Other interviewed TAs cited similar reasoning for selecting choice E.

On Q13, another TA selected choice A as the MCI and stated that students may ignore the effect of the sphere. When looking at Q14, this TA explicitly mentioned his previous answer and stated: "My thought is similar to the last one, to kind of just ignore the sphere. So, A, maybe." In other words, students may ignore the effect of the sphere and select A. But after noticing choice E, he changed his mind and went on to say the following: "I think E [may be the MCI] because they might realize that the sphere does do something to change things, so they think 'ok, I know [the forces would normally be] equal and opposite, but now there's a sphere here, so I don't know exactly how that works' [i.e., what the effect of the sphere is] so they'll just throw in something [i.e., include some effect due to the sphere], so E is that something." It appears that this TA was aware that students may be guided by similar incorrect thinking (conducting sphere will not have an effect) on Q14 as on Q13, but on Q14, selected the answer choice which incorporates a correct idea (conducting sphere has an effect) but is missing another idea in order to be fully correct. In many other questions, TAs often selected answer choices which fit this category. For example, as mentioned earlier, on Q1, some TAs thought that students would select choice D, which states that some of the charge does spread over the sphere, a partially correct answer. Similarly, on Q2, some TAs selected the same answer, which is partially correct because some of the charge does remain at point P. They also sometimes explicitly noted that they were selecting this answer choice as the MCI because it is the one that is most similar to the correct answer.
On Q10, many interviewed TAs considered answer choices A and B, stating they expected that students would be aware that the charge should move, but may not know that it moves with a constant acceleration (more examples will be discussed below). While sometimes using this strategy to identify the MCI may provide a reasonable answer choice (i.e., one that is fairly common among students), it often misled the TAs into selecting an answer choice that was not very common, as was the case on Q1, Q2, and Q14 (and other questions that are discussed below). On Q14, for example, this reasoning led some TAs to select choice E as the MCI. However, this answer choice was selected by less than one-seventh of students.

On Q14, one interviewed TA who selected choice B as the MCI noted that students may reason in the following way: "Inside the conductor there is no field. But they might think the sphere is shielding the field due to the inside charge also. So, everything is shielded and there is no force [i.e., neither +q nor +Q experience a force]." Other TAs who selected choice B used very similar reasoning. Similar to TAs' reasoning for selecting choice E discussed earlier, choice B also incorporates a partially correct idea: the metal sphere "protects" the charge inside from the effect of outside charges, which is partly why many interviewed TAs selected it as the MCI.
5. Connection between electric potential and electric field or force (Q16, Q18, Q19, Q20)
Q16 states that an electron is placed at a position on the x axis where the electric potential is equal to +V and asks about the subsequent motion of the electron. The students' responses are spread over the four incorrect choices almost evenly. Roughly one-fourth of students thought that the electron would move towards the right (the MCI), but this answer choice was the one least likely to be selected by the TAs (less than one-sixth selected it). One interviewed TA thought that the students would place the electron on the positive x axis and a positive charge at the origin of the coordinate axis (to give concreteness to the situation) and claim that the electron would move to the left.

Q18 relates to the three situations shown in Fig. 5 and asks students to compare the magnitude of the electric field at point B in all three cases. Here, more than one-fourth of students selected E, which states that the electric fields are equal. These students only considered that the equipotential line on which B lies is at 40 V and did not recognize that it is the change in electric potential (i.e., the gradient) that is related to the magnitude of the electric field rather than the potential itself. Just over half the TAs identified this difficulty.

Q19 asks students for the direction of the electric force acting on a positive charge if placed at point A or B in situation III. One-quarter of students selected choice B (right at point A and right at point B), possibly because "right" is the direction in which the electric potential increases and they expected that a positive charge would be pushed in that direction [23]. This alternate conception was identified by 61% of the TAs. On Q20, the TAs appear to be able to identify the alternate conceptions.
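The distinction students miss on Q18 and Q19 is that the electric field is set by how the potential changes, not by its value. In one dimension (a reminder of the standard relation):

```latex
E_x = -\frac{dV}{dx}, \qquad \vec{F} = q\vec{E},
```

so equal potentials at point B in the three situations do not imply equal fields, and a positive charge is pushed toward decreasing potential, not toward increasing potential.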
6. Work or electric potential energy (Q11, Q17)
Q11 asks what happens to the electric potential energy of a positive charge after being released from rest in a uniform electric field. Students' MCI is that it remains constant.
FIG. 5. Diagram provided for Q17-Q19 on the CSEM.
One interviewed TA explained why students may instead think that the potential energy increases: "It (the charge) has an acceleration and velocity is increasing right? So, they [students] may think that the potential [energy] should increase because velocity is increasing." Another interviewed TA who selected choice C as the MCI selected it for a very similar reason.

On Q17, students are asked to compare the work needed to move a positive charge from point A to point B in three different situations (shown in Fig. 5). More than one-fifth of the students answered that the most work is done when moving the charge in situation III. These students likely thought that the work is maximum in situation III because the distance over which the charge is moved is largest, and did not consider the potential difference between the two points. Many TAs (63%) identified this alternate conception. On Q17, 81% of the TAs selected one of the two MCIs; thus, they performed well at identifying the alternate conceptions on this question.
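The energy bookkeeping behind Q11 is worth making explicit (our sketch, for a positive charge released from rest and displaced a distance d along a uniform field E):

```latex
W = qEd = \Delta K > 0 \quad\Rightarrow\quad \Delta U = -W < 0,
```

so the potential energy decreases as the kinetic energy increases; it neither remains constant nor grows with the speed.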
7. Force on or motion of a charged particle in a magnetic field (Q21, Q22, Q25, Q27)
Q21 asks what happens to a positive charge that is placed at rest in a magnetic field. Students' MCI is that the charge moves in a circle at constant speed (choice C, selected by roughly one-fifth of students). On this question, many TAs thought that students may confuse electric and magnetic fields and thereby conclude that the charge moves with constant acceleration (choice B, selected by 31% of TAs), but this answer choice is very rarely selected by students (less than 10% of them selected this choice). For example, one interviewed TA stated, "I can see people confusing or essentially just ignoring that it's a magnetic field thinking that it should do the same thing as it does in an electric field, so, constant acceleration. Yea, that would be my guess, B; they would think that it would do the same thing it does in an electric field."

Q22 provides the diagram shown in Fig. 6 and asks for the direction of the magnetic field responsible for making the electron path curve in the way shown. More than one-fourth of the students selected "into the page," which would be correct if the electron were positively charged, and 59% of the TAs identified this difficulty. Also, 22% of the students selected upward, suggesting that they may think that the direction of the magnetic force is the same as the direction of the magnetic field; less than one-fourth of the TAs identified this alternate conception.

Q25 provides the three situations shown in Fig. 7 of a positive charge moving in an external magnetic field and asks students to rank them according to the magnitude of the magnetic force.
TAs' selections are quite varied, with a significant percentage of them opting for each incorrect answer choice, suggesting they had difficulty identifying the MCIs. Interviews suggest that TAs struggled to identify the MCI, which is that the force is largest in situation II (where the charge moves "against" the magnetic field), least in situation III (where the charge moves "with" the magnetic field), and in between in situation I (choice C, selected by one-fifth of students). TAs had difficulty determining how students may reason about this question incorrectly. The TAs sometimes opted for choice A (same force in all situations) because they expected that some students may only recall qvB as the magnetic force on a charge moving in a magnetic field and thus conclude that the forces are equal in the three situations. If they did not select this answer choice, they usually started by stating that when the velocity and magnetic field are in the same direction, students may think that this leads to the largest force. For example, one TA stated: "They [students] are thinking 'oh, the magnetic field is pushing it along in this direction and it's already moving in that direction' so that's just compounding the effect (i.e., force is largest in situation III)." Other interviewed TAs reasoned in a similar way, but after concluding that students may think the force is largest in situation III, they had difficulty applying the same reasoning to situations I and II. They sometimes stated that for situation II, students may think that the acceleration is least because the charge is moving in a direction (partly) opposite to the magnetic field and conclude that the force is least in situation II (and select B). Other TAs stated that perhaps students are somehow thinking of the dot product instead of the cross product and conclude that choice E is the MCI.
Yet other TAs, after considering situation II, changed their minds because they thought that since the charge is moving against the magnetic field, students may think that the field is exerting the largest force. This was one of the questions on the CSEM which took the TAs the most time to answer (i.e., determine what they expected would be the MCI). One TA, after trying to figure it out for …
FIG. 6. Diagram provided for Q22 on the CSEM.
… alternate conception on this question.

Q27 shows a positive charge placed at rest near two magnets, the one on the left being three times stronger than the one on the right (see Fig. 8). It asks for the magnetic force acting on the charge and provides the answer choices shown in Fig. 8. On this question, the MCIs are choice A (∼one-fifth) and choice D (∼one-fourth). While 46% of the TAs selected choice A, only 12% of the TAs selected choice D, and one-fourth selected choice C, which was selected by less than 10% of students. One interviewed TA selected choice C because he expected students to think that the magnet on the left is pushing the charge towards the right and the magnet on the right is pushing the charge towards the left. When asked why he expected students to think this way, he stated that he did not know how to explain it; it was just his gut feeling based on his experience teaching recitations.
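The rankings that students and TAs wrestled with on Q25 are governed by the cross product in the magnetic force law (the standard relation, stated here for reference):

```latex
\vec{F} = q\,\vec{v}\times\vec{B}, \qquad |\vec{F}| = qvB\sin\theta,
```

so the force vanishes when the charge moves parallel or antiparallel to the field and is largest at $90^\circ$; moving "with" or "against" the field makes no difference to the magnitude.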
8. Magnetic field caused by a current (Q23, Q26, Q28)
On Q26 (shown in Fig. 9), students' MCI is that the magnetic field is radially outward from the wire (choice D, one-fifth of the students). On this question, roughly half of the TAs selected choice C, in which the direction of the magnetic field is opposite to the correct direction (i.e., clockwise instead of counterclockwise), but only 6% of students selected this answer choice. All the interviewed TAs who selected this answer choice essentially said that students may either use their left hand or use the right-hand rule incorrectly. However, the choices selected by students do not suggest this as a major difficulty.

On Q26, some interviewed TAs used similar reasoning as some of the TAs who selected choice E on Q14: students have some correct ideas (try to use the right-hand rule), but are not fully correct (obtain the incorrect direction). It is important to point out that after recognizing that students may be answering the question incorrectly for this reason (which does not seem to be common), the interviewed TAs did not consider all the other answer choices carefully and did not realize that students may have other alternate conceptions, namely, that the magnetic field would be radially outward from the wire (i.e., confusion between electric and magnetic fields). After the TAs answered all the other questions in the interview, they were often asked to return to this question and think about whether they expected that any students would select choice D (radially outward magnetic field). After being asked to consider this answer choice explicitly, they were often able to recognize the alternate conception guiding students to select choice D, and some interviewed TAs wanted to change their original answer. Similarly to Q14, some TAs attempted to identify common alternate conceptions on Q26 by arguing that students may have some correct ideas, but miss something that causes them to not have the fully correct answer. However, it appears that for this question (and others mentioned earlier), this type of reasoning from the TAs often steered them in the wrong direction and caused them to identify an answer choice that is not common among students while missing the MCI.

FIG. 7. Three situations and answer choices provided in Q25 on the CSEM.

FIG. 8. Physical situation and answers provided for Q27 on the CSEM.

FIG. 9. Diagram and answer choices for Q26 on the CSEM.

On Q28 (shown in Fig. 10), the loops carry currents of equal magnitude and the question asks for the direction of the magnetic field at point P. The MCI is that the field at point
P is zero (choice E), selected by roughly one-third of students. These students likely thought that the magnetic fields created by the two loops are in opposite directions and therefore cancel [23]. This alternate conception was identified by 55% of the TAs, but one-third of them also selected choice A (selected by less than 10% of students). Similarly to Q26 discussed above, all of the TAs who selected this answer choice during interviews claimed that students may use the right-hand rule incorrectly and obtain the incorrect direction; however, it appears that very few students do this.
9. Faraday's law (Q29, Q30, Q31, Q32)

Q29 asks students to identify all of the situations shown in Fig. 11 in which the light bulb is glowing. Roughly one-fourth of students only selected situations I and IV, in which there is relative motion between the magnet and the loop. These students did not recognize that in situation II, the magnetic flux is changing (because the area of the loop is changing) and therefore there will be an induced emf (electromotive force) in the loop (the light bulb glows). Roughly half the TAs identified this alternate conception. Furthermore, roughly one-fourth of the students also selected situation III (i.e., answered that the light bulb glows in situations I, III, and IV, choice A), sometimes due to overgeneralizing that there is an induced emf in any situation in which the loop is moving, while one-fourth of the TAs identified this alternate conception.

On Q29 and Q30, 77% and 81% of the TAs identified one of the two MCIs in each question, while on Q31, students seem to be randomly selecting from the four incorrect answer choices. Q32 is one of the most challenging questions on the CSEM (less than one-fifth of the students answered it correctly). The question and answers are shown in Fig. 12. On this question, the MCI is choice B (selected by 40% of students). The corresponding alternate conception is that the reading on the voltmeter opposes the reading on the ammeter (i.e., the reading on the ammeter increases, therefore the reading on the voltmeter decreases, and vice versa). The students may be trying to apply Lenz's law, but may not realize that the induced emf opposes the change in flux rather than the flux itself. It indicates that they have a lot of difficulty recognizing that the induced emf in the secondary coil is only nonzero when the current in the primary coil is changing. However, it appears that many TAs are unaware of this difficulty. Roughly 10% of TAs identified this alternate conception.
On the other hand, 31% of the TAs selected choice E, but only 1% of students selected this choice. In the interviews, one TA selected this choice …

FIG. 10. Diagram and answer choices for Q28 on the CSEM.

FIG. 11. Diagram provided for Q29 on the CSEM.

FIG. 12. Q32 on the CSEM.
10. To what extent are TAs able to predict the difficulty of the questions?
Figure 13 shows TAs' average predictions of the difficulty of each question on the CSEM, i.e., the percentage of students who answered each question correctly (TAs' predictions), as well as the actual difficulty of each question (national data in Ref. [23]). Figure 13 shows that the TAs underestimated the average difficulty of the majority of the questions on the CSEM. The discrepancy between TAs' predicted difficulty and the actual difficulty is quite large for some questions, in particular the questions that were most difficult for students (e.g., Q14, Q20, Q24, Q29, Q31, Q32). Figure 13 also shows that TAs' predicted difficulty does not fluctuate very much: with the exception of only five questions, the TAs' predicted difficulty is between 45% and 65% for all the questions on the CSEM, thus indicating that the TAs did not have a good sense of how difficult the questions are from the perspective of students. This conclusion is further supported by averaging TAs' predictions over all questions and comparing them to the actual average difficulty: TAs overpredicted students' performance on the CSEM by 15% on average.

IV. USING A PCK TASK AS A PEDAGOGICAL TOOL
Many TAs explicitly noted that the CSEM-related PCK task was challenging and that it was difficult for them to think about the difficulty of the physics questions from a student's perspective. In the think-aloud interviews, graduate students sometimes made comments which indicated that they found the task challenging (e.g., explicitly commenting "I don't know students well enough …"). However, many TAs noted that the CSEM-related PCK task was worthwhile and helped them think about the importance of putting themselves in their students' shoes in order for teaching and learning to be effective, especially after receiving data on how students actually performed and discussing particular student alternate conceptions.

Our interviews suggest that if such a task is used for TA professional development, it is best for teaching assistants to be explicitly told to first try to identify (and perhaps write down) what alternate conceptions or incorrect reasoning may lead students to select each of the incorrect answer choices before deciding which one is the MCI. In think-aloud interviews, we found that when TAs were explicitly prompted to consider all alternative answer choices and articulate why a student may select each, they were very reflective and often managed to identify the MCI. If there is not sufficient time for this process during the professional development activity, either the TAs can be asked to perform this task as homework before the discussion during the professional development activity, or the professional development leaders can select a subset of questions that would be most productive for discussion based upon the study described here. For example, our data suggest that Q2 would be a good question to discuss.
First, the TAs should be asked to identify the most common incorrect answer choice (after they are either provided with the correct answer or asked to identify it); our quantitative data suggest that most TAs will select D as the MCI (selected by only 11% of students), while other TAs will select B, which is a common alternate conception. After working on this task, the TAs could be asked to convince one another that their choice is actually the MCI of students, and the professional development leader can guide the discussion. Finally, the TAs could be shown the student data and asked to reflect upon it. There are many other questions on the CSEM in which a significant fraction of TAs selected an answer that is not common, while other TAs selected the MCI, e.g., questions Q4, Q7, Q8, Q11, Q14, Q17, Q21, Q24, Q25, Q26, Q27, Q28, Q32. These types of questions can be used in a similar manner to what is discussed above.

FIG. 13. Comparison of the percentages of correct answers predicted by TAs with algebra-based students' actual performance after traditional instruction as obtained from Ref. [23]. Standard deviations range between 17.7 and 24.6 and are not shown for clarity.
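Operationally, the MCI for a question is simply the incorrect answer choice with the highest selection rate in the post-test distribution. A minimal sketch of this bookkeeping (the distribution and the correct choice below are illustrative assumptions, not the published CSEM data):

```python
# Sketch: identifying the most common incorrect answer (MCI) from a
# question's response distribution. The percentages and the correct
# choice are illustrative assumptions, not the study's actual data.
def most_common_incorrect(distribution, correct):
    """Return the incorrect answer choice with the highest selection rate."""
    incorrect = {choice: pct for choice, pct in distribution.items() if choice != correct}
    return max(incorrect, key=incorrect.get)

# Hypothetical post-test distribution for a five-choice question,
# with "C" taken to be the correct answer for illustration.
q2_distribution = {"A": 5, "B": 30, "C": 45, "D": 11, "E": 9}
mci = most_common_incorrect(q2_distribution, correct="C")  # → "B"
```

In a professional development setting, the interesting comparison is between this data-derived MCI and the choice each TA predicted.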
In such a discussion, the professional development leader can, for example, prompt the TAs: "Now, let's think about which one is more common. Do you expect that C is more common, or D, or perhaps similar percentages of students select either choice? Discuss with each other and predict the percentage of students who select C or D on the post-test."

Furthermore, our qualitative data provide reasons the TAs sometimes use when selecting certain answer choices as the MCIs when those answer choices are not actually common among students. For example, on Q26, TAs often select the answer choice in which the direction of the magnetic field is opposite to the correct one, motivating their choice by saying that perhaps students will use the right-hand rule incorrectly, or use their left hand.
In a professional development class, after the TAs mention this reasoning, the professional development leader can ask them to think of any other incorrect reasoning students may use. If the TAs struggle, they can be directed to think about answer choice D, and our interviews suggest that the TAs will likely be able to figure out that students may think the magnetic field radiates outward (similar to an electric field). After this, they can be asked what they expect would be the most common incorrect reasoning used by students; again, our qualitative interviews suggest that the TAs will likely identify the latter incorrect reasoning (field radiates outward) as more common than the former (using the left hand, or using the right-hand rule incorrectly).

In addition, our research suggests that the TAs are usually thoughtful when thinking aloud about this PCK task; thus, it will be useful for the TAs to reflect upon this task with their peers during the TA development activity even if they did not manage to identify the MCIs of students very well.

We note that two of the authors (A. M. and C. S.) have been using tasks similar to the one described here in the professional development of TAs at their institutions and have found them to be very useful in setting the stage for a discussion on the importance of being aware of students' difficulties and alternate conceptions in order to design instruction to help students learn. The TAs discuss questions which have been carefully selected to engender productive discussions among TAs, as discussed above. The TAs are explicitly asked to identify and discuss with each other what reasoning students may use to select each incorrect answer choice before making a decision about which one is the MCI. Additionally, since only a subset of questions is selected, there is more time for the TAs to also spend predicting the difficulty of each question.
After the TAs complete the task, they are shown data from students, and some TAs explicitly express that it is very valuable for them to learn about the common student difficulties in concrete contexts. We found that TAs tend to trust student data more than statements like "research has found that …". The discussion is then focused on how TAs can identify common student difficulties related to various physics concepts, e.g., by listening to students reasoning about physics and coming up with guiding questions in real time to develop a grasp of how students are thinking in specific contexts. At one of the institutions (University of Cincinnati), the rest of the professional development program (which meets once a week for a semester) is focused on the tutorials students work on and their common difficulties on specific questions in the tutorials, as well as effective approaches the TAs can use to help students develop a coherent knowledge structure of those physics concepts. Using such tasks with actual data from students for TA professional development can be effective at other institutions as well.
V. DISCUSSION AND SUMMARY
Awareness of students' common difficulties and being able to understand how challenging certain concepts are for students are important aspects of pedagogical content knowledge. One can take advantage of knowledge of students' common difficulties and use them as resources to design effective pedagogical approaches to help students learn better [44,46,50]. Our investigation used the CSEM to evaluate this aspect of pedagogical content knowledge in the context of electricity and magnetism for 81 TAs who were all first-year physics graduate students enrolled in a TA professional development course. For each item on the CSEM, the TAs were asked to identify what they expect is the MCI of introductory students after traditional instruction. Additionally, in years two and three of the study, the TAs were also asked to estimate the difficulty of each question on the CSEM. In all three years there was an in-class discussion with the TAs related to the PCK task. Additionally, think-aloud interviews were conducted to obtain an in-depth account of what reasoning TAs use to arrive at the conclusion that certain alternate conceptions may be common.
1. General approach often used by the TAs to identify common incorrect answer choices of students
When trying to decide which answer choices would be common among students, TAs often selected answer choices which incorporate both correct and incorrect ideas. While this approach was sometimes productive in helping …
2. TAs struggled to identify alternate conceptions regarding how charge distributes on conductors and insulators
There are two questions on the CSEM that ask what happens to a charge placed at a particular point on a conducting or insulating sphere. For both questions, many TAs selected answer choices that were not common among students. On the question in which the sphere is insulating, nearly half the TAs expected that students would think that most of the charge remains where it was placed, but some spreads over the sphere. Interviews suggested that the TAs selected this answer choice because it is the choice most similar to the correct answer (the charge remains where it was placed), i.e., the TAs used the same strategy we described above in other contexts.
3. TAs struggled to identify alternate conceptions regarding the magnetic field caused by a current
On both questions related to the magnetic field caused by a current for which there was a common alternate conception, the TAs selected answer choices which are not at all common among students. On both questions, TAs often selected answer choices in which the right-hand rule was used incorrectly, but very few students selected those answer choices.
4. TAs struggled to identify alternate conceptions regarding the motion of or force on a charged particle in a magnetic field
Out of the four questions dealing with the concept of the Lorentz force (Q21, Q22, Q25, Q27), only on one of them (Q22) did the majority of TAs identify the MCIs. On the other ones, the TAs often selected answer choices that were not common. Also, Q25 was one of the most challenging questions for the TAs; in interviews they often spent a considerable amount of time trying to figure out how students may answer the question and sometimes even ended up essentially guessing, or committing to an answer only after being asked to select one.
5. Alternate conceptions held by very few students which TAs expected would be the MCIs
There were multiple instances in which TAs selected certain incorrect answer choices which they thought would be MCIs among students, but those answer choices were very rarely selected by students. Three such examples are presented in the preceding paragraphs, and there are many others. For example, on Q21, 31% of the TAs expected that students would confuse the magnetic field with an electric field and think that the charge will move with constant acceleration, but only 8% of students selected this answer choice; and on Q32, 31% of the TAs selected choice E, but only 1% of the students selected this option.
6. Alternate conceptions that the TAs were able to identify
The TAs performed reasonably well at identifying alternate conceptions related to Coulomb's force law (Q3–Q8), although there is room for improvement, especially on Q8. On Q3 and Q6, there are no strong alternate conceptions, and on Q4, Q5, and Q7 the majority of the TAs identified the common alternate conceptions. Q8 is the only one on which the TAs could improve significantly, and it is the only question of the group that has a complicated setup and asks students to compare two configurations side by side, one with three charges and the other with four. It is possible that TAs' lower performance in identifying the MCIs on this question was due to the setup being more complicated than those used in the other questions. TAs also performed reasonably well in identifying the alternate conception that the electric field inside a hollow metallic sphere due to an external charge is the same as it would be without the hollow metal sphere. In other words, TAs were aware that students have difficulty understanding that the inside of a metallic sphere is shielded from outside electric fields. On two other questions involving Faraday's law and Lenz's law (Q29 and Q30), TAs performed well in identifying students' alternate conceptions.
7. TAs' ability to predict the difficulty of the questions on the CSEM

Our results also suggest that the TAs typically underestimated the difficulty of the questions on the CSEM, especially on the challenging questions. For all but five questions on the CSEM, TAs' average predictions for the percentage of students who answer the questions correctly were between 45% and 65%, while the actual percentages varied much more widely, indicating that the TAs did not have a good sense of how difficult the questions are from a student's perspective.
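The comparison underlying this conclusion amounts to simple arithmetic on two per-question lists of percentages, predicted versus actual. A minimal sketch, where all the numbers are illustrative placeholders rather than the study's data:

```python
# Sketch of the predicted-vs-actual comparison (cf. Fig. 13).
# All percentages below are illustrative placeholders, not the study's data.
ta_predicted = {"Q14": 55, "Q20": 50, "Q24": 52, "Q29": 60, "Q31": 48, "Q32": 51}
actual       = {"Q14": 22, "Q20": 30, "Q24": 27, "Q29": 45, "Q31": 25, "Q32": 18}

# Per-question overprediction of student performance; a positive gap means
# the TAs underestimated how difficult the question was for students.
gaps = {q: ta_predicted[q] - actual[q] for q in ta_predicted}
mean_overprediction = sum(gaps.values()) / len(gaps)
```

Averaged over all 32 CSEM questions with the real data, this kind of gap came out to about 15 percentage points.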
8. Using a PCK task as a pedagogical tool
We have been using a PCK task as a pedagogical tool in our semester-long professional development programs, and the data collected in this study (as well as in our earlier studies of PCK) have helped us design effective discussions about the importance of being knowledgeable of student difficulties. Certain questions are selected for different reasons. For example, questions on which the quantitative data suggest that TAs identify the MCIs can be used to build confidence and help TAs recognize that they are knowledgeable of certain ways in which students reason. For other questions, the quantitative data suggest that TAs select two or more answer choices as most common, and these questions can lead to productive discussions as the TAs try to convince one another that a particular answer choice is more common than another, or that two answer choices are likely going to be selected by similar percentages of students. Quantitative student data should always be shared to help convince the TAs that certain incorrect answer choices are very common among students, and our experience with using a PCK task as a pedagogical tool has shown that the TAs generally appreciate learning about student difficulties in this manner.
9. Comparison to prior studies related to TAs' PCK for multiple-choice assessments
Comparison with our earlier studies of PCK [44,46,51] using the FCI and the TUG-K suggests that the PCK task may be more challenging when the assessment used is the CSEM compared to assessments related to force and motion or kinematics. One potential reason for this may be the difference between the topics of mechanics (including kinematics) and electricity and magnetism: our daily experience with the real world leads to a relatively predictable (Aristotelian) world view, and TAs could more easily reason their way to common misconceptions held by students. Electricity and magnetism, on the other hand, deals with concepts that are not primarily learned experientially (e.g., charges, fields, and currents), which likely makes it more difficult to predict the MCIs of students. We note, however, that whether the context is electricity and magnetism, force and motion, kinematics, or quantum mechanics, and whether it is intuitive or not, student difficulties can be classified into a few categories [52,53]. Knowing the types of incorrect reasoning students engage in for a particular context can be used as a resource in designing instruction to help students develop a robust knowledge structure [53].

Despite the differences mentioned above, there are many commonalities in the three PCK studies. In all of these studies, there are questions on which TAs' performance at identifying common student difficulties is good, while there are also questions on which TAs struggled to identify student difficulties. Both interviews and the quantitative data show that it was often the case that TAs selected answer choices that are not very common among students. In interviews, they sometimes considered different answer choices and struggled to select the MCI, sometimes only doing so after being reminded that they should try to identify the MCI.
We note that, with the goal of improving TAs' PCK, in the future we plan to write up a lesson plan for using concept inventories as part of the TA professional development program and share it via a physics teaching support website [54].

Finally, our earlier studies using the TUG-K and FCI showed that the ability to identify students' common alternate conceptions was not dependent on familiarity with U.S. teaching practices and that TAs exhibited comparable performance in identifying students' alternate conceptions for the FCI or TUG-K regardless of whether they obtained their undergraduate degree in the U.S. or elsewhere. Therefore, we did not explicitly compare the PCK performance of TAs with different institutional backgrounds in this study. However, informal observations during the TA professional development course as well as interviews suggest that the CSEM-related PCK performance of these TAs (e.g., from China vs the U.S.) is likely to be comparable.

ACKNOWLEDGMENTS
We are grateful to NSF for Grant No. DUE-1524575 as well as to all the TAs who participated in the studies.

[1] C. Singh, Categorization of problems to assess and improve proficiency as teacher and learner, Am. J. Phys., 73 (2009).
[2] D. Meltzer, The relationship between mathematics preparation and conceptual learning gains in physics: A possible "hidden variable" in diagnostic pretest scores, Am. J. Phys., 1259 (2002); A. J. Mason and C. Singh, Assessing expertise in introductory physics using categorization task, Phys. Rev. ST Phys. Educ. Res., 020110 (2011).
[3] C. Singh, Rethinking tools for training teaching assistants, AIP Conf. Proc., 59 (2009).
[4] S. Y. Lin, C. Henderson, W. Mamudi, E. Yerushalmi, and C. Singh, Teaching assistants' beliefs regarding example solutions in introductory physics, Phys. Rev. ST Phys. Educ. Res., 010120 (2013).
[5] E. Yerushalmi, C. Henderson, W. Mamudi, C. Singh, and S. Y. Lin, The group administered interactive questionnaire: An alternative to individual interviews, AIP Conf. Proc., 97 (2012).
[6] S. Y. Lin, C. Singh, W. Mamudi, C. Henderson, and E. Yerushalmi, TA-designed vs. research-oriented problem solutions, AIP Conf. Proc., 255 (2012).
[7] E. Yerushalmi, E. Marshman, A. Maries, C. Henderson, and C. Singh, Grading practices and considerations of graduate students at the beginning of their teaching assignment, Proceedings of the Physics Education Research Conference 2014, Minneapolis, MN, edited by P. Engelhardt, A. Churukian, and D. Jones (2015), p. 287.
[8] C. Henderson, E. Marshman, A. Maries, E. Yerushalmi, and C. Singh, Instructional goals and grading practices of graduate students after one semester of teaching experience,
Proceedings of the Physics Education Research Conference 2014, Minneapolis, MN, edited by P. Engelhardt, A. Churukian, and D. Jones (2015), p. 111.
[9] F. Lawrenz, P. Heller, and R. Keith, Training the teaching assistant: Matching TA strengths and capabilities to meet specific program goals, J. Coll. Sci. Teach., 106 (1992).
[10] P. Heller, R. Keith, and S. Anderson, Teaching problem solving through cooperative grouping. 1. Group versus individual problem solving, Am. J. Phys., 627 (1992).
[11] P. Heller and M. Hollabaugh, Teaching problem solving through cooperative grouping. 2. Designing problems and structuring groups, Am. J. Phys., 637 (1992).
[12] C. Singh, Impact of peer interaction on conceptual test performance, Am. J. Phys., 446 (2005).
[13] C. Singh, Effectiveness of group interaction on conceptual standardized test performance, Proceedings of the Physics Education Research Conference 2002, Boise, ID, edited by S. Franklin, K. Cummings, and J. Marx (2002), p. 67.
[14] A. J. Mason and C. Singh, Helping students learn effective problem solving strategies by reflecting with peers, Am. J. Phys., 748 (2010).
[15] A. Mason and C. Singh, Impact of guided reflection with peers on the development of effective problem solving strategies and physics learning, Phys. Teach., 295 (2016).
[16] A. J. Mason and C. Singh, Using reflection with peers to help students learn effective problem solving strategies, AIP Conf. Proc., 41 (2010).
[17] Recruiting and Educating Future Physics Teachers: Case Studies and Effective Practices, edited by C. Sandifer and E. Brewe (American Physical Society, College Park, MD, 2015).
[18] C. Henderson, E. Marshman, R. Sayer, C. Singh, and E. Yerushalmi, Graduate teaching assistants use different criteria when grading introductory physics vs. quantum mechanics problems,
Proceedings of the 2016 Physics Education Research Conference, Sacramento, CA (2016), p. 140.
[19] E. Yerushalmi, R. Sayer, E. Marshman, C. Henderson, and C. Singh, Physics graduate teaching assistants' beliefs about a grading rubric: Lessons learned, Proceedings of the Physics Education Research Conference 2016, Sacramento, CA (2016), p. 408.
[20] P. M. Sadler, G. Sonnert, H. P. Coyle, N. Cook-Smith, and J. L. Miller, The influence of teachers' knowledge on student learning in middle school physical science classrooms, Am. Educ. Res. J., 1020 (2013).
[21] L. S. Shulman, Those who understand: Knowledge growth in teaching, Educ. Res., 4 (1986).
[22] L. S. Shulman, Knowledge and teaching: Foundations of the new reform, Harv. Educ. Rev., 1 (1987).
[23] D. Maloney, T. O'Kuma, C. Hieggelke, and A. Van Heuvelen, Surveying students' conceptual knowledge of electricity and magnetism, Am. J. Phys., S12 (2001); L. Ding, R. Chabay, B. Sherwood, and R. Beichner, Evaluating an electricity and magnetism assessment tool: Brief electricity and magnetism assessment, Phys. Rev. ST Phys. Educ. Res., 010105 (2006).
[24] H. Ginsberg and S. Opper, Piaget's Theory of Intellectual Development (Prentice Hall, Englewood, 1969).
[25] G. J. Posner, K. A. Strike, P. W. Hewson, and W. A. Gertzog, Accommodation of a scientific conception: Toward a theory of conceptual change, Sci. Educ., 211 (1982).
[26] A. S. Elstein, L. S. Shulman, and S. Sprafka, Medical Problem Solving: The Analysis of Clinical Reasoning (Harvard University Press, Cambridge, MA, 1978).
[27] J. H. van Driel, N. Verloop, and W. de Vos, Developing science teachers' pedagogical content knowledge, J. Res. Sci. Teach., 673 (1998).
[28] P. L. Grossman, The Making of a Teacher: Teacher Knowledge and Teacher Education (Teachers College Press, New York, 1990).
[29] P. L. Grossman, What are we talking about anyhow: Subject matter knowledge for secondary English teachers, in
Advances in Research on Teaching, Vol. 2: Subject Matter Knowledge, edited by J. Brophy (JAI Press, Greenwich, CT, 1991), pp. 245–.
[30] Examining Pedagogical Content Knowledge (Kluwer Academic, Boston, 2001).
[31] J. Loughran, P. Mulhall, and A. Berry, In search of pedagogical content knowledge in science: Developing ways of articulating and documenting professional practice, J. Res. Sci. Teach., 370 (2004).
[32] H. Borko and R. T. Putnam, Expanding a teacher's knowledge base: A cognitive psychological perspective on professional development, in Professional Development in Education: New Paradigms and Practices, edited by T. R. Guskey and M. Huberman (Teachers College Press, New York, 1995).
[33] C. L. Ebert, An assessment of prospective secondary teachers' pedagogical content knowledge about functions and graphs, paper presented at the Annual Meeting of the American Educational Research Association, Atlanta, GA, USA (1993).
[34] A. N. Geddis, B. Onslow, C. Beynon, and L. Oesch, Transforming content knowledge: Learning to teach about isotopes, Sci. Educ., 575 (1993).
[35] J. H. Van Driel, O. De Jong, and N. Verloop, The development of preservice chemistry teachers' pedagogical content knowledge, Sci. Educ., 572 (2002).
[36] G. Zavala, H. Alarcón, and J. Benegas, Innovative training of in-service teachers for active learning: A short teacher development course based on physics education research, J. Sci. Teach. Prep., 559 (2007); N. G. Lederman, J. Gess-Newsome, and M. S. Latz, The nature and development of preservice science teachers' conceptions of subject matter and pedagogy, J. Res. Sci. Teach., 129 (1994).
[37] D. Zollman, Preparing future science teachers: The physics component of a new programme, Phys. Educ., 271 (1994); H. Borko and R. Putnam, Learning to teach, in Handbook of Educational Psychology, edited by D. Berliner and R. Calfee (Macmillan, New York, 1996), pp. 673–.
[38] …şildere, Investigating development of pre-service elementary mathematics teachers' pedagogical content knowledge through a school practicum course, Procedia Soc. Behav. Sci., 1410 (2010).
[39] K. Carter, The place of story in the study of teaching and teacher education, Educ. Res., 5 (1993).
[40] D. M. Kagan, Ways of evaluating teacher cognition: Inferences concerning the Goldilocks Principle, Rev. Educ. Res., 419 (1990).
[41] J. J. Loughran, R. F. Gunstone, A. Berry, P. Milroy, and P. Mulhall, Science cases in action: Developing an understanding of science teachers' pedagogical content knowledge, paper presented at the Annual Meeting of the National Association for Research in Science Teaching, New Orleans, LA, USA (2000).
[42] J. A. Baxter and N. G. Lederman, Assessment and measurement of pedagogical content knowledge, in
Examining Pedagogical Content Knowledge: The Construct and its Implications for Science Education, edited by J. Gess-Newsome and N. G. Lederman (Kluwer Academic, Dordrecht, 1999).
[43] D. Hestenes, M. Wells, and G. Swackhamer, Force Concept Inventory, Phys. Teach., 141 (1992).
[44] A. Maries and C. Singh, Teaching assistants' performance at identifying common introductory student difficulties in mechanics revealed by the Force Concept Inventory, Phys. Rev. Phys. Educ. Res., 010131 (2016).
[45] R. Beichner, Testing student interpretation of kinematics graphs, Am. J. Phys., 750 (1994).
[46] A. Maries and C. Singh, Exploring one aspect of pedagogical content knowledge of teaching assistants using the Test of Understanding Graphs in Kinematics, Phys. Rev. ST Phys. Educ. Res., 020120 (2013).
[47] L. C. McDermott and P. S. Shaffer, Tutorials in Introductory Physics (Prentice Hall, Upper Saddle River, NJ, 1998).
[48] K. A. Ericsson and H. Simon, Verbal reports as data, Psychol. Rev., 215 (1980).
[49] See Supplemental Material at http://link.aps.org/supplemental/10.1103/PhysRevPhysEducRes.14.010117 for a mathematical description of how the CSEM-related PCK scores were calculated.
[50] J. R. Thompson, W. M. Christensen, and M. C. Wittmann, Preparing future teachers to anticipate student difficulties in physics in a graduate-level course in physics, pedagogy, and education research, Phys. Rev. ST Phys. Educ. Res., 010108 (2011).
[51] A. Maries and C. Singh, Performance of graduate students at identifying introductory students' difficulties with kinematics graphs, Proceedings of the Physics Education Research Conference 2014, Minneapolis, MN, edited by A. Churukian, P. Engelhardt, and D. Jones (2015), p. 171; N. I. Karim, A. Maries, and C. Singh, Teaching assistants' performance at identifying common introductory student difficulties revealed by the Conceptual Survey of Electricity and Magnetism, Proceedings of the Physics Education Research Conference 2017, Cincinnati, OH, edited by L. Ding, A. Traxler, and Y.
Cao (2018), p. 208.
[52] C. Singh and E. Marshman, Review of student difficulties in upper-level quantum mechanics, Phys. Rev. ST Phys. Educ. Res., 020117 (2015).
[53] E. Marshman and C. Singh, Framework for understanding the patterns of student difficulties in quantum mechanics, Phys. Rev. ST Phys. Educ. Res.