Jennifer McGee
Appalachian State University
Publications
Featured research published by Jennifer McGee.
Journal of Research in Childhood Education | 2014
Drew Polly; Chuang Wang; Jennifer McGee; Richard G. Lambert; Christie Martin; David Pugalee
This study presents findings from the first cohort of teachers in a U.S. Department of Education Mathematics Science Partnership (MSP) grant designed to support the use of a standards-based elementary school mathematics curriculum, Investigations in Number, Data, and Space (Investigations). In line with the goals of the MSP program, the 84-hour professional development program focused on building teachers’ knowledge of mathematics content, examining how that content is embedded in the curriculum, and supporting teachers’ enactment of reform-based pedagogies. Teacher participants showed a positive gain in content knowledge, but this increase did not have a statistically significant impact on student gains on the assessment of mathematics proficiency. Results about teacher beliefs were inconclusive, as more time is needed to change teacher beliefs. Students of teachers who shifted their practices from teacher-centered to student-centered showed statistically significant gains on curriculum-based mathematics assessments. Discussion and implications of these findings are also presented.
Action in Teacher Education | 2014
Jennifer McGee; Susan A. Colby
Understanding effective and appropriate techniques for assessing student learning has never been as critical and political as it is today. Accurate assessment methodologies employed in public schools are vital to the interpretation of the student achievement data through which school effectiveness is measured. Despite this need, research has shown that preservice teachers are not receiving adequate assessment training in their teacher preparation courses. The authors of this study examined the assessment literacy of teacher candidates before and after completing a required assessment course in their teacher preparation program. Teacher candidates (N = 190) completed the Assessment Literacy Inventory as pretest and posttest measures. Findings revealed that certain aspects of assessment literacy were present before the course, and that exposure to the course potentially increased assessment literacy in some areas. Implications for researchers and practitioners are included.
Journal of Psychoeducational Assessment | 2014
Jennifer McGee; Chuang Wang
The purpose of this study is to provide evidence of the reliability and validity of the Self-Efficacy for Teaching Mathematics Instrument (SETMI). Self-efficacy, as defined by Bandura, was the theoretical framework for the development of the instrument. The complex belief systems of mathematics teachers, as described by Ernest, provided insights into the elements of mathematics beliefs that could be related to a teacher’s self-efficacy beliefs. The SETMI was developed in July 2010 and has undergone revisions to the original version through processes described in this study. Evidence of reliability and validity was collected to determine whether the SETMI is an adequate instrument for measuring the self-efficacy of elementary mathematics teachers. Construct validity of the revised SETMI was tested using confirmatory factor analysis. Findings indicate that the SETMI is a valid and reliable measure of two aspects of self-efficacy: pedagogy in mathematics and teaching mathematics content.
British Journal of Education, Society & Behavioural Science | 2014
Richard G. Lambert; Bob Algozzine; Jennifer McGee
Aims: In this research, we evaluated the effects of progress monitoring grounded in a commercially available tool used to customize assignments and keep track of progress in mathematics for students in elementary school. Study Design: We used a randomized controlled trial and multilevel analysis to test the effect of the treatment on the outcome measures while nesting students within their classrooms. Place and Duration of Study: Students in three elementary schools in the Midwestern region of the United States were in the study, which took place across an academic year. Methodology: We used two-level hierarchical linear models for our analyses because of the nested nature of our data. We compared outcomes across high- and low-implementation-fidelity treatment group classrooms as well as across treatment and control classrooms. Results: We found statistically significant treatment differences for monthly growth rate, and elementary school fidelity of implementation effects were documented. Conclusion: Professionals engaged in progress monitoring use a variety of measures to track student performance and to assist in instructional decision making when data indicate a need for change. We found that the use of a computer-based individualized mathematics assignment and progress monitoring program resulted in improvements in both curriculum
Reading & Writing Quarterly | 2017
Darrell Morris; Carla K. Meyer; Woodrow Trathen; Jennifer McGee; Nora Vines; Trevor Thomas Stewart; Tom Gill; Robert Schlagal
This study explored print-processing and vocabulary differences among a group of 5th- and 6th-grade students who had scored below the 50th percentile on a standardized reading test. Guided by the simple view of reading, we applied cut scores (low/high) to the students’ performance on print-processing and vocabulary tasks. The design allowed for the placement of students in 1 of 4 reader profiles: (a) high print processing/low vocabulary (25%), (b) high print processing/high vocabulary (14%), (c) low print processing/high vocabulary (14%), or (d) low print processing/low vocabulary (48%). An important finding was that 62% of the students could not read grade-level text with adequate accuracy and rate. In fact, many could not comfortably read text a full level below their grade placement. We consider instructional implications.
SAGE Open | 2016
Susan A. Colby; Monica Lambert; Jennifer McGee
In this results-oriented era of accountability, educator preparation programs are called upon to provide comprehensive data related to student and program outcomes while also providing evidence of continuous improvement. Collaborative Analysis of Student Learning (CASL) is one approach for fostering critical inquiry about student learning. Graduate educator preparation programs in our university used collaborative analysis as the basis for continuous improvement during an accreditation cycle. As authors of this study, we sought to better understand how graduate program directors and faculty used collaborative analysis to inform practice and improve programs. Our findings suggested that CASL has the potential to foster collective responsibility for student learning, but only with a strong commitment from administrators and faculty, purposefully designed protocols and processes, fidelity to the CASL method, and a focus on professional development. Through CASL, programs have the ability to produce meaningful data related to student and program outcomes and meet the requirements for accreditation.
The Mathematics Educator | 2013
Drew Polly; Jennifer McGee; Chuang Wang; Richard G. Lambert; David Pugalee; Sarah Johnson
School Science and Mathematics | 2013
Jennifer McGee; Chuang Wang; Drew Polly
Early Childhood Education Journal | 2017
Drew Polly; Chuang Wang; Richard G. Lambert; Christie Martin; Jennifer McGee; David Pugalee; Amy Lehew
Early Childhood Education Journal | 2017
Drew Polly; Christie S. Martin; Jennifer McGee; Chuang Wang; Richard G. Lambert; David Pugalee