Publications


Featured research published by Eric Parsons.


Education Finance and Policy | 2015

Teacher Preparation Programs and Teacher Quality: Are There Real Differences Across Programs?

Cory Koedel; Eric Parsons; Michael Podgursky; Mark Ehlert

We compare teacher preparation programs in Missouri based on the effectiveness of their graduates in the classroom. The differences in effectiveness between teachers from different preparation programs are much smaller than has been suggested in previous work. In fact, virtually all of the variation in teacher effectiveness comes from within-program differences between teachers. Prior research has overstated differences in teacher performance across preparation programs by failing to properly account for teacher sampling.
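
To make the variance decomposition concrete, here is a minimal Python sketch with simulated, hypothetical numbers (not the paper's data or estimates): teacher effectiveness is generated with small between-program differences and much larger within-program differences, and the within-program share of the total variance is computed.

```python
# Hypothetical sketch of the variance decomposition the abstract refers to:
# split the variance of teacher effectiveness into between-program and
# within-program components (simulated values, not the paper's estimates).
import numpy as np

rng = np.random.default_rng(4)
n_programs, per_program = 20, 60
program = np.repeat(np.arange(n_programs), per_program)

# Small differences across programs, much larger differences within programs.
program_mean = rng.normal(scale=0.02, size=n_programs)
teacher_effect = program_mean[program] + rng.normal(scale=0.20, size=program.size)

grand_mean = teacher_effect.mean()
program_means = np.array([teacher_effect[program == p].mean() for p in range(n_programs)])
between = np.mean((program_means[program] - grand_mean) ** 2)
within = np.mean((teacher_effect - program_means[program]) ** 2)

share_within = within / (between + within)
print(f"share of variance within programs: {share_within:.1%}")  # close to 1
```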


B.E. Journal of Economic Analysis & Policy | 2012

Test Measurement Error and Inference from Value-Added Models

Cory Koedel; Rebecca Leatherman; Eric Parsons

It is widely known that standardized tests are noisy measures of student learning, but value-added models (VAMs) rarely account for test measurement error (TME). We incorporate information about TME directly into VAMs, focusing on TME that derives from the testing instrument itself. Our analysis is divided into two parts: one based on simulated data and the other based on administrative microdata from Missouri. In the simulations we control the data-generating process, which ensures that we obtain accurate TME metrics. In the real-data portion of our analysis we use estimates of TME provided by a major test publisher. In both the simulations and real-data analyses, we find that inference from VAMs is improved by making simple TME adjustments to the models. The improvement is larger in the simulations, but even in the real-data analysis the improvement is on the order of what one could expect if teacher-level sample sizes were increased by 11 to 17 percent.
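
As an illustration of the general idea, the following is a minimal Python sketch, not the paper's estimator: a regression-calibration style adjustment that shrinks a noisy prior test score toward its mean using an assumed, publisher-style reliability before using it as a control. All numbers, including the `reliability` value, are hypothetical.

```python
# Illustrative sketch (not the paper's exact procedure): adjust a value-added
# style regression for measurement error in the prior test score by shrinking
# the observed prior score toward its mean using an assumed test reliability.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
reliability = 0.85          # hypothetical publisher-reported reliability of the prior test

true_prior = rng.normal(size=n)                 # latent prior achievement
noise_var = true_prior.var() * (1 - reliability) / reliability
observed_prior = true_prior + rng.normal(scale=np.sqrt(noise_var), size=n)
current = 0.7 * true_prior + rng.normal(scale=0.5, size=n)   # current-year score

def ols_slope(x, y):
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# Naive control: the coefficient on the noisy prior score is attenuated.
naive = ols_slope(observed_prior, current)

# Simple TME adjustment: shrink the observed prior score toward its mean
# by the reliability before using it as a control (regression calibration).
shrunk_prior = observed_prior.mean() + reliability * (observed_prior - observed_prior.mean())
adjusted = ols_slope(shrunk_prior, current)

print(f"naive slope    : {naive:.3f}")     # biased toward zero
print(f"adjusted slope : {adjusted:.3f}")  # closer to the true 0.7
```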


Statistics and Public Policy | 2014

The Sensitivity of Value-Added Estimates to Specification Adjustments: Evidence From School- and Teacher-Level Models in Missouri

Mark Ehlert; Cory Koedel; Eric Parsons; Michael Podgursky

We provide a side-by-side comparison of school and teacher growth measures estimated from different value-added models (VAMs). We compare VAMs that differ in terms of which student and school-level (or teacher-level) control variables are included and how these controls are included. Our richest specification includes 3 years of prior test scores for students and the standard demographic controls; our sparsest specification conditions only on a single prior test score. For both schools and teachers, the correlations between VAM estimates across the different models are high by conventional standards (typically at or above 0.90). However, despite the high correlations overall, we show that the choice of which controls to include in VAMs, and how to include them, meaningfully influences school and teacher rankings based on model output. Models that are less aggressive in controlling for student-background and schooling-environment information systematically assign higher rankings to more-advantaged schools, and to individuals who teach at these schools.
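
A minimal sketch of the kind of comparison described, using simulated, hypothetical data rather than the Missouri data: school effects are estimated as aggregated residuals under a sparse specification (prior score only) and a richer one (prior score plus a demographic indicator), and the resulting estimates are compared with each other and with school poverty.

```python
# Illustrative sketch (hypothetical data, not the paper's models): compare
# school measures from a sparse VAM (prior score only) and a richer VAM
# (prior score plus a demographic control), using aggregated residuals.
import numpy as np

rng = np.random.default_rng(1)
n_schools, per_school = 100, 80
school = np.repeat(np.arange(n_schools), per_school)

school_effect = rng.normal(scale=0.15, size=n_schools)
school_poverty = rng.uniform(size=n_schools)                # school-level disadvantage
poor = rng.binomial(1, school_poverty[school])              # student-level indicator
prior = rng.normal(size=school.size) - 0.4 * poor
score = 0.7 * prior - 0.3 * poor + school_effect[school] + rng.normal(scale=0.5, size=school.size)

def school_effects(controls):
    """OLS on student-level controls, then mean residual by school."""
    X = np.column_stack([np.ones(score.size)] + controls)
    resid = score - X @ np.linalg.lstsq(X, score, rcond=None)[0]
    return np.array([resid[school == s].mean() for s in range(n_schools)])

sparse = school_effects([prior])          # prior score only
rich = school_effects([prior, poor])      # prior score + demographic control

print(f"correlation between models: {np.corrcoef(sparse, rich)[0, 1]:.2f}")
# Even when the two sets of estimates are highly correlated, the sparse
# model's estimates are more negatively related to poverty, so it tends to
# rank more-advantaged schools higher.
print(f"corr(sparse, poverty): {np.corrcoef(sparse, school_poverty)[0, 1]:.2f}")
print(f"corr(rich,   poverty): {np.corrcoef(rich,   school_poverty)[0, 1]:.2f}")
```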


Educational Policy | 2016

Selecting Growth Measures for Use in School Evaluation Systems: Should Proportionality Matter?

Mark Ehlert; Cory Koedel; Eric Parsons; Michael Podgursky

The specifics of how growth models should be constructed and used for educational evaluation are a topic of lively policy debate in states and school districts nationwide. In this article, we take up the question of model choice—framed within a policy context—and examine three competing approaches. The first approach, reflected in the popular student growth percentiles (SGPs) framework, eschews all controls for student covariates and schooling environments. The second approach, typically associated with value-added models (VAMs), controls for student-background characteristics and under some conditions can be used to identify the causal effects of educational units (i.e., districts, schools, and teachers). The third approach, also VAM-based, fully levels the playing field so that the correlation between the growth measures and student demographics is essentially zero. We argue that the third approach is the most desirable for use in school evaluation systems. Our case rests on personnel economics, incentive-design theory, and the potential role that growth measures can play in improving instruction in K-12 schools.
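
The third approach can be illustrated with a very small sketch (hypothetical data, not the paper's estimator): residualizing a raw school growth measure on school demographics produces a measure whose correlation with demographics is essentially zero by construction.

```python
# Minimal sketch of the idea behind the "fully leveled" third approach
# (hypothetical data, not the paper's estimator): residualize school growth
# measures on school demographics so the resulting measure is uncorrelated
# with demographics by construction.
import numpy as np

rng = np.random.default_rng(2)
n_schools = 200
poverty = rng.uniform(size=n_schools)                    # school disadvantage share
raw_growth = -0.4 * poverty + rng.normal(scale=0.3, size=n_schools)

# Regress the raw measure on demographics and keep the residual.
X = np.column_stack([np.ones(n_schools), poverty])
leveled = raw_growth - X @ np.linalg.lstsq(X, raw_growth, rcond=None)[0]

print(f"corr(raw,     poverty): {np.corrcoef(raw_growth, poverty)[0, 1]:.2f}")
print(f"corr(leveled, poverty): {np.corrcoef(leveled, poverty)[0, 1]:.2f}")  # ~0 by construction
```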


Journal of Research on Educational Effectiveness | 2015

Incorporating End-of-Course Exam Timing Into Educational Performance Evaluations

Eric Parsons; Cory Koedel; Michael Podgursky; Mark Ehlert; P. Brett Xiang

There is increased policy interest in extending test-based evaluations in K–12 education to include student achievement in high school. High school achievement is typically measured by performance on end-of-course exams (EOCs), which test course-specific standards in a variety of subjects. However, unlike standardized tests in the early grades, students take EOCs at different points in their schooling careers. The timing of the test is a choice variable presumably determined by input from administrators, students, and parents. Recent research indicates that school and district policies that determine when students take particular courses can have important consequences for achievement and subsequent outcomes such as advanced course taking. We develop an approach for modeling EOC test performance that disentangles the influence of school and district policies regarding the timing of course taking from other factors. Once the timing issue is separated out, better measures of the quality of instruction provided by districts, schools, and teachers can be obtained. Our approach also offers diagnostic value because it separates out the influence of school and district course-timing policies from other factors that determine student achievement.


Journal of Educational and Behavioral Statistics | 2018

Accounting for Student Disadvantage in Value-Added Models

Eric Parsons; Cory Koedel; Li Tan

We study the relative performance of two policy-relevant value-added models—a one-step fixed effect model and a two-step aggregated residuals model—using a simulated data set well grounded in the value-added literature. A key feature of our data generating process is that student achievement depends on a continuous measure of economic disadvantage. This is a realistic condition that has implications for model performance because researchers typically have access to only a noisy, binary measure of disadvantage. We find that one- and two-step value-added models perform similarly across a wide range of student and teacher sorting conditions, with the two-step model modestly outperforming the one-step model in conditions that best match observed sorting in real data. A reason for the generally superior performance of the two-step model is that it better handles the use of an error-prone, dichotomous proxy for student disadvantage.
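
A rough sketch of the two estimators being compared, with simulated data and hypothetical parameter values chosen for illustration only: a one-step regression that includes teacher indicators and the binary disadvantage proxy jointly, and a two-step procedure that controls for the proxy first and then averages residuals by teacher.

```python
# Illustrative sketch of a one-step fixed effect model versus a two-step
# aggregated residuals model (simulated data; parameter values are
# hypothetical). Achievement depends on continuous disadvantage, but the
# analyst only observes a noisy binary proxy.
import numpy as np

rng = np.random.default_rng(3)
n_teachers, per_teacher = 50, 40
teacher = np.repeat(np.arange(n_teachers), per_teacher)
teacher_effect = rng.normal(scale=0.15, size=n_teachers)

# Students sort to teachers by disadvantage; only a noisy binary proxy is observed.
classroom_mean = rng.normal(scale=0.5, size=n_teachers)
disadvantage = classroom_mean[teacher] + rng.normal(scale=0.5, size=teacher.size)
proxy = (disadvantage + rng.normal(scale=0.3, size=teacher.size) > 0).astype(float)
score = -0.5 * disadvantage + teacher_effect[teacher] + rng.normal(scale=0.5, size=teacher.size)

dummies = (teacher[:, None] == np.arange(n_teachers)).astype(float)

# One-step: teacher indicators and the proxy estimated jointly.
X1 = np.column_stack([dummies, proxy])
one_step = np.linalg.lstsq(X1, score, rcond=None)[0][:n_teachers]

# Two-step: control for the proxy first, then aggregate residuals by teacher.
X2 = np.column_stack([np.ones(score.size), proxy])
resid = score - X2 @ np.linalg.lstsq(X2, score, rcond=None)[0]
two_step = np.array([resid[teacher == t].mean() for t in range(n_teachers)])

# Which estimator recovers the true effects better depends on how students
# sort across classrooms, which is the comparison the paper explores.
print(f"corr(one-step, truth): {np.corrcoef(one_step, teacher_effect)[0, 1]:.2f}")
print(f"corr(two-step, truth): {np.corrcoef(two_step, teacher_effect)[0, 1]:.2f}")
```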


Archive | 2014

Selecting Growth Measures for School and Teacher Evaluations

Cory Koedel; Mark Ehlert; Eric Parsons; Michael Podgursky; P. Brett Xiang


Archive | 2013

Selecting Growth Measures for School and Teacher Evaluations: Should Proportionality Matter?

Mark Ehlert; Cory Koedel; Eric Parsons; Michael Podgursky


National Bureau of Economic Research | 2016

The Compositional Effect of Rigorous Teacher Evaluation on Workforce Quality

Julie Berry Cullen; Cory Koedel; Eric Parsons


Education Next | 2014

Choosing the Right Growth Measure

Mark Ehlert; Cory Koedel; Eric Parsons; Michael Podgursky

Collaboration


Dive into Eric Parsons's collaborations.

Top Co-Authors

Cory Koedel, University of Missouri

Mark Ehlert, University of Missouri

Julie Berry Cullen, National Bureau of Economic Research

Li Tan, University of Missouri