Publication


Featured research published by Brian F. Patterson.


Journal of Applied Psychology | 2013

Test of Slope and Intercept Bias in College Admissions: A Response to Aguinis, Culpepper, and Pierce (2010).

Krista D. Mattern; Brian F. Patterson

Research on the predictive bias of cognitive tests has generally shown (a) no slope effects and (b) small intercept effects, typically favoring the minority group. Aguinis, Culpepper, and Pierce (2010) simulated data and demonstrated that statistical artifacts may have led to a lack of power to detect slope differences and an overestimate of the size of the intercept effect. In response to Aguinis et al.'s (2010) call for a revival of predictive bias research, we used data on over 475,000 students entering college between 2006 and 2008 to estimate slope and intercept differences in the college admissions context. Corrections for statistical artifacts were applied. Furthermore, plotting of regression lines supplemented traditional analyses of predictive bias to offer additional evidence of the form and extent to which predictive bias exists. Congruent with previous research on bias of cognitive tests, using SAT scores in conjunction with high school grade-point average to predict first-year grade-point average revealed minimal differential prediction (ΔR² for the intercept ranged from .004 to .032 and ΔR² for the slope ranged from .001 to .013, depending on the corrections applied and comparison groups examined). We found, on the basis of regression plots, that college grades were consistently overpredicted for Black and Hispanic students and underpredicted for female students.
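The step-wise test for slope and intercept bias described in this abstract is straightforward to sketch. Below is a minimal illustration on synthetic data, not the study's actual dataset; the variable names (sat, hsgpa, fygpa, group) are assumptions for the example. A group indicator is added to a pooled regression to test intercept differences, then group-by-predictor interactions are added to test slope differences, with the ΔR² at each step quantifying the effect.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000

# Synthetic student records: predictors, a binary group indicator, and a
# first-year GPA generated with no built-in bias.
df = pd.DataFrame({
    "sat": rng.normal(1500, 250, n),     # composite SAT score
    "hsgpa": rng.normal(3.3, 0.4, n),    # high school GPA
    "group": rng.integers(0, 2, n),      # 0 = reference, 1 = focal group
})
df["fygpa"] = 0.001 * df["sat"] + 0.5 * df["hsgpa"] + rng.normal(0, 0.4, n)

# Step 1: common regression pooled across groups.
m1 = smf.ols("fygpa ~ sat + hsgpa", df).fit()
# Step 2: add the group indicator; the R-squared gain tests intercept bias.
m2 = smf.ols("fygpa ~ sat + hsgpa + group", df).fit()
# Step 3: add group-by-predictor interactions; the gain tests slope bias.
m3 = smf.ols("fygpa ~ (sat + hsgpa) * group", df).fit()

print(f"Delta R2, intercept: {m2.rsquared - m1.rsquared:.4f}")
print(f"Delta R2, slope:     {m3.rsquared - m2.rsquared:.4f}")
```

Because the synthetic outcome is generated without group effects, both ΔR² values should be near zero here; the corrections for statistical artifacts that the study applies are beyond this sketch.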


Educational Assessment | 2011

Contextual Factors Associated with the Validity of SAT Scores and High School GPA for Predicting First-Year College Grades.

Jennifer L. Kobrin; Brian F. Patterson

Prior research has shown that there is substantial variability in the degree to which the SAT and high school grade point average (HSGPA) predict 1st-year college performance at different institutions. This article demonstrates the usefulness of multilevel modeling as a tool to uncover institutional characteristics that are associated with this variability. The results revealed that the predictive validity of HSGPA decreased as mean total SAT (i.e., sum of the three SAT sections) score at an institution increased and as the proportion of White freshmen increased. The predictive validity of each of the three SAT sections (critical reading, mathematics, and writing) varied as a function of different institution-level variables. These results suggest that validity estimates obtained and aggregated from multiple institutions may not accurately reflect the unique contextual factors that influence the predictive validity of HSGPA and SAT scores at a particular institution.
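As a rough illustration of the multilevel approach this abstract describes, the sketch below fits a random-slopes model with a cross-level interaction, letting an institution-level characteristic moderate the student-level HSGPA slope. All names and data are hypothetical, and statsmodels' MixedLM stands in for whatever software the authors used.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_schools, per_school = 40, 100

# Institution-level data: each school gets a mean SAT and its own
# random deviation from the average HSGPA slope.
inst = pd.DataFrame({
    "school": np.arange(n_schools),
    "mean_sat": rng.normal(1500, 150, n_schools),
    "u_slope": rng.normal(0, 0.05, n_schools),
})

# Student-level data nested within schools; the HSGPA slope weakens as
# institutional mean SAT rises, mimicking the pattern reported above.
df = inst.loc[inst.index.repeat(per_school)].reset_index(drop=True)
df["hsgpa"] = rng.normal(3.3, 0.4, len(df))
true_slope = 0.6 - 0.002 * (df["mean_sat"] - 1500) + df["u_slope"]
df["fygpa"] = 1.0 + true_slope * df["hsgpa"] + rng.normal(0, 0.3, len(df))

# Center the institution-level predictor for numerical stability.
df["mean_sat_c"] = df["mean_sat"] - df["mean_sat"].mean()

# Random intercept and HSGPA slope by school; the hsgpa:mean_sat_c term is
# the cross-level interaction that lets an institutional characteristic
# moderate the student-level HSGPA slope.
model = smf.mixedlm("fygpa ~ hsgpa * mean_sat_c", df,
                    groups=df["school"], re_formula="~hsgpa")
print(model.fit().summary())
```

A significant negative hsgpa:mean_sat_c coefficient would correspond to the reported pattern of HSGPA validity declining as institutional mean SAT rises.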


Educational Assessment | 2011

Discrepant SAT Critical Reading and Writing Scores: Implications for College Performance

Emily J. Shaw; Krista D. Mattern; Brian F. Patterson

Despite the similarities that researchers note between the cognitive processes and knowledge involved in reading and writing, there are students who are much stronger readers than writers and those who are much stronger writers than readers. The addition of the writing section to the SAT provides an opportunity to examine whether certain groups of students are more likely to exhibit stronger performance in reading versus writing and the academic consequences of this discrepant performance. Results of this study, based on hierarchical linear models of student performance, showed that even after controlling for relevant student characteristics and prior academic performance, an SAT critical reading–writing discrepancy had a small effect on 1st-year grade point average as well as English course grades in college. Specifically, students who had relatively higher writing scores as compared to their critical reading scores earned higher grades in their 1st year of college as well as in their 1st-year English course(s).
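A minimal sketch of the discrepancy measure follows: standardize the two section scores, take the difference, and test whether it adds predictive value beyond overall score level and prior grades. This is a single-level simplification of the hierarchical linear models used in the study, with hypothetical variable names and synthetic data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 3000

# Synthetic section scores and high school GPA.
df = pd.DataFrame({
    "cr": rng.normal(500, 100, n),       # SAT critical reading
    "wr": rng.normal(500, 100, n),       # SAT writing
    "hsgpa": rng.normal(3.3, 0.4, n),
})

# Standardize each section and take the difference: positive values mean
# relatively stronger writing than reading.
def zscore(s):
    return (s - s.mean()) / s.std()

df["discrepancy"] = zscore(df["wr"]) - zscore(df["cr"])

# Generate an outcome in which the discrepancy carries a small effect,
# mirroring the direction of the finding described above.
df["total"] = df["cr"] + df["wr"]
df["fygpa"] = (0.4 * df["hsgpa"] + 0.001 * df["total"]
               + 0.05 * df["discrepancy"] + rng.normal(0, 0.4, n))

# Does the discrepancy predict FYGPA beyond overall score level and HSGPA?
base = smf.ols("fygpa ~ total + hsgpa", df).fit()
full = smf.ols("fygpa ~ total + hsgpa + discrepancy", df).fit()
print(f"discrepancy coefficient: {full.params['discrepancy']:.4f}")
print(f"incremental R2:          {full.rsquared - base.rsquared:.4f}")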


Archive | 2008

Validity of the SAT for Predicting First-Year College Grade Point Average

Sandra M. Barbuti; Brian F. Patterson; Jennifer L. Kobrin; Krista D. Mattern


Archive | 2008

Differential Validity and Prediction of the SAT

Krista D. Mattern; Brian F. Patterson; Emily J. Shaw; Jennifer L. Kobrin; Sandra M. Barbuti


Archive | 2009

Is Performance on the SAT Related to College Retention?

Krista D. Mattern; Brian F. Patterson


College Board | 2012

The Validity of the SAT® for Predicting Cumulative Grade Point Average by College Major

Emily J. Shaw; Jennifer L. Kobrin; Brian F. Patterson; Krista D. Mattern


College Board | 2008

Validity of the SAT® for Predicting First-Year College Grade Point Average. Research Report No. 2008-5.

Jennifer L. Kobrin; Brian F. Patterson; Emily J. Shaw; Krista D. Mattern; Sandra M. Barbuti


Archive | 2011

Advanced Placement Exam-Taking and Performance: Relationships with First-Year Subject Area College Grades

Jennifer L. Kobrin; Brian F. Patterson; Sheryl Packman


Archive | 2009

Validity of the SAT for Predicting FYGPA: 2007 SAT Validity Sample

Jennifer L. Kobrin; Brian F. Patterson; Krista D. Mattern
