Leigh M. Harrell-Williams
Georgia State University
Publications
Featured research published by Leigh M. Harrell-Williams.
Psychological Assessment | 2015
Leigh M. Harrell-Williams; Tara C. Raines; Randy W. Kamphaus; Bridget V. Dever
The Behavioral and Emotional Screening System (BESS) is a relatively new method for identifying behavioral and emotional risk (BER) in children and adolescents. Psychometric evidence regarding this instrument is important for researchers and practitioners considering the use of the BESS for identifying BER in students. Previous psychometric research specifically regarding the BESS Student Form involved samples of elementary and middle school-age children. This study adds to the psychometric evidence for scores on the BESS Student Form by using samples of high school-aged students to assess both the factor structure reported by Dowdy, Twyford et al. (2011) and the measurement invariance of the BESS items with regard to ethnicity, English language proficiency, and socioeconomic status. The results indicate that while the proposed 4-factor structure of the BESS Student Form is appropriate, lower-than-preferred reliabilities for some of the factors indicate that reporting the overall risk T score is more appropriate than reporting factor scores for risk classification purposes. Additionally, the BESS Student Form items did not exhibit measurement bias when comparing across ethnicities, language proficiency classifications, or socioeconomic status (via free/reduced lunch classification).
Journal of Psychoeducational Assessment | 2014
Leigh M. Harrell-Williams; M. Alejandra Sorto; Rebecca L. Pierce; Lawrence M. Lesser; Teri J. Murphy
The influential Common Core State Standards for Mathematics (CCSSM) expect students to start statistics learning during the middle grades. Thus, teacher education and professional development programs are advised to help preservice and in-service teachers increase their knowledge and confidence to teach statistics. Although existing self-efficacy instruments used in statistics education focus on students, the Self-Efficacy to Teach Statistics (SETS) instrument measures a teacher’s efficacy to teach key CCSSM statistical topics. Using the results from a sample of n = 309 participants enrolled in a mathematics education or introductory statistics course, SETS scores were validated for use with middle grades preservice teachers to differentiate levels of self-efficacy to teach statistics. Confirmatory factor analysis using the Multidimensional Random Coefficient Multinomial Logit Model supports the use of two dimensions, which exhibit adequate reliabilities and correspond to the first two levels of the Guidelines for Assessment and Instruction in Statistics Education adopted by the American Statistical Association. Item and rating scale analyses indicate that the items and the six-category scale perform as intended. These indicators suggest that the SETS instrument may be appropriate for measuring preservice teachers’ levels of self-efficacy to teach statistics.
Journal of Psychoeducational Assessment | 2014
Sarah Kiperman; Mary S. Black; Tia McGill; Leigh M. Harrell-Williams; Randy W. Kamphaus
This study assesses the ability of a brief screening form, the Behavioral and Emotional Screening System–Student Form (BESS-SF), to predict scores on the much longer form from which it was derived: the Behavior Assessment System for Children–Second Edition Self-Report of Personality–Child Form (BASC-2-SRP-C). The present study replicates a former study included in the BESS manual with an entirely new sample. Participants included 252 students from a large, urban, Southwestern U.S. city school district in the third through fifth grades. The sample’s ethnic majority was Hispanic (81.7%). Results revealed high specificity and negative predictive values between the screener and omnibus form, suggesting a child who identifies as not “at-risk” on the BESS-SF will likely identify as not “at-risk” on the BASC-2-SRP-C domains. These results effectively replicate the previous findings with a new sample of largely Hispanic (Latino/a) students from a large urban school district.
Journal of Psychoeducational Assessment | 2017
Leigh M. Harrell-Williams; Jennifer N. Lovett; Hollylynne S. Lee; Rebecca L. Pierce; Lawrence M. Lesser; M. Alejandra Sorto
Recently adopted state standards for middle grades and high school mathematics content have an increased emphasis on statistical topics. With this change, teacher education programs may need to adapt how they prepare preservice secondary mathematics teachers (PSMTs) to teach statistics and require measures related to statistics teaching to assess the impact of programmatic changes and track teacher growth. Using responses from a sample of 290 PSMTs from 20 institutions across the United States, this study presents validity and reliability evidence for the high school version of the Self-Efficacy to Teach Statistics (SETS-HS), which could be used to assess statistics teaching efficacy. Confirmatory factor analysis results via Rasch modeling support the use of three subscales, which exhibit adequate reliabilities and correspond to the three levels in the Pre-K–12 Guidelines for Assessment and Instruction in Statistics Education endorsed by the American Statistical Association. Item and rating scale analyses indicate that the 46 items and the six-category scale employed in the SETS-HS perform as intended.
Journal of Applied School Psychology | 2017
Tara C. Raines; Melissa Gordon; Leigh M. Harrell-Williams; Rachele Diliberto; Elyse M. Parke
Interventions developed to improve adaptive skills can improve academic achievement. The authors expanded this line of research by examining the relationship between performance on a state proficiency exam and adaptive skills classifications on the Behavior Assessment System for Children, Second Edition parent and teacher reports. Participants included 392 Latino students, Grades 2–6, in a large urban school district. Ordinal regression models were used to assess relationships between student academic proficiency level and adaptive skills classifications. Students classified as having higher adaptive skills by teachers were more likely to be classified as proficient or higher in reading and mathematics. These findings further support the relationship between adaptive skills and academic achievement. Implications for future research and practice are discussed.
Educational and Psychological Measurement | 2013
Leigh M. Harrell-Williams; Edward W. Wolfe
Most research on confirmatory factor analysis using information-based fit indices (Akaike information criterion [AIC], Bayesian information criterion [BIC], bias-corrected AIC [AICc], and consistent AIC [CAIC]) has used a structural equation modeling framework. Minimal research has been done concerning the application of these indices to item response models, especially within the framework of multidimensional Rasch analysis with an emphasis on the role of between-dimension correlation in index accuracy. We investigated how sample size, between-dimension correlation, model-to-data misfit, and test length affect the accuracy of these indices in model recovery in dichotomous data using a multidimensional Rasch analysis simulation methodology. Results reveal that, at higher values of between-dimension correlation, AIC indicated the correct two-dimension generating structure slightly more often than the BIC or CAIC. The results also demonstrated that violations of the Rasch model assumptions are magnified at higher between-dimension correlations. We recommend that practitioners working with highly correlated multidimensional data use moderate-length (roughly 40 items) instruments and minimize data-to-model misfit in the choice of model used for confirmatory factor analysis (multidimensional random coefficient multinomial logit or other multidimensional item response theory models).
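The four information-based fit indices compared in this study all penalize the maximized log-likelihood by the number of estimated parameters, differing only in the penalty term. A minimal sketch of how they are computed and compared follows; the log-likelihood values in the example are illustrative placeholders, not figures from the study.

```python
import math

def information_criteria(log_lik, k, n):
    """Compute the four information-based fit indices: AIC, BIC,
    bias-corrected AIC (AICc), and consistent AIC (CAIC).

    log_lik : maximized log-likelihood of the fitted model
    k       : number of estimated parameters
    n       : sample size
    """
    aic = 2 * k - 2 * log_lik
    bic = k * math.log(n) - 2 * log_lik
    # AICc adds a small-sample correction that vanishes as n grows
    aicc = aic + (2 * k * (k + 1)) / (n - k - 1)
    # CAIC penalizes each parameter by ln(n) + 1, more heavily than BIC
    caic = k * (math.log(n) + 1) - 2 * log_lik
    return {"AIC": aic, "BIC": bic, "AICc": aicc, "CAIC": caic}

# Model recovery: fit competing dimensional structures to the same data
# and prefer the one with the lower index value (hypothetical numbers).
uni = information_criteria(log_lik=-5120.4, k=41, n=500)
two = information_criteria(log_lik=-5098.7, k=43, n=500)
preferred = "two-dimensional" if two["AIC"] < uni["AIC"] else "unidimensional"
```

Because BIC and CAIC carry a penalty that grows with sample size while AIC's does not, AIC is the least conservative of the four, which is consistent with it flagging the correct two-dimension structure slightly more often at high between-dimension correlations.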
Journal of Statistics Education | 2015
Leigh M. Harrell-Williams; M. Alejandra Sorto; Rebecca L. Pierce; Lawrence M. Lesser; Teri J. Murphy
Archive | 2014
Leigh M. Harrell-Williams; M. Alejandra Sorto; Rebecca L. Pierce; Lawrence M. Lesser; Teri J. Murphy
School Psychology Review | 2016
Erin Dowdy; Leigh M. Harrell-Williams; Bridget V. Dever; Michael J. Furlong; Stephanie Moore; Tara C. Raines; Randy W. Kamphaus
Archive | 2014
Shannon M. Suldo; Melanie M. McMahan; Ashley M. Chappel; Lisa P. Bateman; Richard G. Lambert; Do-Hong Kim; Diane C. Burts; Leigh M. Harrell-Williams; M. Alejandra Sorto; Rebecca L. Pierce; Lawrence M. Lesser; Teri J. Murphy; Matthew C. Lambert; Michael H. Epstein; Douglas Cullinan