Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Christine DiStefano is active.

Publication


Featured research published by Christine DiStefano.


Structural Equation Modeling | 2002

The Impact of Categorization With Confirmatory Factor Analysis

Christine DiStefano

This study investigated the impact of categorization on confirmatory factor analysis (CFA) parameter estimates, standard errors, and 5 ad hoc fit indexes. Models were generated to represent empirical research situations in terms of model size, sample sizes, and loading values. CFA results obtained from analysis of normally distributed, continuous data were compared to results obtained from 5-category Likert-type data with normal distributions. The ordered categorical data were analyzed using two estimators: Weighted Least Squares (WLS; with polychoric correlation [PC] input) and Maximum Likelihood (ML; with Pearson Product-Moment [PPM] input). ML-PPM-based parameter estimates showed moderate negative bias across all conditions, and WLS-PC-based standard errors showed high levels of bias, especially with small sample sizes and moderate loading values. With nonnormally distributed, ordered categorical data, ML-PPM-based parameter estimates, standard errors, and factor intercorrelations showed high levels of bias. Bias in standard errors was reduced when the Satorra-Bentler (1988) rescaling correction was applied to nonnormal, ordered categorical data. Five ad hoc model fit indexes appeared robust to the majority of study conditions.
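
A minimal simulation sketch of the categorization problem the study examines: bivariate normal data are cut into 5-category Likert-type responses and the Pearson correlation is compared before and after. The sample size, thresholds, and 0.5 correlation are illustrative assumptions, not the study's generating conditions.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
rho = 0.5
cov = [[1.0, rho], [rho, 1.0]]
continuous = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)

# Cut each continuous variable at symmetric thresholds into 5 ordered categories (1-5)
thresholds = [-1.5, -0.5, 0.5, 1.5]
likert = np.digitize(continuous, bins=thresholds) + 1

r_continuous = np.corrcoef(continuous, rowvar=False)[0, 1]
r_categorized = np.corrcoef(likert, rowvar=False)[0, 1]
print(f"Pearson r, continuous data:  {r_continuous:.3f}")
print(f"Pearson r, 5-category data:  {r_categorized:.3f}  (attenuated)")
```

The attenuation visible here is one reason polychoric correlations, which estimate the correlation of the underlying continuous responses, are often preferred as input when ordered categorical data are analyzed with CFA.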


Structural Equation Modeling | 2006

Further Investigating Method Effects Associated With Negatively Worded Items on Self-Report Surveys

Christine DiStefano; Robert W. Motl

This article used multitrait-multimethod methodology and covariance modeling to investigate the presence and correlates of method effects associated with negatively worded items on the Rosenberg Self-Esteem (RSE) scale (Rosenberg, 1989) in a sample of 757 adults. Results showed that method effects associated with negative item phrasing on the RSE scale were present. Method effects associated with negative item wording were similarly observed with the Social Physique Anxiety Scale (SPAS; Hart, Leary, & Rejeski, 1989), and method effects were present and significantly correlated in analyses that included both the RSE scale and SPAS simultaneously. Path analysis modeling that incorporated personality measures identified factors that correlated with the presence of method effects. These findings further suggest that method effects associated with negatively worded items may be considered a response style.
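
As a rough illustration of the kind of covariance model described above, the sketch below fits a CFA with a global self-esteem factor plus a method factor loaded only by negatively worded items, using the semopy package's lavaan-style syntax on simulated data. The item names, which items are treated as negatively worded, and the data are assumptions for illustration; this is not the authors' exact model or software.

```python
import numpy as np
import pandas as pd
from semopy import Model

# Simulate ten items with a common trait and extra shared variance among the
# (assumed) negatively worded items 3, 5, 8, 9, and 10 -- purely illustrative.
rng = np.random.default_rng(3)
n = 400
trait = rng.normal(size=n)
method = rng.normal(size=n)
negative_items = {3, 5, 8, 9, 10}
data = pd.DataFrame({
    f"rse{i}": 0.7 * trait
    + (0.5 * method if i in negative_items else 0.0)
    + rng.normal(scale=0.6, size=n)
    for i in range(1, 11)
})

model_desc = """
selfesteem =~ rse1 + rse2 + rse3 + rse4 + rse5 + rse6 + rse7 + rse8 + rse9 + rse10
negmethod =~ rse3 + rse5 + rse8 + rse9 + rse10
selfesteem ~~ negmethod
"""

model = Model(model_desc)
model.fit(data)
print(model.inspect())  # nonzero negmethod loadings would indicate a wording method effect
```

Substantial loadings on the method factor, over and above the trait loadings, are the signature of the negative-wording effect the article investigates.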


Journal of Psychoeducational Assessment | 2005

Using Confirmatory Factor Analysis for Construct Validation: An Empirical Review.

Christine DiStefano; Brian Hess

This study investigated the psychological assessment literature to determine what applied researchers use and report from confirmatory factor analysis (CFA) studies as evidence of construct validation. One hundred and one articles published in four major psychological assessment journals between 1990 and 2002 were systematically reviewed. Information from each article was collected across four broad areas: Background, Data Screening, Reporting Results, and Discussing Results. Temporal trends were assessed with point-biserial correlations to determine how practices have changed over time. From the review, several recommendations were provided to help assessment researchers report validity evidence from CFA studies.
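
The temporal-trend analysis mentioned above amounts to correlating publication year with a binary indicator of whether a practice was reported. A small illustration with invented values, not the review's actual coding:

```python
import numpy as np
from scipy.stats import pointbiserialr

# 1 = the article reported the practice (e.g., multiple fit indexes), 0 = it did not
reported_practice = np.array([0, 0, 1, 0, 1, 1, 1, 1])
publication_year = np.array([1990, 1992, 1994, 1996, 1998, 2000, 2001, 2002])

r_pb, p_value = pointbiserialr(reported_practice, publication_year)
print(f"point-biserial r = {r_pb:.2f}, p = {p_value:.3f}")
```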


Structural Equation Modeling | 2002

Longitudinal Invariance of Self-Esteem and Method Effects Associated With Negatively Worded Items

Robert W. Motl; Christine DiStefano

When developing self-report instruments, researchers often have included both positively and negatively worded items to negate the possibility of response bias. Unfortunately, this strategy may interfere with examinations of the latent structure of self-report instruments by introducing method effects, particularly among negatively worded items. The substantive nature of the method effects remains unclear and requires examination. Building on recommendations from previous researchers (Tomás & Oliver, 1999), this study examined the longitudinal invariance of method effects associated with negatively worded items using a self-report measure of global self-esteem. Data were obtained from the National Educational Longitudinal Study (NELS; Ingels et al., 1992) across 3 waves, each separated by 2 years, and the longitudinal invariance of the method effects was tested using LISREL 8.20 with weighted least squares estimation on polychoric correlations and an asymptotic variance/covariance matrix. Our results indicated that method effects associated with negatively worded items exhibited longitudinal invariance of the factor structure, factor loadings, item uniquenesses, factor variances, and factor covariances. Therefore, method effects associated with negatively worded items demonstrated invariance across time, similar to measures of personality traits, and should be considered of potential substantive importance. One possible substantive interpretation is a response style.
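
Invariance of the loadings, uniquenesses, and (co)variances across waves is typically judged by comparing nested models. The sketch below shows the generic chi-square difference computation with placeholder values; it is not output from the NELS analysis, and WLS-based or robust chi-squares ordinarily call for a scaled difference test rather than this naive version.

```python
from scipy.stats import chi2

# Placeholder fit statistics for a configural model and a model with loadings
# constrained equal across the three waves (illustrative values only)
chisq_configural, df_configural = 152.3, 96
chisq_equal_loadings, df_equal_loadings = 160.1, 104

delta_chisq = chisq_equal_loadings - chisq_configural
delta_df = df_equal_loadings - df_configural
p_value = chi2.sf(delta_chisq, delta_df)
print(f"delta chi-square({delta_df}) = {delta_chisq:.1f}, p = {p_value:.3f}")
# A nonsignificant difference is taken as support for loading invariance across waves
```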


Educational and Psychological Measurement | 2006

Investigating Subtypes of Child Development: A Comparison of Cluster Analysis and Latent Class Cluster Analysis in Typology Creation.

Christine DiStefano; Randy W. Kamphaus

Two classification methods, latent class cluster analysis and cluster analysis, are used to identify groups of child behavioral adjustment in a sample of elementary school children aged 6 to 11 years. Behavioral rating information across 14 subscales was obtained from classroom teachers and used as input for the analyses. The procedures and results of both methods were compared. The latent class cluster analysis uncovered three classes representing differing levels of children's behavioral adjustment (well adjusted, average adjustment, functionally impaired), whereas the cluster analysis uncovered seven groups of child behavior. Results show a high degree of overlap, and each procedure offers unique information toward classifying child behavior.
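
A minimal sketch of the two approaches using scikit-learn, with simulated ratings as a stand-in for the 14 teacher-rated subscales: k-means serves as the cluster analysis and a Gaussian mixture (a latent class/profile style model for continuous indicators) serves as the model-based alternative, with BIC used to pick the number of classes. The data, class counts, and settings are illustrative, not the authors' procedure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
ratings = rng.normal(loc=50, scale=10, size=(300, 14))   # stand-in for 14 behavioral subscales
X = StandardScaler().fit_transform(ratings)

# Cluster analysis: choose k and partition the cases
kmeans_labels = KMeans(n_clusters=7, n_init=10, random_state=0).fit_predict(X)

# Latent class / mixture approach: compare solutions by BIC and keep the best
bic_by_k = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X) for k in range(2, 8)}
best_k = min(bic_by_k, key=bic_by_k.get)
print("BIC by number of classes:", bic_by_k)
print("best number of classes by BIC:", best_k)
print("k-means cluster sizes:", np.bincount(kmeans_labels))
```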


Multivariate Behavioral Research | 1997

Behavioral Clustering of School Children.

Carl J. Huberty; Christine DiStefano; Randy W. Kamphaus

The intent of this article is to illustrate how a cluster analysis might be conducted, validated, and interpreted. Normed data from a behavioral assessment instrument with 14 scales, collected on a sample drawn from a nationally representative pool of U.S. school children, were utilized. The analysis discussed covers the cluster method, cluster typology, cluster validity, cluster structure, and prediction of cluster membership.
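
Two of the steps named above, cluster validation and prediction of cluster membership, can be sketched as follows with simulated placeholder data (not the normed assessment sample): an internal validity index for the partition, and a cross-validated discriminant analysis predicting membership from the scale scores.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import silhouette_score
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 14))                            # stand-in for 14 behavioral scales

labels = KMeans(n_clusters=5, n_init=10, random_state=1).fit_predict(X)
print(f"silhouette coefficient: {silhouette_score(X, labels):.3f}")

# Prediction of cluster membership from the same scales via discriminant analysis
lda = LinearDiscriminantAnalysis()
hit_rates = cross_val_score(lda, X, labels, cv=5)
print(f"cross-validated hit rate: {hit_rates.mean():.3f}")
```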


Structural Equation Modeling | 2014

A Comparison of Diagonal Weighted Least Squares Robust Estimation Techniques for Ordinal Data

Christine DiStefano; Grant B. Morgan

This study compared diagonal weighted least squares robust estimation techniques available in 2 popular statistical programs: diagonal weighted least squares (DWLS; LISREL version 8.80), weighted least squares-mean adjusted (WLSM), and weighted least squares-mean and variance adjusted (WLSMV; Mplus version 6.11). A 20-item confirmatory factor analysis model was estimated using item-level ordered categorical data. Three different nonnormality conditions were applied to 2- to 7-category data with sample sizes of 200, 400, and 800. Convergence problems were seen with nonnormal data when DWLS was used with few categories. Both DWLS and WLSMV produced accurate parameter estimates; however, bias in the standard errors of parameter estimates was extreme for select conditions when nonnormal data were present. The robust estimators generally reported acceptable model-data fit, unless few categories were used with nonnormal data at smaller sample sizes; WLSMV yielded better fit than WLSM for most indices.
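
One simple way to produce the kind of nonnormal ordered categorical data described above is to cut a normal latent response at asymmetric thresholds, which skews the observed category frequencies. The threshold placement and category counts below are illustrative assumptions, not the study's exact generating conditions.

```python
import numpy as np

def make_ordinal(n, k, seed=0):
    """Cut a standard-normal latent response at asymmetric thresholds into k ordered categories."""
    rng = np.random.default_rng(seed)
    latent = rng.normal(size=n)
    cuts = np.linspace(-2.0, 0.5, k - 1)   # asymmetric thresholds -> skewed observed item
    return np.digitize(latent, cuts)        # category codes 0 .. k-1

for k in (2, 4, 7):
    item = make_ordinal(n=400, k=k)
    print(k, "categories, observed frequencies:", np.bincount(item, minlength=k))
```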


Psychological Assessment | 2011

Factor Structure of the BASC-2 Behavioral and Emotional Screening System Student Form.

Erin Dowdy; Jennifer M. Twyford; Jenna K. Chin; Christine DiStefano; Randy W. Kamphaus; Kristen L. Mays

The BASC-2 Behavioral and Emotional Screening System (BESS) Student Form (Kamphaus & Reynolds, 2007) is a recently developed youth self-report rating scale designed to identify students at risk for behavioral and emotional problems. The BESS Student Form was derived from the Behavior Assessment System for Children-Second Edition Self-Report of Personality (BASC-2 SRP; Reynolds & Kamphaus, 2004) using principal component analytic procedures and theoretical considerations. Using 3 samples, the authors conducted exploratory factor analyses (EFA) and confirmatory factor analyses (CFA) to understand the underlying factor structure of the BESS Student Form. The results of the EFA suggested that the SRP contained a 4-factor emergent structure (i.e., Personal Adjustment, Inattention/Hyperactivity, Internalizing, School Problems), which was supported by CFA in 2 additional samples. Practical and research implications are discussed.
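
A rough sketch of the exploratory step: extract four factors from item-level responses and see which factor each item loads on most strongly. It uses scikit-learn's FactorAnalysis with varimax rotation on simulated data, so the software, rotation, item count, and data are all stand-ins rather than what the article actually reports.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(7)
items = rng.normal(size=(600, 30))           # stand-in for BESS Student Form item responses

efa = FactorAnalysis(n_components=4, rotation="varimax")
efa.fit(items)
loadings = efa.components_.T                  # items x factors loading matrix

# Assign each item to the factor with its largest absolute loading
primary_factor = np.argmax(np.abs(loadings), axis=1)
print("items per factor:", np.bincount(primary_factor, minlength=4))
```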


Journal of Psychoeducational Assessment | 2006

Investigating the Theoretical Structure of the Stanford-Binet-Fifth Edition

Christine DiStefano; Stefan C. Dombrowski

The fifth edition of the Stanford-Binet test went through significant reformulation of its item content, administration format, standardization procedures, and theoretical structure. Additionally, the test was revised to measure five factors important to intelligence across both verbal and nonverbal domains. To better understand these substantial revisions, the underlying factor structure of the instrument was investigated using both exploratory and confirmatory factor analysis procedures across the five age groups tested by the publishers. Analyses were conducted using 4,800 cases included in the instrument's standardization sample. Results suggested that the verbal/nonverbal domains were identifiable with subjects younger than 10 years of age, whereas a single factor was readily identified with older age groups.


Structural Equation Modeling | 2009

Self-Esteem and Method Effects Associated With Negatively Worded Items: Investigating Factorial Invariance by Sex

Christine DiStefano; Robert W. Motl

The Rosenberg Self-Esteem scale (RSE) has been widely used in examinations of sex differences in global self-esteem. However, previous examinations of sex differences have not accounted for method effects associated with item wording, which have consistently been reported by researchers using the RSE. Accordingly, this study examined the multigroup invariance of global self-esteem and method effects associated with negatively worded items on the RSE between males and females. A correlated traits, correlated methods framework for modeling method effects was combined with a standard multigroup invariance routine using covariance structure analysis. Overall, there were few differences between males and females in terms of the measurement of self-esteem and method effects associated with negatively worded items on the RSE. Our findings suggest that, whereas method effects exist on the RSE scale for both males and females, the method effects associated with negatively worded items do not influence measurement invariance or mean differences in global self-esteem scores between the sexes.

Collaboration


Dive into Christine DiStefano's collaborations.

Top Co-Authors

Jin Liu, University of South Carolina
Erin Dowdy, University of California
Fred W. Greer, University of South Carolina
Grant B. Morgan, University of South Carolina
Robert W. Motl, University of Alabama at Birmingham
Diana Mindrila, University of South Carolina
Yin Burgess, University of South Carolina
Diane M. Monrad, University of South Carolina
Jenna K. Chin, University of California