Marlies Vervloet
Katholieke Universiteit Leuven
Publications
Featured research published by Marlies Vervloet.
Journal of Experimental Education | 2017
Mieke Heyvaert; Mariola Moeyaert; Paul Verkempynck; Wim Van den Noortgate; Marlies Vervloet; Maaike Ugille; Patrick Onghena
This article reports on a Monte Carlo simulation study, evaluating two approaches for testing the intervention effect in replicated randomized AB designs: two-level hierarchical linear modeling (HLM) and using the additive method to combine randomization test p values (RTcombiP). Four factors were manipulated: mean intervention effect, number of cases included in a study, number of measurement occasions for each case, and between-case variance. Under the simulated conditions, Type I error rate was under control at the nominal 5% level for both HLM and RTcombiP. Furthermore, for both procedures, a larger number of combined cases resulted in higher statistical power, with many realistic conditions reaching statistical power of 80% or higher. Smaller values for the between-case variance resulted in higher power for HLM. A larger number of data points resulted in higher power for RTcombiP.
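For readers unfamiliar with the additive combination of p values mentioned in this abstract: the rule usually attributed to Edgington sums the k independent p values and compares that sum with the Irwin-Hall distribution, i.e., the distribution of a sum of k independent Uniform(0, 1) variables under the null hypothesis. The sketch below is only an illustration of that generic combination step, assuming independent randomization-test p values; the function name and the example values are invented here and are not taken from the article.

import math

def combine_p_additive(p_values):
    # Under H0 each p value is Uniform(0, 1), so the sum S of the k
    # independent p values follows an Irwin-Hall distribution with k terms.
    k = len(p_values)
    s = sum(p_values)
    # Combined p value = P(Irwin-Hall(k) <= S), using the closed-form CDF.
    return sum((-1) ** j * math.comb(k, j) * (s - j) ** k
               for j in range(math.floor(s) + 1)) / math.factorial(k)

# Hypothetical example: randomization-test p values from three replicated AB designs.
print(combine_p_additive([0.04, 0.10, 0.07]))  # prints roughly 0.0015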
Behavior Research Methods | 2018
Marlies Vervloet; Wim Van den Noortgate; Eva Ceulemans
Behavioral researchers often linearly regress a criterion on multiple predictors, aiming to gain insight into the relations between the criterion and the predictors. Obtaining this insight from the ordinary least squares (OLS) regression solution may be troublesome, because OLS regression weights show only the effect of a predictor on top of the effects of the other predictors. Moreover, when the number of predictors grows larger, the predictors are likely to be highly collinear, which makes the estimates of the regression weights unstable (i.e., the “bouncing beta” problem). Among other procedures, dimension-reduction-based methods have been proposed for dealing with these problems. These methods yield insight into the data by reducing the predictors to a smaller number of summarizing variables and regressing the criterion on these summarizing variables. Two promising methods are principal-covariate regression (PCovR) and exploratory structural equation modeling (ESEM). Both simultaneously optimize reduction and prediction, but they are based on different frameworks. The resulting solutions have not yet been compared, so it is unclear what the strengths and weaknesses of the two methods are. In this article, we focus on the extent to which PCovR and ESEM are able to extract the factors that truly underlie the predictor scores and to predict a single criterion. The results of two simulation studies showed that for a typical behavioral dataset, ESEM (using the BIC for model selection) is more often successful in this regard than PCovR. Yet, in 93% of the datasets PCovR performed equally well, and in the case of 48 predictors, 100 observations, and large differences in the strengths of the factors, PCovR even outperformed ESEM.
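As a rough illustration of the PCovR idea described in this abstract, the sketch below follows the well-known closed-form solution for principal covariates regression: a weighting parameter alpha balances accounting for the predictors against predicting the criterion, and the component scores are obtained from a singular value decomposition of the correspondingly weighted data. This is a minimal sketch under those assumptions, not the implementation evaluated in these papers; the function name, the fixed default alpha of 0.5, and the least-squares loading step are illustrative choices (in practice alpha and the number of components are chosen with a model-selection procedure).

import numpy as np

def pcovr(X, y, n_components=2, alpha=0.5):
    # Center the predictors and the criterion.
    X = X - X.mean(axis=0)
    y = y - y.mean()
    # OLS prediction of y, i.e., the projection of y onto the column space of X.
    y_hat = X @ np.linalg.lstsq(X, y, rcond=None)[0]
    # Weight the normalized predictor block by alpha and the predicted criterion
    # by 1 - alpha; the leading left singular vectors of this concatenation are
    # orthonormal component scores that balance reduction and prediction.
    Z = np.hstack([np.sqrt(alpha) * X / np.linalg.norm(X),
                   np.sqrt(1 - alpha) * y_hat[:, None] / np.linalg.norm(y)])
    T = np.linalg.svd(Z, full_matrices=False)[0][:, :n_components]
    # Predictor loadings and regression weights of the criterion on the components.
    Px = np.linalg.lstsq(T, X, rcond=None)[0]
    py = np.linalg.lstsq(T, y, rcond=None)[0]
    return T, Px, py

# Hypothetical example with simulated data.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
y = X[:, :3].sum(axis=1) + rng.standard_normal(100)
T, Px, py = pcovr(X, y, n_components=3)

With alpha = 1 the components reduce to ordinary principal components of X; with alpha close to 0 the criterion dominates and the first component approaches the OLS prediction of y.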
Journal of Statistical Software | 2015
Marlies Vervloet; Henk A. L. Kiers; Wim Van den Noortgate; Eva Ceulemans
Chemometrics and Intelligent Laboratory Systems | 2013
Marlies Vervloet; Katrijn Van Deun; Wim Van den Noortgate; Eva Ceulemans
Chemometrics and Intelligent Laboratory Systems | 2016
Marlies Vervloet; Katrijn Van Deun; Wim Van den Noortgate; Eva Ceulemans
Archive | 2016
Marlies Vervloet; Katrijn Van Deun; Wim Van den Noortgate; Eva Ceulemans
International Federation of Classification Societies | 2015
Marlies Vervloet; Katrijn Van Deun; Wim Van den Noortgate; Eva Ceulemans
Archive | 2014
Marlies Vervloet; Wim Van den Noortgate; Eva Ceulemans
Archive | 2013
Marlies Vervloet; Katrijn Van Deun; Wim Van den Noortgate; Eva Ceulemans
Archive | 2012
Marlies Vervloet; Eva Ceulemans; Marieke E. Timmerman; Wim Van den Noortgate