Publication


Featured research published by Sven Kepes.


Organizational Research Methods | 2012

Publication Bias in the Organizational Sciences

Sven Kepes; George C. Banks; Michael A. McDaniel; Deborah L. Whetzel

Publication bias poses multiple threats to the accuracy of meta-analytically derived effect sizes and related statistics. Unfortunately, a review of the literature indicates that unlike meta-analytic reviews in medicine, research in the organizational sciences tends to pay little attention to this issue. In this article, the authors introduce advances in meta-analytic techniques from the medical and related sciences for a comprehensive assessment and evaluation of publication bias. The authors illustrate their use on a data set on employment interview validities. Using multiple methods, including contour-enhanced funnel plots, trim and fill, Egger’s test of the intercept, Begg and Mazumdar’s rank correlation, meta-regression, cumulative meta-analysis, and selection models, the authors find limited evidence of publication bias in the studied data.
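One of the listed methods, Egger's test of the intercept, is simple enough to sketch: regress each study's standardized effect (effect / SE) on its precision (1 / SE); an intercept far from zero signals funnel-plot asymmetry consistent with publication bias. The sketch below is a minimal numpy-only illustration with invented data; the function name and example values are hypothetical, not taken from the paper.

```python
import numpy as np

def eggers_test(effects, ses):
    """Egger's regression test for funnel-plot asymmetry.

    Regresses standardized effects (effect / SE) on precision (1 / SE).
    Returns (intercept, SE of intercept, t statistic); a t statistic far
    from zero suggests small-study effects.
    """
    effects = np.asarray(effects, dtype=float)
    ses = np.asarray(ses, dtype=float)
    z = effects / ses           # standardized effects (outcome)
    precision = 1.0 / ses       # predictor
    X = np.column_stack([np.ones_like(precision), precision])
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    df = len(z) - 2
    sigma2 = resid @ resid / df                  # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)        # OLS covariance matrix
    se_intercept = np.sqrt(cov[0, 0])
    return beta[0], se_intercept, beta[0] / se_intercept

# Toy example: a roughly symmetric funnel (no induced bias).
rng = np.random.default_rng(0)
ses = rng.uniform(0.05, 0.3, size=40)
effects = 0.25 + rng.normal(0.0, ses)   # true effect 0.25, no asymmetry
intercept, se, t = eggers_test(effects, ses)
print(round(float(intercept), 3), round(float(t), 2))
```

In a real analysis one would compare the t statistic to a t distribution with k − 2 degrees of freedom and, as the abstract stresses, triangulate with the other methods rather than rely on any single test.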


Educational Evaluation and Policy Analysis | 2012

Publication Bias: The Antagonist of Meta-Analytic Reviews and Effective Policymaking

George C. Banks; Sven Kepes; Karen P. Banks

This article offers three contributions for conducting meta-analytic reviews in education research. First, we review publication bias and the challenges it presents for meta-analytic researchers. Second, we review the most recent and optimal techniques for evaluating the presence and influence of publication bias in meta-analyses. We then re-analyze two sets of meta-analytic data from the literacy literature that have been published in different journals. The analyses serve as case examples of the techniques reviewed, and the results demonstrate a range of findings from noticeable instances of publication bias to minimal or no bias. The conclusions have important implications for research, policymaking, and practice. Finally, we discuss recommendations for future research.


PLOS ONE | 2015

The Validity of Conscientiousness Is Overestimated in the Prediction of Job Performance

Sven Kepes; Michael A. McDaniel

Introduction: Sensitivity analyses refer to investigations of the degree to which the results of a meta-analysis remain stable when conditions of the data or the analysis change. To the extent that results remain stable, one can refer to them as robust. Sensitivity analyses are rarely conducted in the organizational science literature. Despite conscientiousness being a valued predictor in employment selection, sensitivity analyses have not been conducted with respect to meta-analytic estimates of the correlation (i.e., validity) between conscientiousness and job performance.

Methods: To address this deficiency, we reanalyzed the largest collection of conscientiousness validity data in the personnel selection literature and conducted a variety of sensitivity analyses.

Results: Publication bias analyses demonstrated that the validity of conscientiousness is moderately overestimated (by around 30%; a correlation difference of about .06). The misestimation of the validity appears to be due primarily to suppression of small effect sizes in the journal literature. These inflated validity estimates result in an overestimate of the dollar utility of personnel selection by millions of dollars and should be of considerable concern for organizations.

Conclusion: The fields of management and applied psychology seldom conduct sensitivity analyses. Through the use of sensitivity analyses, this paper documents that the existing literature overestimates the validity of conscientiousness in the prediction of job performance. Our data show that effect sizes from journal articles are largely responsible for this overestimation.
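The reported figures are internally consistent: a correlation difference of about .06 against a corrected validity near .20 is roughly a 30% relative overestimate (.06 / .20). The pooling that underlies such validity estimates is typically done on Fisher's z scale with inverse-variance weights; the sketch below illustrates that mechanic with hypothetical validities, not the paper's actual dataset.

```python
import numpy as np

def pooled_correlation(rs, ns):
    """Fixed-effect meta-analytic mean correlation via Fisher's z.

    Each correlation is transformed to z = atanh(r), weighted by its
    inverse variance (n - 3), averaged, and back-transformed with tanh.
    """
    rs = np.asarray(rs, dtype=float)
    ns = np.asarray(ns, dtype=float)
    z = np.arctanh(rs)
    w = ns - 3.0                        # 1 / Var(z) for Fisher's z
    return float(np.tanh(np.sum(w * z) / np.sum(w)))

# Hypothetical conscientiousness -> job performance validities:
rs = [0.31, 0.22, 0.18, 0.27, 0.25]
ns = [120, 240, 90, 310, 150]
r_naive = pooled_correlation(rs, ns)

# Arithmetic mirroring the abstract: a correction of about .06 off the
# naive mean implies roughly a 30% relative overestimate.
r_corrected = r_naive - 0.06
overestimate = (r_naive - r_corrected) / r_corrected
print(round(r_naive, 3), round(overestimate, 2))
```

Publication-bias corrections such as trim and fill or selection models adjust `r_naive` by modeling which studies are likely missing; the subtraction above only reproduces the abstract's arithmetic, not any correction method.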


Journal of Applied Psychology | 2013

Assessing the validity of sales self-efficacy: a cautionary tale.

Daniel C. Ganster; Sven Kepes

We developed a focused, context-specific measure of sales self-efficacy and assessed its incremental validity against the broad Big 5 personality traits with department store salespersons, using (a) both a concurrent and a predictive design and (b) both objective sales measures and supervisory ratings of performance. We found that in the concurrent study, sales self-efficacy predicted objective and subjective measures of job performance more than did the Big 5 measures. Significant differences between the predictability of subjective and objective measures of performance were not observed. Predictive validity coefficients were generally lower than concurrent validity coefficients. The results suggest that there are different dynamics operating in concurrent and predictive designs and between broad and contextualized measures; they highlight the importance of distinguishing between these designs and measures in meta-analyses. The results also point to the value of focused, context-specific personality predictors in selection research.


Psychological Bulletin | 2017

Violent video game effects remain a societal concern: Reply to Hilgard, Engelhardt, and Rouder (2017).

Sven Kepes; Brad J. Bushman; Craig A. Anderson

A large meta-analysis by Anderson et al. (2010) found that violent video games increased aggressive thoughts, angry feelings, physiological arousal, and aggressive behavior and decreased empathic feelings and helping behavior. Hilgard, Engelhardt, and Rouder (2017) reanalyzed the data of Anderson et al. (2010) using newer publication bias methods (i.e., precision-effect test, precision-effect estimate with standard error, p-uniform, p-curve). Based on their reanalysis, Hilgard, Engelhardt, and Rouder concluded that experimental studies examining the effect of violent video games on aggressive affect and aggressive behavior may be contaminated by publication bias, and these effects are very small when corrected for publication bias. However, the newer methods Hilgard, Engelhardt, and Rouder used may not be the most appropriate. Because publication bias is a potential problem in any scientific domain, we used a comprehensive sensitivity analysis battery to examine the influence of publication bias and outliers on the experimental effects reported by Anderson et al. We used best meta-analytic practices and the triangulation approach to locate the likely position of the true mean effect size estimates. Using this methodological approach, we found that the combined adverse effects of outliers and publication bias were less severe than what Hilgard, Engelhardt, and Rouder found for publication bias alone. Moreover, the obtained mean effects using recommended methods and practices were not very small in size. The results of the methods used by Hilgard, Engelhardt, and Rouder tended not to converge well with the results of the methods we used, indicating potentially poor performance. We therefore conclude that violent video game effects should remain a societal concern.
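One building block of the "sensitivity analysis battery" described above is an outlier check, and the simplest version is leave-one-out: recompute the pooled mean with each study removed in turn and see how far the estimate moves. The sketch below assumes a fixed-effect inverse-variance mean and invented effect sizes; it is an illustration of the general technique, not the paper's analysis.

```python
import numpy as np

def leave_one_out_means(effects, variances):
    """Leave-one-out sensitivity check for a fixed-effect meta-analysis.

    Returns the inverse-variance-weighted mean recomputed with each
    study dropped in turn. A mean that shifts sharply when one study
    is removed flags that study as an influential outlier.
    """
    effects = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)
    means = []
    for i in range(len(effects)):
        keep = np.arange(len(effects)) != i   # mask out study i
        means.append(np.sum(w[keep] * effects[keep]) / np.sum(w[keep]))
    return np.array(means)

# Hypothetical effect sizes; the fourth study is a deliberate outlier.
effects = [0.20, 0.25, 0.22, 0.80, 0.19]
variances = [0.01, 0.02, 0.015, 0.01, 0.02]
loo = leave_one_out_means(effects, variances)
print(np.round(loo, 3))  # the lowest value appears when the outlier is dropped
```

Triangulation, as the abstract uses the term, means comparing results like these across several bias and outlier methods and trusting the region where they agree.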


Personality and Social Psychology Review | 2018

Effects of Weapons on Aggressive Thoughts, Angry Feelings, Hostile Appraisals, and Aggressive Behavior: A Meta-Analytic Review of the Weapons Effect Literature

Arlin James Benjamin; Sven Kepes; Brad J. Bushman

A landmark 1967 study showed that simply seeing a gun can increase aggression—called the “weapons effect.” Since 1967, many other studies have attempted to replicate and explain the weapons effect. This meta-analysis integrates the findings of weapons effect studies conducted from 1967 to 2017 and uses the General Aggression Model (GAM) to explain the weapons effect. It includes 151 effect-size estimates from 78 independent studies involving 7,668 participants. As predicted by the GAM, our naïve meta-analytic results indicate that the mere presence of weapons increased aggressive thoughts, hostile appraisals, and aggression, suggesting a cognitive route from weapons to aggression. Weapons did not significantly increase angry feelings. Yet, a comprehensive sensitivity analysis indicated that not all naïve mean estimates were robust to the presence of publication bias. In general, these results suggest that the published literature tends to overestimate the weapons effect for some outcomes and moderators.


International Journal of Selection and Assessment | 2014

An Evaluation of Spearman's Hypothesis by Manipulating G Saturation

Michael A. McDaniel; Sven Kepes

Spearman's Hypothesis holds that the magnitude of mean White–Black differences on cognitive tests covaries with the extent to which a test is saturated with g. This paper evaluates Spearman's Hypothesis by manipulating the g saturation of cognitive composites. Using a sample of 16,384 people from the General Aptitude Test Battery database, we show that one can decrease mean racial differences in a g test by altering the g saturation of the measure. Consistent with Spearman's Hypothesis, the g saturation of a test is positively and strongly related to the magnitude of White–Black mean racial differences in test scores. We demonstrate that the reduction in mean racial differences accomplished by reducing the g saturation in a measure is obtained at the cost of lower validity and increased prediction errors. We recommend that g tests varying in mean racial differences be examined to determine whether Spearman's Hypothesis is a viable explanation for the results.


Academy of Management Proceedings | 2018

Assessing the Trustworthiness of our Cumulative Knowledge in Learning, Behavior, and Performance

Sheila K. List; Sven Kepes; Michael A. McDaniel; Xavier MacDaniel

Meta-analytic studies are the primary way for systematically synthesizing quantitative research findings to cumulate knowledge. As such, they have substantial influence on research and practice. Re...


Leadership Quarterly | 2007

Destructive leader traits and the neutralizing influence of an "enriched" job

John Schaubroeck; Fred O. Walumbwa; Daniel C. Ganster; Sven Kepes


Personnel Psychology | 2009

Contingencies in the Effects of Pay Range on Organizational Effectiveness

Sven Kepes; John E. Delery

Collaboration


Dive into Sven Kepes's collaborations.

Top Co-Authors

Michael A. McDaniel
Virginia Commonwealth University

George C. Banks
University of North Carolina at Charlotte

Sheila K. List
Virginia Commonwealth University

John H. Batchelor
University of West Florida