Publications


Featured research published by Olena Kaminska.


Journal of Official Statistics | 2017

Survey-Based Cross-Country Comparisons Where Countries Vary in Sample Design: Issues and Solutions

Olena Kaminska; Peter Lynn

Abstract: In multi-national surveys, different countries usually implement different sample designs. The sample designs affect the variance of estimates of differences between countries. When making such estimates, analysts often fail to take sufficient account of sample design. This failure occurs sometimes because variables indicating stratification, clustering, or weighting are unavailable, partially available, or in a form that is unsuitable for cross-national analysis. In this article, we demonstrate how complex sample design should be taken into account when estimating differences between countries, and we provide practical guidance to analysts and to data producers on how to deal with partial or inappropriately coded sample design indicator variables. Using EU-SILC as a case study, we evaluate the inverse misspecification effect (imeff) that results from ignoring clustering or stratification, or both, in a between-country comparison where countries’ sample designs differ. We present imeff for estimates of between-country differences in a number of demographic and economic variables for 19 European Union Member States. We assess the magnitude of imeff and the associated impact on standard error estimates. Our empirical findings illustrate that it is important for data producers to supply appropriate sample design indicators and for analysts to use them.
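
To make concrete why differing designs matter for this comparison, here is a minimal sketch in generic notation (mine, not the paper's). With independent samples in countries A and B, the variance of an estimated between-country difference combines each country's own design effect:

\[
\operatorname{var}\!\left(\hat{\theta}_A - \hat{\theta}_B\right)
  \;=\; \mathrm{deff}_A \,\operatorname{var}_{\mathrm{srs}}\!\left(\hat{\theta}_A\right)
  \;+\; \mathrm{deff}_B \,\operatorname{var}_{\mathrm{srs}}\!\left(\hat{\theta}_B\right).
\]

Here deff_A and deff_B summarise the impact of each country's clustering, stratification, and weighting. Because the two designs differ, ignoring either one misstates the standard error of the difference, which is the impact that imeff quantifies in the article.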


Journal of Official Statistics | 2014

Panel Attrition: How Important is Interviewer Continuity?

Peter Lynn; Olena Kaminska; Harvey Goldstein

Abstract: We assess whether the probability of a sample member cooperating at a particular wave of a panel survey is greater if the same interviewer is deployed as at the previous wave. Previous research on this topic mainly uses nonexperimental data. Consequently, a) interviewer change is generally nonrandom, and b) continuing interviewers are more experienced by the time of the next wave. Our study is based on a balanced experiment in which both interviewer continuity and experience are controlled. Multilevel multiple membership models are used to explore the effects of interviewer continuity on refusal rate as well as interactions of interviewer continuity with other variables. We find that continuity reduces refusal propensity for younger respondents but not for older respondents, and that this effect depends on the age of the interviewer. This supports the notion that interviewer continuity may be beneficial in some situations, but not necessarily in others.
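
For readers unfamiliar with the modelling approach named above, a rough sketch of a multilevel multiple membership model for refusal, in notation that is mine rather than the authors':

\[
\operatorname{logit}\Pr\!\left(\text{refusal}_i = 1\right)
  \;=\; \mathbf{x}_i^{\top}\boldsymbol{\beta}
  \;+\; \sum_{j \in \mathcal{J}(i)} w_{ij}\, u_j,
\qquad u_j \sim N\!\left(0, \sigma_u^{2}\right),
\]

where \mathcal{J}(i) is the set of interviewers linked to sample member i across waves and the membership weights w_{ij} sum to one. Interviewer continuity, and its interactions with respondent and interviewer characteristics such as age, enter through \mathbf{x}_i.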


Bulletin of Sociological Methodology/Bulletin de Méthodologie Sociologique | 2016

Eye-tracking Social Desirability Bias

Olena Kaminska; Tom Foulsham

Eye tracking is now a common technique for studying the moment-by-moment cognition of those processing visual information. Yet this technique has rarely been applied to different survey modes. Our paper uses an innovative method of real-world eye tracking to look at attention to sensitive questions and response scale points in Web, face-to-face, and paper-and-pencil self-administered (SAQ) modes. We link gaze duration to responses in order to understand how respondents arrive at socially desirable or undesirable answers. Our novel technique sheds light on how social desirability biases arise from deliberate misreporting and/or satisficing, and how these vary across modes.
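
As an illustration of what linking gaze duration to responses can look like in practice, a minimal sketch (not the authors' code; the column names and figures are hypothetical):

    # Aggregate fixation time on a sensitive question's area of interest and
    # compare mean dwell time between socially desirable and undesirable answers.
    import pandas as pd

    fixations = pd.DataFrame({          # hypothetical eye-tracking export
        "respondent": [1, 1, 2, 2, 3],
        "aoi": ["Q_sensitive", "scale", "Q_sensitive", "scale", "Q_sensitive"],
        "duration_ms": [420, 180, 260, 90, 510],
    })
    responses = pd.DataFrame({          # hypothetical survey responses
        "respondent": [1, 2, 3],
        "desirable_answer": [True, True, False],
    })

    # Total dwell time per respondent on the sensitive question, linked to the answer given.
    dwell = (fixations[fixations["aoi"] == "Q_sensitive"]
             .groupby("respondent")["duration_ms"].sum()
             .reset_index(name="dwell_ms"))
    linked = responses.merge(dwell, on="respondent", how="left")
    print(linked.groupby("desirable_answer")["dwell_ms"].mean())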


Journal of Official Statistics | 2017

The Implications of Alternative Allocation Criteria in Adaptive Design for Panel Surveys

Olena Kaminska; Peter Lynn

Abstract: Adaptive survey designs can be used to allocate sample elements to alternative data collection protocols in order to achieve a desired balance between some quality measure and survey costs. We compare four alternative methods for allocating sample elements to one of two data collection protocols. The methods differ in terms of the quality measure that they aim to optimize: response rate, R-indicator, coefficient of variation of the participation propensities, or effective sample size. Costs are also compared for a range of sample sizes. The data collection protocols considered are CAPI single-mode and web-CAPI sequential mixed-mode. We use data from a large experiment with random allocation to one of these two protocols. For each allocation method we predict outcomes in terms of several quality measures and costs. Although allocating the whole sample to single-mode CAPI produces a higher response rate than allocating the whole sample to the mixed-mode protocol, we find that two of the targeted allocations achieve a better response rate than single-mode CAPI at a lower cost. We also find that all four of the targeted designs outperform both single-protocol designs in terms of representativity and effective sample size. For all but the smallest sample sizes, the adaptive designs bring cost savings relative to CAPI-only, though these are fairly modest in magnitude.
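
The four quality measures being compared have fairly standard definitions; the sketch below (not the authors' code) computes them from estimated participation propensities. The nonresponse-adjusted weights and the Kish-style effective sample size are assumptions made here for illustration, not necessarily the paper's exact definitions.

    import numpy as np

    def allocation_criteria(propensities, base_weights=None):
        """Quality measures for a candidate allocation of sample elements to protocols."""
        p = np.asarray(propensities, dtype=float)
        w = np.ones_like(p) if base_weights is None else np.asarray(base_weights, dtype=float)

        rr = np.average(p, weights=w)                      # expected response rate
        s = np.sqrt(np.average((p - rr) ** 2, weights=w))  # spread of the propensities
        r_indicator = 1.0 - 2.0 * s                        # R-indicator: R = 1 - 2 * S(p)
        cv = s / rr                                        # coefficient of variation of propensities
        adj_w = w / p                                      # nonresponse-adjusted weights (assumed)
        n_eff = adj_w.sum() ** 2 / (adj_w ** 2).sum()      # Kish-style effective sample size
        return {"response_rate": rr, "R_indicator": r_indicator, "cv": cv, "n_eff": n_eff}

    # e.g. criteria for one candidate allocation, with illustrative propensities
    print(allocation_criteria([0.45, 0.62, 0.38, 0.71]))

Comparing such figures, together with per-case costs, across candidate allocations is the kind of trade-off the abstract describes.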


Public Opinion Quarterly | 2010

Satisficing Among Reluctant Respondents in a Cross-National Context

Olena Kaminska; Allan L. McCutcheon; Jaak Billiet


Public Opinion Quarterly | 2013

The Impact of Mobile Phones on Survey Measurement Error

Peter Lynn; Olena Kaminska


Archive | 2012

Using EU-SILC data for cross-national analysis: strengths, problems and recommendations

Maria Iacovou; Olena Kaminska; Horacio Levy


Public Opinion Quarterly | 2010

Recruiting Probability Samples for a Multi-Mode Research Panel with Internet and Mail Components

Kumar Nagaraja Rao; Olena Kaminska; Allan L. McCutcheon


Archive | 2012

An initial look at non-response and attrition in Understanding Society

Peter Lynn; Jonathan Burton; Olena Kaminska; Gundi Knies; Alita Nandi


Archive | 2013

Understanding Society Innovation Panel Wave 5: Results from Methodological Experiments

Mathew Creighton; Jennifer Dykema; Alessandra Gaia; Alexandru Cernat; Dana Garbarski; Amaney Jamal; Olena Kaminska; Florian Keusch; Peter Lynn; Daniel Oberski; Nora Cate Schaeffer; S. C. Noah Uhrig; Ting Yan

Collaboration


Olena Kaminska's top co-authors and their affiliations.

Allan L. McCutcheon

University of Nebraska–Lincoln

Dana Garbarski

Loyola University Chicago

Jennifer Dykema

University of Wisconsin–Platteville

Nora Cate Schaeffer

University of Wisconsin–Madison

Ting Yan

University of Michigan
