Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Andy Peytchev is active.

Publications


Featured research published by Andy Peytchev.


Social Science Computer Review | 2010

Experiments in Mobile Web Survey Design

Andy Peytchev; Craig A. Hill

Self-administered surveys can be conducted on mobile web-capable devices, yet these devices have unique features that can affect response processes. Ninety-two adults were randomly selected and provided with mobile devices to complete weekly web surveys. Experiments were designed to address three main objectives. First, the authors test fundamental findings which have been found robust across other modes, but whose impact may be diminished in mobile web surveys (due largely to the device), by manipulating question order and scale frequencies. Second, the authors test findings from experiments in computer-administered web surveys, altering the presentation of images and the number of questions per page. Third, the authors experiment with the unique display, navigation, and input methods, through the need to scroll, the vertical versus horizontal orientation of scales, and the willingness to provide open-ended responses. Although most findings from other modes are upheld, the small screen and keyboard introduce undesirable differences in responses.


Social Science Computer Review | 2005

A Typology of Real-Time Validations in Web-Based Surveys

Andy Peytchev; Scott D. Crawford

One area in computer-assisted self-interviewing (CASI) where methodological and empirical research have not caught up with technological advancements is real-time validation of respondent input. Some of the literature on computer-assisted interviewing and postsurvey editing is pertinent yet far from sufficient; the largest component that cannot be deduced from it is the respondent interaction. This leads current practitioners to employ or avoid validations based on untested assumptions. This article presents a model that breaks down the different elements of survey validation by type of computation, implementation, and interaction with the respondent. The purpose is twofold: to describe what possibilities exist in validations for web-based surveys and to present a framework in which systematic research can be conducted to evaluate the impact of validations on survey costs and errors. Relevant findings from prior literature are discussed in this context.
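To make the typology concrete, here is a minimal sketch (not from the article; the item, bounds, and function names are hypothetical) of two respondent-interaction styles it covers: a hard, blocking validation versus a soft, non-blocking warning.

```python
def validate_age(raw):
    """Hard (blocking) real-time validation for a numeric survey item.

    Returns (is_valid, message); the message would be shown to the
    respondent immediately, before the survey advances.
    """
    try:
        value = int(raw)
    except ValueError:
        return False, "Please enter a whole number."
    if not 18 <= value <= 120:
        return False, "Please enter an age between 18 and 120."
    return True, ""


def soft_check(raw):
    """Soft (non-blocking) variant: warn but let the respondent proceed.

    Returns a warning string, or None when the input passes the check.
    """
    ok, message = validate_age(raw)
    return None if ok else message
```

The computation here is a simple type-and-range check; richer validations in the typology could compare answers across items or against external data, and the interaction could range from silent logging to forced correction.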


Field Methods | 2011

The Effects of Differential Interviewer Incentives on a Field Data Collection Effort

Jeffrey Rosen; Joe Murphy; Andy Peytchev; Sarah Riley; Mark R. Lindblad

Surveys routinely offer incentives to motivate respondents and increase the likelihood of their participation, yet surprisingly little is known about the effectiveness of interviewer incentives. If interviewer incentives increase interviewers’ success in gaining cooperation, they could help address declining survey response rates. In this article, we present the results of an experiment testing the effectiveness of interviewer incentives in the form of cash bonuses for each successfully completed field interview. We did not find evidence that higher payments to interviewers for each completion led to increased effort on the part of interviewers, nor did they lead to higher levels of success in securing respondent cooperation. These findings suggest that per-complete interviewer incentives may not be cost effective in reducing survey nonresponse.


Social Science Computer Review | 2010

Coverage Bias in Variances, Associations, and Total Error From Exclusion of the Cell Phone-Only Population in the United States

Andy Peytchev; Lisa R. Carley-Baxter; Michele C. Black

Although landline telephone household surveys often draw inference about the general population, a proportion of the population with only cell phones is excluded. In the United States, as in much of the world, this proportion is substantial and increasing, creating the potential for coverage bias. Studies have looked at bias in means, but undercoverage can affect other essential statistics. The precision of point estimates can be biased, leading to erroneous conclusions. Research examining multivariate relationships will be further affected by bias in associations. A national landline telephone survey was conducted, followed by a survey of adults with only cell phones. In addition to estimates of means and proportions, differences were found in variances and associations. Bias in some point estimates was reduced through poststratification but became larger and in the opposite direction for others. Different uses of survey data can be affected by omitting the cell-only population, and reliance on postsurvey adjustments can be misleading.
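The poststratification adjustment mentioned above can be illustrated with a small, hypothetical calculation (the age groups, shares, and outcome values are invented, not from the study): the sample is reweighted so its composition matches known population totals, but the adjustment cannot correct differences *within* cells, such as cell-only adults differing from landline adults of the same age.

```python
# Known population age distribution vs. the realized (landline) sample,
# which underrepresents young adults.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_share     = {"18-34": 0.15, "35-54": 0.35, "55+": 0.50}

# Poststratification weight per cell: population share / sample share.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Mean of some outcome y within each cell, as observed in the sample.
y_by_group = {"18-34": 0.60, "35-54": 0.45, "55+": 0.30}

# Unweighted estimate reflects the skewed sample composition...
unweighted = sum(sample_share[g] * y_by_group[g] for g in y_by_group)

# ...while the weighted estimate matches the population age mix.
adjusted = sum(weights[g] * sample_share[g] * y_by_group[g] for g in y_by_group)
```

The weighting fixes the age composition, but if cell-only 18- to 34-year-olds differ from the landline 18- to 34-year-olds actually interviewed, the adjusted estimate remains biased, which is the caveat the abstract raises about relying on postsurvey adjustments.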


Field Methods | 2016

Financial Record Checking in Surveys: Do Prompts Improve Data Quality?

Joe Murphy; Jeffrey Rosen; Ashley Richards; Sarah Riley; Andy Peytchev; Mark R. Lindblad

Self-reports of financial information in surveys, such as wealth, income, and assets, are particularly prone to inaccuracy. We sought to improve the quality of financial information captured in a survey conducted by phone and in person by encouraging respondents to check records when reporting on income and assets. We investigated whether suggestive prompts influenced unit response, compliance with the request to check records, precision of estimates, and accuracy. We conducted a split sample experiment in the Community Advantage Panel Survey in which half of telephone respondents and half of in-person household interview respondents were encouraged to check the records. We found a modest positive effect of prompts on compliance but no effect on unit response, precision, or accuracy.


Public Opinion Quarterly | 2006

Web Survey Design: Paging versus Scrolling

Andy Peytchev; Mick P. Couper; Sean Esteban McCabe; Scott D. Crawford


Journal of Medical Internet Research | 2007

Following up nonrespondents to an online weight management intervention: Randomized trial comparing mail versus telephone

Mick P. Couper; Andy Peytchev; Victor J. Strecher; Kendra Rothert; J. Anderson


Public Opinion Quarterly | 2007

Effect of Interviewer Experience on Interview Pace and Interviewer Attitudes

Kristen Olson; Andy Peytchev


Public Opinion Quarterly | 2009

Not All Survey Effort Is Equal: Reduction of Nonresponse Bias and Nonresponse Error

Andy Peytchev; Rodney K. Baxter; Lisa R. Carley-Baxter


Public Opinion Quarterly | 2010

Measurement Error, Unit Nonresponse, and Self-Reports of Abortion Experiences

Andy Peytchev; Emilia Peytcheva; Robert M. Groves

Collaboration


Dive into Andy Peytchev's collaborations.

Top Co-Authors
Mark R. Lindblad

University of North Carolina at Chapel Hill

Sarah Riley

University of North Carolina at Chapel Hill

Kristen Olson

University of Nebraska–Lincoln


Paul P. Biemer

University of North Carolina at Chapel Hill
