Publication


Featured research published by Robert M. Groves.


Public Opinion Quarterly | 1992

Understanding the Decision to Participate in a Survey

Robert M. Groves; Robert B. Cialdini; Mick P. Couper

The lack of full participation in sample surveys threatens the inferential value of the survey method. We review a set of conceptual developments and experimental findings that appear to be informative about causes of survey participation; offer an integration of that work with findings from the more traditional statistical and survey methodological literature on nonresponse; and, given the theoretical structure, deduce potentially promising paths of research toward the understanding of survey participation.


Contemporary Sociology | 1993

Measurement Errors in Surveys.

Judith M. Tanur; Paul P. Biemer; Robert M. Groves; Lars E. Lyberg; Nancy A. Mathiowetz; Seymour Sudman

Partial table of contents: THE QUESTIONNAIRE. The Current Status of Questionnaire Design (N. Bradburn & S. Sudman). Context Effects in the General Social Survey (T. Smith). RESPONDENTS AND RESPONSES. Recall Error: Sources and Bias Reduction Techniques (D. Eisenhower, et al.). Toward a Response Model in Establishment Surveys (W. Edwards & D. Cantor). INTERVIEWERS AND OTHER MEANS OF DATA COLLECTION. The Design and Analysis of Reinterview: An Overview (G. Forsman & I. Schreiner). Expenditure Diary Surveys and Their Associated Errors (A. Silberstein & S. Scott). MEASUREMENT ERRORS IN THE INTERVIEW PROCESS. Cognitive Laboratory Methods: A Taxonomy (B. Forsyth & J. Lessler). The Effect of Interviewer and Respondent Characteristics on the Quality of Survey Data: A Multilevel Model (J. Hox, et al.). MODELING MEASUREMENT ERRORS AND THEIR EFFECTS ON ESTIMATION AND DATA ANALYSIS. Approaches to the Modeling of Measurement Errors (P. Biemer & S. Stokes). Evaluation of Measurement Instruments Using a Structural Modeling Approach (W. Saris & F. Andrews). Chi-Squared Tests with Complex Survey Data Subject to Misclassification Error (J. Rao & D. Thomas). References. Index.


Public Opinion Quarterly | 1986

Measuring and Explaining Interviewer Effects in Centralized Telephone Surveys

Robert M. Groves; Lou J. Magilavy

Estimates of interviewer effects on survey statistics are examined from nine surveys conducted over a six-year period at the Survey Research Center. Estimates of intraclass correlations associated with interviewers are found to be unstable, given the number of interviewers (30-40) used on most surveys. This finding calls into question inference from earlier studies of interviewer effects. To obtain more reliable information about magnitudes of interviewer effects, generalized effects are constructed by cumulating estimates over statistics and surveys. These generalized correlations are found to be somewhat smaller than those reported in the past literature. Few differences in generalized interviewer effect measures are found between open and closed questions or between factual and attitudinal questions. Small reductions in effects were obtained when a Computer Assisted Telephone Interviewing (CATI) system was used; there was some evidence of elderly respondents being more susceptible to interviewer effects; the number and type of second responses to open questions were affected by interviewer behavior; and changes in interviewing techniques reduced interviewer effects.
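The interviewer intraclass correlation that Groves and Magilavy estimate can be sketched with a one-way ANOVA decomposition, grouping responses by interviewer. The code below is an illustrative sketch with hypothetical data and a balanced-workload assumption, not the authors' estimator.

```python
# Illustrative sketch (not the authors' code): the ANOVA estimator of the
# interviewer intraclass correlation, assuming a balanced design in which
# every interviewer completes the same number of interviews.

def interviewer_rho(workloads):
    """workloads: one inner list of numeric responses per interviewer."""
    k = len(workloads)                 # number of interviewers
    m = len(workloads[0])              # respondents per interviewer (balanced)
    grand = sum(sum(w) for w in workloads) / (k * m)
    # Between-interviewer and within-interviewer mean squares
    msb = m * sum((sum(w) / m - grand) ** 2 for w in workloads) / (k - 1)
    msw = sum((x - sum(w) / m) ** 2 for w in workloads for x in w) / (k * (m - 1))
    # ANOVA estimator of the intraclass correlation
    return (msb - msw) / (msb + (m - 1) * msw)

# Hypothetical data: 3 interviewers, 4 respondents each
print(round(interviewer_rho([[1, 2, 1, 2], [3, 4, 3, 4], [1, 1, 2, 2]]), 3))  # prints 0.789
```

With the 30-40 interviewers typical of the surveys described above, estimates of this quantity are noisy, which is the instability the paper documents.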


Public Opinion Quarterly | 1999

Differential Incentives: Beliefs about Practices, Perceptions of Equity, and Effects on Survey Participation

Eleanor Singer; Robert M. Groves; Amy Corning

In an effort to counter mounting problems of noncooperation (De Heer and Israels 1992; Groves and Couper 1996), survey organizations are increasingly offering incentives to respondents, sometimes before or during the first request for survey participation. This has traditionally been done in mail surveys, and sometimes only after the person has refused, in an attempt to convert the refusal. In the case of mail surveys, the payment of incentives is one of two design factors that consistently and substantially increase the response rate, the other being the number of contacts (Church 1993; Heberlein and Baumgartner 1978; Yu and Cooper 1983). Incentives are similarly effective in face-to-face and telephone surveys (Singer et al. 1999). There appear to be no deleterious effects of incentives on the quality of survey responses, though further research is needed in this area. Despite these findings, concerns persist about possible unintended consequences of the use of incentives (Singer, Van Hoewyk, and Maher 1998). One concern is that the use of incentives to convert refusals will be perceived as inequitable by cooperative respondents, and, if they learn of the practice, it will adversely affect their attitudes toward surveys and their willingness to cooperate in future surveys (Kulka 1994). This unintended consequence is the focus of the present study. The aim of the study is twofold: to explore the public's reactions to equity issues raised by the use of incentives, and to investigate the effect of such reactions on people's willingness to participate in surveys.


Sociological Methods & Research | 1985

Gender Effects among Telephone Interviewers in a Survey of Economic Attitudes

Robert M. Groves; Nancy H. Fultz

Male and female telephone interviewers are compared on both administrative efficiency and data quality, using data from 24 replications of an attitudinal survey on personal and national economic prospects. The 40 male interviewers used over the two-year period are found to exhibit higher turnover rates and, because of that, lower response rates and higher training costs than the 80 female interviewers. However, there are no real differences on the total per minute interview costs by gender, in missing data rates, or on response distributions for factual questions. There does appear to be a systematic tendency for male interviewers to obtain more optimistic reports from respondents regarding their economic outlook. Multivariate models are constructed that attempt to explain these results and speculations are offered about causes of the impact of interviewer gender on response formation.


American Journal of Public Health | 1985

The effects of respondent rules on health survey reports.

Nancy A. Mathiowetz; Robert M. Groves

Survey researchers believe that self reports, in general, are more accurate than reports obtained by proxy. This paper focuses on the reassessment of previous self/proxy comparisons and presents findings from a telephone adaptation of the National Health Interview Survey (NHIS) designed to investigate response error associated with self and proxy reports. Unlike previous studies in which the type of report is confounded with characteristics of the population home at the time of the interview, the design of this study (random allocation to self or proxy report) allows comparison of reports from similar populations. The results show that when self response is limited to a randomly selected respondent, the self respondents report fewer health events for themselves versus for others in their household.


Quality & Quantity | 1996

Social environmental impacts on survey cooperation

Mick P. Couper; Robert M. Groves

Social environmental influences on survey cooperation are explored using data from six national household surveys in the United States matched to 1990 decennial census data. Consistent with the past literature on prosocial behavior, cooperation rates in these six surveys are found to be lower in urban, densely populated, high crime rate areas. Measures of social cohesion show no evidence of influencing cooperation. The influence of the environmental variables is then observed after introducing statistical controls for household structure, race, age of household members, presence of children, and socioeconomic attributes of households. Over half of the measured influence of the environmental variables is explained by these household-level attributes. These findings have practical import for survey administrators and are informative for the construction of a theory of survey participation.


Demography | 2011

Responsive Survey Design, Demographic Data Collection, and Models of Demographic Behavior

William G. Axinn; Cynthia F. Link; Robert M. Groves

To address declining response rates and rising data-collection costs, survey methodologists have devised new techniques for using process data ("paradata") to address nonresponse by altering the survey design dynamically during data collection. We investigate the substantive consequences of responsive survey design—tools that use paradata to improve the representative qualities of surveys and control costs. By improving representation of reluctant respondents, responsive design can change our understanding of the topic being studied. Using the National Survey of Family Growth Cycle 6, we illustrate how responsive survey design can shape both demographic estimates and models of demographic behaviors based on survey data. By juxtaposing measures from regular and responsive data collection phases, we document how special efforts to interview reluctant respondents may affect demographic estimates. Results demonstrate the potential of responsive survey design to change the quality of demographic research based on survey data.


Journal of the American Statistical Association | 1986

A mean squared error model for dual frame, mixed mode survey design

James M. Lepkowski; Robert M. Groves

An error model for dual frame survey designs is developed. It includes components of error for sampling variance, interviewer variance, and bias in each frame. A cost model that attempts to capture the complexity of a full scale dual frame survey is presented. The error and cost models are applied to a large national survey, the National Crime Survey, and the effect that alternative levels of bias in both frames have on the optimal allocation of sample to the two frames is examined for two types of crime.


Social Science Research | 1978

On the mode of administering a questionnaire and responses to open-ended items

Robert M. Groves
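The kind of error model Lepkowski and Groves describe for dual frame designs can be stated schematically as a mean squared error decomposition. The notation below is illustrative, not the paper's exact model, which has separate sampling and interviewer variance components per frame.

```latex
% Illustrative sketch (our notation; the paper's model is richer):
% MSE of a dual frame estimator combining frames A and B with mixing weight p.
\[
  \mathrm{MSE}(\hat{y}) \;=\;
      \mathrm{Var}_{\text{sampling}}(\hat{y})
    + \mathrm{Var}_{\text{interviewer}}(\hat{y})
    + \bigl(p\,B_A + (1-p)\,B_B\bigr)^{2}
\]
```

Here \(B_A\) and \(B_B\) stand for the bias contributed by each frame and \(p\) for the share of the estimate drawn from frame A; minimizing this quantity under the cost model drives the optimal allocation of sample between the two frames.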

Collaboration


Dive into Robert M. Groves's collaborations.

Top Co-Authors
William D. Mosher

Centers for Disease Control and Prevention
