
Publications


Featured research published by Jennifer Hunter Childs.


Field Methods | 2006

Analyzing Interviewer/Respondent Interactions while Using a Mobile Computer-Assisted Personal Interview Device

Jennifer Hunter Childs; Ashley Landreth

This article explores interviewer/respondent interactions using a handheld computer (HHC) and identifies issues arising from using a self-administered paper form to create a mobile computer-assisted personal interview (CAPI) instrument. To evaluate the success of this instrument, the authors applied the behavior-coding method to a sample of about 220 audiotaped interviews to identify survey questions that cause problems at the administration and/or response stage. This article explores problems with the interview, as it was conducted on an HHC, both at a question level and overall. The objective of this study was to investigate whether the automation of this survey instrument encouraged standardized interviewing procedures by examining how interviewers read the questions and how respondents answered them. The authors discuss mode-specific problems that arose in the interview and propose suggestions for future surveys that use a similar method of data collection.
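The behavior-coding analysis described above reduces to a per-question tally of problematic exchanges. A minimal sketch in Python, assuming a hypothetical coding scheme and a 15% flagging threshold (the article does not publish its codes or cutoffs):

```python
from collections import Counter

# Each record: (question_id, interviewer_code, respondent_code).
# Codes are illustrative: "E" = exact reading, "M" = major change to wording,
# "AA" = adequate answer, "IA" = inadequate answer.
coded_exchanges = [
    ("Q1", "E", "AA"), ("Q1", "M", "IA"), ("Q2", "E", "AA"),
    ("Q2", "M", "AA"), ("Q2", "M", "IA"), ("Q3", "E", "AA"),
]

PROBLEM_THRESHOLD = 0.15  # common rule of thumb; assumed, not from the article

totals, problems = Counter(), Counter()
for qid, interviewer_code, respondent_code in coded_exchanges:
    totals[qid] += 1
    # Count the exchange as problematic if either side deviated.
    if interviewer_code == "M" or respondent_code == "IA":
        problems[qid] += 1

for qid in sorted(totals):
    rate = problems[qid] / totals[qid]
    flag = "FLAG" if rate > PROBLEM_THRESHOLD else "ok"
    print(f"{qid}: problem rate {rate:.0%} [{flag}]")
```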


Field Methods | 2009

Respondent Debriefings Conducted by Experts: A Technique for Questionnaire Evaluation

Elizabeth Nichols; Jennifer Hunter Childs

This article explores the use of expert respondent debriefings to evaluate the quality of survey data. In this case study, subject-matter experts observed 169 interviews and conducted qualitative respondent debriefings on select cases in a field test of a census coverage survey. By comparing the “true” residence status for each person, as determined by the debriefing, against the residence status obtained by the questionnaire alone, the authors determined whether the questionnaire was collecting accurate information. Of the 473 people for whom survey data were available, the questionnaire assigned an incorrect residence status code, in a way that would have detrimentally affected coverage estimates, in only five cases. The respondent debriefing technique helped pinpoint specific problems in the questionnaire, as well as confirm that the questionnaire performed adequately in most cases. This article describes the expert respondent debriefing methodology that was used in a face-to-face interview setting and discusses how it could be adapted to telephone interviewing.
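The record check here amounts to comparing two status assignments per person. A small illustrative sketch, with invented records and status labels (not the survey's actual data or coding):

```python
# Hypothetical comparison of questionnaire status vs. debriefing ("true") status.
people = [
    {"id": 1, "questionnaire": "resident",    "debriefing": "resident"},
    {"id": 2, "questionnaire": "resident",    "debriefing": "nonresident"},
    {"id": 3, "questionnaire": "nonresident", "debriefing": "nonresident"},
]

mismatches = [p["id"] for p in people if p["questionnaire"] != p["debriefing"]]
accuracy = 1 - len(mismatches) / len(people)
print(f"accuracy {accuracy:.1%}; disagreements for ids {mismatches}")
```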


Bulletin of Sociological Methodology/Bulletin de Méthodologie Sociologique | 2011

Measuring Race and Hispanic Origin: Cognitive Test Findings Searching for “Truth”

Jennifer Hunter Childs; Rodney Terry; Nathan Jurgenson

This research describes an attempt by US Census Bureau staff to create a “gold standard” assessment of the “truth” of self-identified race in order to evaluate the performance of an experimental panel of race and Hispanic origin questions. This gold standard is achieved by asking about race in three ways: (1) an open-ended question that allows the respondent to self-identify with any races or Hispanic origins; (2) a series of yes/no questions aimed at measuring identification with the US government’s race and Hispanic origin categories; and (3) a summary measure which attempts to gather the respondent’s usual or typical response to race and Hispanic origin questions. We argue that while no single measure taken alone captures the truth of race, all three measures taken together provide a robust portrait of self-identified race and Hispanic origin for nearly all respondents in our sample.
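One way to picture the three-measure design is to treat each measure as a vote for one or more categories and look for agreement. The majority rule below is an illustrative reconciliation sketch, not an algorithm from the article:

```python
from collections import Counter

def gold_standard(open_ended, yes_no, usual):
    """Return the categories supported by at least two of the three
    measures, or an empty set if nothing reaches agreement."""
    votes = Counter()
    for measure in (open_ended, yes_no, usual):
        for category in set(measure):
            votes[category] += 1
    return {c for c, n in votes.items() if n >= 2}

# Example: the open-ended and summary reports agree on two origins,
# while the yes/no series captures only one of them.
print(gold_standard({"White", "Hispanic"}, {"White"}, {"White", "Hispanic"}))
# -> {'White', 'Hispanic'} (set element order may vary)
```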


Bulletin of Sociological Methodology/Bulletin de Méthodologie Sociologique | 2017

Exploring Inconsistent Counts of Racial/Ethnic Minorities in a 2010 Census Ethnographic Evaluation

Rodney Terry; Laurie Schwede; Ryan King; Mandi Martinez; Jennifer Hunter Childs

Previous research has shown differential counts by race and ethnicity across several recent United States decennial censuses. This article presents findings from a 2010 Census ethnographic evaluation with a record check, conducted to identify factors affecting enumeration among racial/ethnic groups. In eight sites targeted to major racial/ethnic groups, ethnographers observed live census interviews and assessed where persons should have been counted. In the record check, housing unit rosters were matched with four data sources to identify inconsistencies in where to count persons. Ethnographic themes that contributed to record check inconsistencies include respondent access difficulty, language issues, and cultural issues. Ways to improve enumeration include improving access to hard-to-reach respondents and increasing the cultural awareness of enumerators.
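The record-check step can be sketched as a roster-versus-sources comparison. Everything below is hypothetical: names, housing units, and sources are invented, and only two sources are shown for brevity (the evaluation matched four):

```python
# Flag people whose recorded location disagrees with the roster in any source.
roster = {"Ana": "HU-12", "Ben": "HU-12", "Carla": "HU-12"}

sources = {
    "census":          {"Ana": "HU-12", "Ben": "HU-12", "Carla": "HU-07"},
    "coverage_survey": {"Ana": "HU-12", "Carla": "HU-12"},
}

for person, unit in roster.items():
    disagreeing = [
        name for name, records in sources.items()
        if records.get(person) not in (None, unit)  # absent != inconsistent
    ]
    if disagreeing:
        print(f"{person}: roster has {unit}; inconsistent with {disagreeing}")
```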


Social Change | 2016

Social Mobility in Switzerland in the 20th Century: Between the Democratization of Education and the Persistence of Class Inequalities

Mary H. Mulry; Elizabeth Nichols; Jennifer Hunter Childs

Correctly recalling where someone lived as of a particular date is critical to the accuracy of the once-a-decade U.S. decennial census. Data collection for the 2010 Census occurred over the course of a few months, February to August, with some evaluation operations occurring up to 7 months after that. The assumption was that respondents could accurately remember moves and move dates on and around April 1st up to 11 months afterwards. We show how statistical analyses can be used to investigate the validity of this assumption by comparing self-reports and proxy-reports of the month of a move in a U.S. Census Bureau survey with an administrative records database from the U.S. Postal Service containing requests to forward mail filed in March and April of 2010. In our dataset, we observed that the length of time since the move affects memory error in reports of a move and the month of a move. Memory error is also affected by whether the respondent is reporting for themselves or for another person in the household. This case study is relevant to surveys as well as censuses because move dates and places of residence often serve as anchors to aid memory of other events in questionnaires.
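The core comparison is between a survey-reported move month and the month on the matched USPS change-of-address record, broken out by self versus proxy reports. A toy sketch with invented records and field names:

```python
# Each row: the month reported in the survey, the month on the matched USPS
# record, whether it was a proxy report, and months elapsed since the move.
reports = [
    {"survey_month": 3, "usps_month": 3, "proxy": False, "months_elapsed": 2},
    {"survey_month": 4, "usps_month": 3, "proxy": True,  "months_elapsed": 9},
    {"survey_month": 5, "usps_month": 4, "proxy": True,  "months_elapsed": 10},
]

def error_rate(rows):
    """Share of rows where the reported month disagrees with the USPS record."""
    return sum(r["survey_month"] != r["usps_month"] for r in rows) / len(rows)

self_rows  = [r for r in reports if not r["proxy"]]
proxy_rows = [r for r in reports if r["proxy"]]
print(f"self-report error rate:  {error_rate(self_rows):.0%}")
print(f"proxy-report error rate: {error_rate(proxy_rows):.0%}")
```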


Public Opinion Quarterly | 2014

Social Media in Public Opinion Research: Executive Summary of the AAPOR Task Force on Emerging Technologies in Public Opinion Research

Joe Murphy; Michael W. Link; Jennifer Hunter Childs; Casey Langer Tesfaye; Elizabeth Dean; Michael J. Stern; Josh Pasek; Jon Cohen; Mario Callegaro; Paul Harwood


Public Opinion Quarterly | 2014

Mobile Technologies for Conducting, Augmenting and Potentially Replacing Surveys: Executive Summary of the AAPOR Task Force on Emerging Technologies in Public Opinion Research

Michael W. Link; Joe Murphy; Michael F. Schober; Trent D. Buskirk; Jennifer Hunter Childs; Casey Langer Tesfaye


Archive | 2014

Social media in public opinion research: Report of the AAPOR task force on emerging technologies in public opinion research

Joseph Murphy; Michael W. Link; Jennifer Hunter Childs; Casey Langer Tesfaye; Elizabeth Dean; Michael J. Stern; Josh Pasek; Jon Cohen; Mario Callegaro; Paul Harwood


Survey Practice | 2013

The Efficiency of Conducting Concurrent Cognitive Interviewing and Usability Testing on an Interviewer-Administered Survey

Jennifer C. Romano Bergstrom; Jennifer Hunter Childs; Erica L. Olmsted-Hawala; Nathan Jurgenson


Archive | 2008

2006 Questionnaire Design and Experimental Research Survey: Demographic Questions Analysis

Elizabeth Nichols; Jennifer Hunter Childs; Rolando Rodríguez; Aref Dajani; Jennifer Rothgeb

Collaboration


Dive into Jennifer Hunter Childs's collaborations.

Top Co-Authors

Elizabeth Nichols
United States Census Bureau

Nathan Jurgenson
United States Census Bureau

Josh Pasek
University of Michigan

Rodney Terry
United States Census Bureau

Mary H. Mulry
United States Census Bureau