Elizabeth Nichols
United States Census Bureau
Publications
Featured research published by Elizabeth Nichols.
Field Methods | 2009
Elizabeth Nichols; Jennifer Hunter Childs
This article explores the use of expert respondent debriefings to evaluate the quality of survey data. In this case study, subject-matter experts observed 169 interviews and conducted qualitative respondent debriefings on selected cases in a field test of a census coverage survey. By comparing the “true” residence status for each person, as determined by the debriefing, against the residence status obtained by the questionnaire alone, the authors determined whether the questionnaire was collecting accurate information. For the 473 people for whom survey data were available, the questionnaire failed to assign the correct residence status code, in a way that would have detrimentally affected coverage estimates, in only five cases. The respondent debriefing technique helped pinpoint specific problems in the questionnaire and confirmed that the questionnaire performed adequately in most cases. This article describes the expert respondent debriefing methodology used in a face-to-face interview setting and discusses how it could be adapted to telephone interviewing.
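As a rough illustration of the comparison the abstract describes, here is a minimal Python sketch (not the authors' code; the case records and field names are hypothetical) that tallies agreement between the residence status assigned by the questionnaire and the "true" status determined by expert debriefing:

# Illustrative sketch: compare questionnaire-assigned residence status
# against the debriefing-determined "true" status. Field names are hypothetical.
from collections import Counter

def evaluate_residence_coding(cases):
    """Each case is a dict with 'questionnaire_status' and 'debriefing_status'."""
    tally = Counter()
    for case in cases:
        key = "agree" if case["questionnaire_status"] == case["debriefing_status"] else "disagree"
        tally[key] += 1
    total = sum(tally.values())
    return tally, (tally["agree"] / total if total else float("nan"))

cases = [
    {"questionnaire_status": "resident", "debriefing_status": "resident"},
    {"questionnaire_status": "resident", "debriefing_status": "nonresident"},
]
tally, accuracy = evaluate_residence_coding(cases)
# In the study's terms, only 5 of 473 cases failed in a coverage-affecting way.
print(tally, f"accuracy={accuracy:.2%}")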
Human Factors in Computing Systems | 2017
Lin Wang; Christopher Antoun; Russell Sanders; Elizabeth Nichols; Erica L. Olmsted-Hawala; Brian Falcone; Ivonne J. Figueroa; Jonathan Katz
With the growing use of smartphones, many surveys can now be administered on those phones; such questionnaires are called mobile survey questionnaires. The designer of a mobile survey questionnaire is challenged to present text and controls on a small display while still allowing respondents to understand and answer questions correctly and with ease. To address this challenge, we are developing an evidence-based framework of user-interface design for mobile survey questionnaires. The framework includes two parts: standards for the basic elements of survey-relevant mobile device operation, and guidelines for the building blocks of mobile survey questionnaires. In this presentation, we describe five behavioral experiments designed to collect evidence for developing the standards. The experiments cover visual perception and motor actions relevant to survey completion, and some preliminary results from ongoing data collection are presented.
International Conference on Universal Access in Human-Computer Interaction | 2014
Erica L. Olmsted-Hawala; Temika Holland; Elizabeth Nichols
In a study of the American Community Survey online instrument, we assessed how people answered questions about themselves and about other individuals living in their household, using eye-tracking data and other qualitative measures. This paper focuses on the number of fixations (whether participants looked at specific areas of the screen), fixation duration (how long participants looked at the questions and answers), and the number of unique visits (whether participants rechecked the question and answer options). Results showed that for the age, date-of-birth, and employment-duties questions, participants had more fixations and more unique visits, and spent more time on the screen, when answering about unrelated members of their household than when answering about themselves. The differing eye movements for proxy reporting suggest that answering some survey questions for unrelated people places more burden on respondents than answering about oneself. However, not all questions showed this tendency, so eye tracking alone is not enough to detect burden.
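To make the three measures concrete, here is a minimal Python sketch (hypothetical fixation records and area-of-interest labels; not the study's analysis code) computing fixation count, total fixation duration, and unique visits for one area of interest:

# Illustrative sketch: compute fixation count, total fixation duration, and
# unique visits (maximal runs of consecutive fixations on the same area of
# interest) from a time-ordered fixation trace. Data are hypothetical.
from itertools import groupby

def aoi_metrics(fixations, aoi):
    """fixations: time-ordered list of (aoi_label, duration_ms) tuples."""
    in_aoi = [(label, dur) for label, dur in fixations if label == aoi]
    count = len(in_aoi)
    total_ms = sum(dur for _, dur in in_aoi)
    # A "visit" is a maximal run of consecutive fixations on the same AOI.
    visits = sum(1 for label, _ in groupby(fixations, key=lambda f: f[0])
                 if label == aoi)
    return {"fixations": count, "duration_ms": total_ms, "visits": visits}

trace = [("question", 220), ("question", 180), ("answers", 240),
         ("question", 200), ("answers", 310)]
print(aoi_metrics(trace, "question"))  # {'fixations': 3, 'duration_ms': 600, 'visits': 2}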
International Conference on Human Aspects of IT for the Aged Population | 2018
Erica L. Olmsted-Hawala; Elizabeth Nichols; Brian Falcone; Ivonne J. Figueroa; Christopher Antoun; Lin Wang
Growing numbers of people are using their mobile phones to respond to online surveys. As a result, survey designers face the challenge of displaying questions, response options, and navigation elements on small smartphone screens in a way that encourages survey completion. The purpose of the present study was to conduct a series of systematic assessments of how older adults using smartphones interact with different user-interface features in online surveys. This paper shares the results of three experiments. Experiment 1 compares different ways of displaying choose-one response options. Experiment 2 compares different ways of displaying numeric entry boxes, specifically ones used to collect currency information (e.g., prices, costs, salaries). Experiment 3 tests whether the forward and backward navigation buttons in a smartphone survey should be labeled with words (Previous, Next) or simply indicated with arrow icons (← and →). Results indicate that certain features were problematic: picker boxes that appear at the bottom of the screen (iOS devices), fixed formatting of numeric-entry boxes, and icon-only navigation buttons. These features either hurt performance (response times and/or accuracy) or were preferred by only a small percentage of participants when compared with the alternative designs.
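A minimal Python sketch of the kind of condition comparison these experiments imply (hypothetical timing and accuracy data; not the study's analysis code), comparing two navigation-button designs with a Welch t-test on response times:

# Illustrative sketch: compare completion times and accuracy between two
# hypothetical UI conditions (word-labeled vs. icon-only navigation buttons).
from statistics import mean
from scipy.stats import ttest_ind  # assumes SciPy is installed

word_times = [8.1, 7.4, 9.0, 8.6, 7.9]    # seconds per screen, hypothetical
icon_times = [9.8, 10.2, 9.1, 11.0, 9.5]
word_correct = [1, 1, 1, 0, 1]             # 1 = answered accurately
icon_correct = [1, 0, 1, 0, 1]

t, p = ttest_ind(word_times, icon_times, equal_var=False)  # Welch t-test
print(f"mean time words={mean(word_times):.1f}s icons={mean(icon_times):.1f}s "
      f"(t={t:.2f}, p={p:.3f})")
print(f"accuracy words={mean(word_correct):.0%} icons={mean(icon_correct):.0%}")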
Social Science Computer Review | 2018
Erica L. Olmsted-Hawala; Elizabeth Nichols
In 2016, the U.S. Census Bureau conducted a split-panel experiment to explore the public’s willingness to share geolocation information within a survey. A sample of participants from a nonprobability panel was invited to take an online survey on their mobile devices. Within the survey, one question asked for the respondent’s address, and the survey then requested permission to access geolocation information. Depending on the study condition, the survey varied how the geolocation request was made and where in the survey the address and geolocation requests appeared. Results showed that the treatment that explicitly asked for permission, in addition to the device’s default permission request, increased sharing among female respondents but not among male respondents. Placing the address and geolocation requests toward the end of the survey significantly increased the willingness of all respondents to share their location information. Respondents with more education and nonminority respondents were more willing to share their location data, but willingness did not depend on the respondent’s age. Assuming that respondents truthfully reported being at home while taking the survey and entered their home address, we found the geolocation data to be accurate to the correct census block a little more than 50% of the time.
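A minimal Python sketch of the kind of tabulation described above (hypothetical records, condition names, and census block IDs; not the experiment's code): sharing rates by condition, plus the share of geolocation fixes that fall in the same census block as the typed address:

# Illustrative sketch: tabulate willingness to share geolocation by treatment,
# and check whether the device-reported location geocodes to the same census
# block as the typed home address. All values are hypothetical.
from collections import defaultdict

records = [
    {"condition": "explicit_ask_end", "shared": True,  "addr_block": "110010001001000", "geo_block": "110010001001000"},
    {"condition": "explicit_ask_end", "shared": True,  "addr_block": "110010001001000", "geo_block": "110010001001003"},
    {"condition": "default_prompt",   "shared": False, "addr_block": "240317035012004", "geo_block": None},
]

share = defaultdict(lambda: [0, 0])          # condition -> [shared, total]
for r in records:
    share[r["condition"]][1] += 1
    share[r["condition"]][0] += r["shared"]  # True counts as 1

for cond, (s, n) in share.items():
    print(f"{cond}: shared {s}/{n}")

shared = [r for r in records if r["shared"]]
block_match = sum(r["addr_block"] == r["geo_block"] for r in shared) / len(shared)
print(f"geolocation matched the address's block {block_match:.0%} of the time")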
Social Change | 2016
Mary H. Mulry; Elizabeth Nichols; Jennifer Hunter Childs
Correctly recalling where someone lived as of a particular date is critical to the accuracy of the once-a-decade U.S. decennial census. The data collection period for the 2010 Census ran from February to August, with some evaluation operations occurring up to seven months after that. The assumption was that respondents could accurately remember moves and move dates on and around April 1 up to 11 months afterward. We show how statistical analyses can be used to investigate the validity of this assumption by comparing self-reports and proxy reports of the month of a move in a U.S. Census Bureau survey with an administrative-records database from the U.S. Postal Service containing requests to forward mail filed in March and April of 2010. In our dataset, the length of time since the move affected memory error in reports of a move and of the month of the move. Memory error also depended on whether the respondent was reporting for themselves or for another person in the household. This case study is relevant to surveys as well as censuses because move dates and places of residence often serve as anchors to aid memory of other events in questionnaires.
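A minimal Python sketch of the comparison described above (hypothetical records; not the authors' analysis): the move month a respondent reported versus the month of the matched USPS change-of-address filing, summarized separately for self- and proxy reporters:

# Illustrative sketch: compare reported move month with the matched USPS
# change-of-address month and summarize recall error by reporter type.
from collections import defaultdict

reports = [
    # (reporter, reported_month, usps_month) -- months as 1-12 in 2010
    ("self",  4, 4), ("self",  3, 4), ("proxy", 6, 4),
    ("proxy", 4, 4), ("self",  4, 3),
]

errors = defaultdict(list)
for reporter, reported, usps in reports:
    errors[reporter].append(abs(reported - usps))  # months of recall error

for reporter, errs in errors.items():
    exact = sum(e == 0 for e in errs) / len(errs)
    print(f"{reporter}: exact-month agreement {exact:.0%}, "
          f"mean absolute error {sum(errs)/len(errs):.1f} months")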
Archive | 2008
Elizabeth Nichols; Jennifer Hunter Childs; Rolando Rodríguez; Aref Dajani; Jennifer Rothgeb
Archive | 2017
Brady T. West; Antje Kirchner; Daniela Hochfellner; Stefan Bender; Elizabeth Nichols; Mary H. Mulry; Jennifer Hunter Childs; Anders Holmberg; Christine Bycroft; Grant Benson; Frost Hubbard
Archive | 2011
Elizabeth Nichols; Jennifer Hunter Childs; Rolando Rodríguez
Archive | 2008
Elizabeth Nichols; Jennifer Hunter Childs; Kyra Linse