
Publication


Featured research published by Christopher Antoun.


PLOS ONE | 2015

Precision and Disclosure in Text and Voice Interviews on Smartphones

Michael F. Schober; Frederick G. Conrad; Christopher Antoun; Patrick Ehlen; Stefanie Fail; Andrew L. Hupp; Michael V. Johnston; Lucas Vickers; H. Yanna Yan; Chan Zhang

As people increasingly communicate via asynchronous non-spoken modes on mobile devices, particularly text messaging (e.g., SMS), longstanding assumptions and practices of social measurement via telephone survey interviewing are being challenged. In the study reported here, 634 people who had agreed to participate in an interview on their iPhone were randomly assigned to answer 32 questions from US social surveys via text messaging or speech, administered either by a human interviewer or by an automated interviewing system. 10 interviewers from the University of Michigan Survey Research Center administered voice and text interviews; automated systems launched parallel text and voice interviews at the same time as the human interviews were launched. The key question was how the interview mode affected the quality of the response data, in particular the precision of numerical answers (how many were not rounded), variation in answers to multiple questions with the same response scale (differentiation), and disclosure of socially undesirable information. Texting led to higher quality data—fewer rounded numerical answers, more differentiated answers to a battery of questions, and more disclosure of sensitive information—than voice interviews, both with human and automated interviewers. Text respondents also reported a strong preference for future interviews by text. The findings suggest that people interviewed on mobile devices at a time and place that is convenient for them, even when they are multitasking, can give more trustworthy and accurate answers than those in more traditional spoken interviews. The findings also suggest that answers from text interviews, when aggregated across a sample, can tell a different story about a population than answers from voice interviews, potentially altering the policy implications from a survey.
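
The precision and differentiation measures described above can be made concrete with a small computation. The sketch below is illustrative only and is not code from the study; it assumes rounding is operationalized as numerical answers that are multiples of 5, and differentiation as the share of distinct scale points used across a battery of same-scale items (common operationalizations, but the paper's exact definitions may differ).

```python
# Illustrative response-quality metrics for survey answers.
# Assumptions (not from the paper): "rounded" = multiple of 5;
# "differentiation" = distinct scale points used / number of items.

def rounding_rate(numeric_answers):
    """Share of numerical answers that look rounded (multiples of 5)."""
    answers = [a for a in numeric_answers if a is not None]
    if not answers:
        return 0.0
    return sum(1 for a in answers if a % 5 == 0) / len(answers)

def differentiation(scale_answers):
    """Crude differentiation index for a battery of same-scale items:
    1.0 means every answer given is distinct; lower values indicate
    straight-lining (repeating the same category)."""
    answers = [a for a in scale_answers if a is not None]
    if not answers:
        return 0.0
    return len(set(answers)) / len(answers)

# Example respondent data (hypothetical)
hours_worked = [40, 37, 45, 50]            # numerical questions
satisfaction_battery = [4, 4, 4, 5, 3]     # 1-5 scale battery

print(rounding_rate(hours_worked))             # 0.75 -> mostly rounded answers
print(differentiation(satisfaction_battery))   # 0.6  -> some variation
```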


Field Methods | 2016

Comparisons of Online Recruitment Strategies for Convenience Samples: Craigslist, Google AdWords, Facebook, and Amazon Mechanical Turk

Christopher Antoun; Chan Zhang; Frederick G. Conrad; Michael F. Schober

The rise of social media websites (e.g., Facebook) and online services such as Google AdWords and Amazon Mechanical Turk (MTurk) offers new opportunities for researchers to recruit study participants. Although researchers have started to use these emerging methods, little is known about how they perform in terms of cost efficiency and, more importantly, the types of people that they ultimately recruit. Here, we report findings about the performance of four online sources for recruiting iPhone users to participate in a web survey. The findings reveal very different performances between two types of strategies: those that “pull in” online users actively looking for paid work (MTurk workers and Craigslist users) and those that “push out” a recruiting ad to online users engaged in other, unrelated online activities (Google AdWords and Facebook). The pull-method recruits were more cost efficient and committed to the survey task, while the push-method recruits were more demographically diverse.
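
Cost efficiency in this kind of recruitment comparison typically comes down to cost per completed interview. Below is a minimal sketch of how the four sources might be compared; the spend and completion figures are entirely hypothetical and are not the study's results.

```python
# Hypothetical cost-per-complete comparison of recruitment sources.
# All figures are invented for illustration, NOT the study's findings.

sources = {
    # source: (total spend in USD, completed surveys)
    "Amazon Mechanical Turk": (300.0, 150),
    "Craigslist":             (250.0, 100),
    "Google AdWords":         (800.0, 80),
    "Facebook":               (600.0, 60),
}

for name, (spend, completes) in sources.items():
    cost_per_complete = spend / completes if completes else float("inf")
    print(f"{name}: ${cost_per_complete:.2f} per completed survey")
```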


Human Factors in Computing Systems | 2017

Experimentation for Developing Evidence-Based UI Standards of Mobile Survey Questionnaires

Lin Wang; Christopher Antoun; Russell Sanders; Elizabeth Nichols; Erica L. Olmsted-Hawala; Brian Falcone; Ivonne J. Figueroa; Jonathan Katz

With the growing use of smartphones, many surveys can now be administered using those phones. Such questionnaires are called mobile survey questionnaires. The designer of a mobile survey questionnaire is challenged with presenting text and controls on a small display, while allowing respondents to correctly understand and answer questions with ease. To address this challenge, we are developing an evidence-based framework of user interface design for mobile survey questionnaires. The framework includes two parts: standards for the basic elements of survey-relevant mobile device operation and guidelines for the building blocks of mobile survey questionnaires. In this presentation, we will describe five behavioral experiments designed to collect evidence for developing the standards. These experiments cover visual perception and motor actions relevant to survey completion. Some preliminary results from ongoing data collection are presented.


Social Science Computer Review | 2018

Design Heuristics for Effective Smartphone Questionnaires

Christopher Antoun; Jonathan Katz; Josef Argueta; Lin Wang

Design principles for survey questionnaires viewed on desktop and laptop computers are increasingly being seen as inadequate for the design of questionnaires viewed on smartphones. Insights gained from empirical research can help those conducting mobile surveys to improve their questionnaires. This article reports on a systematic literature review of research presented or published between 2007 and 2016 that evaluated the effect of smartphone questionnaire design features on indicators of response quality. The evidence suggests that survey designers should make efforts to “optimize” their questionnaires to make them easier to complete on smartphones, fit question content to the width of smartphone screens to prevent horizontal scrolling, and choose simpler types of questions (single-choice questions, multiple-choice questions, text-entry boxes) over more complicated types of questions (large grids, drop boxes, slider questions). Based on these results, we identify design heuristics, or general principles, for creating effective smartphone questionnaires. We distinguish between five of them: readability, ease of selection, visibility across the page, simplicity of design elements, and predictability across devices. They provide an initial framework by which to evaluate smartphone questionnaires, though empirical testing and further refinement of the heuristics is necessary.
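
Heuristics like these lend themselves to simple automated checks during questionnaire development. The sketch below is a hypothetical illustration; the question-type lists and the screen-width threshold are assumptions of this example, not specifications from the article.

```python
# Hypothetical pre-test check of a questionnaire spec against two of the
# review's suggestions: avoid complicated question types (grids, drop
# boxes, sliders) and avoid content wider than a smartphone screen.
# The type lists and the 360-px threshold are illustrative assumptions.

COMPLEX_TYPES = {"grid", "drop_box", "slider"}
MAX_CONTENT_WIDTH_PX = 360  # assumed typical smartphone viewport width

def flag_questions(questions):
    """Return warnings for questions likely to be hard to complete on
    smartphones, based on question type and rendered content width."""
    warnings = []
    for q in questions:
        if q["type"] in COMPLEX_TYPES:
            warnings.append(f"{q['id']}: complex type '{q['type']}', "
                            "consider a simpler question format")
        if q.get("content_width_px", 0) > MAX_CONTENT_WIDTH_PX:
            warnings.append(f"{q['id']}: content wider than "
                            f"{MAX_CONTENT_WIDTH_PX}px, may force "
                            "horizontal scrolling")
    return warnings

# Example questionnaire spec (hypothetical)
questionnaire = [
    {"id": "Q1", "type": "single_choice", "content_width_px": 320},
    {"id": "Q2", "type": "grid", "content_width_px": 540},
]
for warning in flag_questions(questionnaire):
    print(warning)
```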


International Conference on Human Aspects of IT for the Aged Population | 2018

Optimal Data Entry Designs in Mobile Web Surveys for Older Adults

Erica L. Olmsted-Hawala; Elizabeth Nichols; Brian Falcone; Ivonne J. Figueroa; Christopher Antoun; Lin Wang

Growing numbers of people are using their mobile phones to respond to online surveys. As a result, survey designers face the challenge of displaying questions, response options, and navigation elements on small smartphone screens in a way that encourages survey completion. The purpose of the present study was to conduct a series of systematic assessments of how older adults using smartphones interact with different user-interface features in online surveys. This paper shares results of three different experiments. Experiment 1 compares different ways of displaying choose-one response options. Experiment 2 compares different ways of displaying numeric entry boxes, specifically ones used to collect currency information (e.g., prices, costs, salaries). Experiment 3 tests whether forward and backward navigational buttons on a smartphone survey should be labeled with words (previous, next) or simply indicated with arrow icons. Results indicate that certain features such as picker-boxes that appear at the bottom of the screen (iOS devices), fixed formatting of numeric-entry boxes, and icon navigation buttons were problematic. They either had negative impacts on performance (response times and/or accuracy) or only a small percentage of participants preferred these design features when asked to compare them to the other features.
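
The performance comparisons in these experiments come down to per-variant response times and accuracy. A minimal sketch of that kind of tabulation, with invented trial data and variant names (none of it from the study), follows.

```python
# Hypothetical per-variant summary of response time and accuracy,
# the two performance measures mentioned above. All data are invented.
from statistics import mean

trials = [
    # (design variant, response time in seconds, answered correctly?)
    ("open_numeric_box", 6.2, True),
    ("open_numeric_box", 7.8, True),
    ("fixed_format_box", 11.4, False),
    ("fixed_format_box", 9.9, True),
]

by_variant = {}
for variant, seconds, correct in trials:
    by_variant.setdefault(variant, []).append((seconds, correct))

for variant, rows in by_variant.items():
    times = [seconds for seconds, _ in rows]
    accuracy = sum(1 for _, correct in rows if correct) / len(rows)
    print(f"{variant}: mean time {mean(times):.1f}s, accuracy {accuracy:.0%}")
```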


Psychiatric Services | 2018

The Michigan Peer-to-Peer Depression Awareness Program: School-Based Prevention to Address Depression Among Teens

Sagar V. Parikh; Danielle S. Taubman; Christopher Antoun; James Cranford; Cynthia Ewell Foster; Mary Grambeau; Joyce Hunter; Jennifer Jester; Kristine Konz; Trish Meyer; Stephanie Salazar; John F. Greden

OBJECTIVE The Peer-to-Peer Depression Awareness Program (P2P) is a school-based program that aims to decrease mental illness and promote well-being among students by empowering high school students as both learners and educators. Specific goals include improving the school climate around mental health, directing students to resources, and encouraging help-seeking behavior. METHODS In the 2015-2016 academic year, 121 students across 10 high schools organized into teams and were trained to develop and implement peer-to-peer depression awareness campaigns. Outcomes were assessed via pre- and posttest questionnaires. RESULTS A total of 878 students completed questionnaires. Outcomes demonstrated improved knowledge and attitudes toward depression, increased confidence in identifying and referring peers with depression, improved help-seeking intentions, and reduced stigma. CONCLUSIONS The P2P program increased depression literacy through the use of youth-designed and youth-implemented depression awareness and outreach activities, which may ultimately result in earlier detection of depression and in fewer depression sequelae.


Annual Meeting of the Special Interest Group on Discourse and Dialogue | 2013

Spoken Dialog Systems for Automated Survey Interviewing

Michael Johnston; Patrick Ehlen; Frederick G. Conrad; Michael F. Schober; Christopher Antoun; Stefanie Fail; Andrew L. Hupp; Lucas Vickers; Huiying Yan; Chan Zhang


Public Opinion Quarterly | 2017

Effects of Mobile versus PC Web on Survey Response Quality

Christopher Antoun; Mick P. Couper; Frederick G. Conrad


Archive | 2017

Text Interviews on Mobile Devices

Frederick G. Conrad; Michael F. Schober; Christopher Antoun; Andrew L. Hupp; H. Yanna Yan


Public Opinion Quarterly | 2017

Respondent Mode Choice in a Smartphone Survey

Frederick G. Conrad; Michael F. Schober; Christopher Antoun; H. Yanna Yan; Andrew L. Hupp; Michael V. Johnston; Patrick Ehlen; Lucas Vickers; Chan Zhang

Collaboration


Top co-authors of Christopher Antoun.

Lin Wang, United States Census Bureau
Brian Falcone, United States Census Bureau
Elizabeth Nichols, United States Census Bureau