
Publication


Featured research published by Lars Kaczmirek.


Social Science Computer Review | 2008

Prenotification in Web-Based Access Panel Surveys

Michael Bosnjak; Wolfgang Neubarth; Mick P. Couper; Wolfgang Bandilla; Lars Kaczmirek

To compare the effectiveness of different prenotification and invitation procedures in a web-based, three-wave access panel survey over three consecutive months, we experimentally varied the contact mode in a fully crossed two-factorial design with (a) three prenotification conditions (mobile short messaging service [SMS], e-mail, no prenotice) and (b) two invitation-and-reminder conditions (SMS, e-mail). A group with nearly complete mobile phone coverage was randomly assigned to one of these six experimental conditions. As expected, SMS prenotifications outperformed e-mail prenotifications in terms of response rates across all three waves. Furthermore, e-mail invitations outperformed SMS invitations in response rates. The combination of SMS prenotification and e-mail invitation performed best. The experimental treatments did not affect the sample composition of respondents across groups.
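
For illustration only (a minimal sketch with hypothetical identifiers and group sizes, not the study's materials), the fully crossed design amounts to assigning each panel member at random to one of the 3 × 2 = 6 cells formed by crossing the prenotification factor with the invitation-and-reminder factor:

```python
import random

# Illustrative sketch of the fully crossed two-factorial design:
# 3 prenotification conditions x 2 invitation/reminder conditions = 6 cells.
PRENOTIFICATION = ["sms", "email", "none"]
INVITATION = ["sms", "email"]
CELLS = [(p, i) for p in PRENOTIFICATION for i in INVITATION]

def assign_conditions(member_ids, seed=42):
    """Randomly assign panel members to the six cells in roughly equal numbers."""
    rng = random.Random(seed)
    ids = list(member_ids)
    rng.shuffle(ids)
    return {member: CELLS[idx % len(CELLS)] for idx, member in enumerate(ids)}

# Hypothetical usage: 600 panel members, about 100 per experimental cell.
assignment = assign_conditions(range(600))
```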


Review of Finance | 2010

Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies

Marcel Das; Peter Ester; Lars Kaczmirek

Highlighting the progress made by researchers in using Web-based surveys for data collection, this timely volume summarizes the experiences of leading behavioral and social scientists from Europe and the US who collected data using the Internet. Some chapters present theory, methodology, design, and implementation, while others focus on best-practice examples and issues such as data quality and understanding paradata. A number of contributors applied innovative Web-based research methods to the LISS panel of CentERdata, which collects data from over 5,000 Dutch households; their findings are presented in the book, and some of the data are available on the book website. The book addresses practical issues such as data quality, how to reach difficult target groups, how to design a survey to maximize response, and ethical issues that need to be considered. Innovative applications such as the use of biomarkers and eye-tracking techniques are also explored.

Part 1 provides an overview of Internet survey research, including its methodologies, strengths, challenges, and best practices. Innovative ways to minimize sources of error are presented, along with a review of mixed-mode designs, guidance on how to design a scientifically sound longitudinal panel and avoid sampling problems, and how to address ethical requirements in Web surveys. Part 2 focuses on advanced applications, including the impact of visual design on the interpretability of survey questions, the impact survey usability has on respondents' answers, design features that increase interaction, and how Internet surveys can be effectively used to study sensitive issues. Part 3 addresses data quality, sample selection, measurement and nonresponse error, and new applications for collecting online data. The underrepresentation of certain groups in Internet research and the measures most effective at reducing it are also addressed. The book concludes with a discussion of the importance of paradata and the Web data collection process in general, followed by chapters with innovative experiments using eye-tracking techniques and biomarker data.

This practical book appeals to practitioners from market survey research institutes and to researchers in disciplines such as psychology, education, sociology, political science, health studies, marketing, economics, and business who use the Internet for data collection, but it is also an ideal supplement for graduate and upper-level undergraduate courses on (Internet) research methods and data collection in these fields.


Field Methods | 2013

Sample composition discrepancies in different stages of a probability-based online panel

Michael Bosnjak; Iris Haas; Mirta Galesic; Lars Kaczmirek; Wolfgang Bandilla; Mick P. Couper

We report sample composition discrepancies related to demographic and personality variables occurring in different stages of development of a probability-based online panel. The first stage—selecting eligible participants—produces differences between Internet users and nonusers in age, education, and gender distribution as well as in the personality traits of openness to experience, conscientiousness, and extraversion. The second and third stages of panel development—asking about willingness to participate in the panel and actual participation in online surveys—result in fewer and smaller discrepancies. The results suggest that among the three potential sources of sample composition bias considered, the largest impact comes from coverage differences with regard to Internet access.


Field Methods | 2014

How Do Respondents Attend to Verbal Labels in Rating Scales

Natalja Menold; Lars Kaczmirek; Timo Lenzner; Aleš Neusar

Two formats of labeling rating scales are commonly used in questionnaires: verbal labels for the end categories only (END form) and verbal labels for every category (ALL form). We examine attention processes and respondents' burden when using verbal labels in rating scales. Attention was tracked in a laboratory setting using eye-tracking technology. Results from two experiments are presented: one used seven-category and the other five-category rating scales, each comparing the END and ALL forms (n = 47 in each experiment). The results show that the ALL form provides higher reliability, although the probability that respondents attend to a verbal label seems to decrease as the number of verbally labeled categories increases.


Field Methods | 2013

Testing the Validity of Gender Ideology Items by Implementing Probing Questions in Web Surveys

Dorothée Behr; Michael Braun; Lars Kaczmirek; Wolfgang Bandilla

This article examines the use of probing techniques in web surveys to identify validity problems of items. Conventional cognitive interviewing is usually based on small sample sizes and thus precludes quantifying the findings in a meaningful way or testing small or special subpopulations characterized by their response behavior. This article investigates probing in web surveys as a supplementary way to look at item validity. Data come from a web survey in which respondents were asked to give reasons for selecting a response category for a closed question. The web study was conducted in Germany, with respondents drawn from online panels (n = 1,023). The usefulness of the proposed approach is shown by revealing validity problems with a gender ideology item.


Sozialforschung im Internet: Methodologie und Praxis der Online-Befragung | 2009

Coverage- und Nonresponse-Effekte bei Online-Bevölkerungsumfragen

Wolfgang Bandilla; Lars Kaczmirek; Michael Blohm; Wolfgang Neubarth

Hardly any other technology has spread as quickly, and changed the media and communication behavior of broad sections of the population as much, as the Internet. It is therefore not surprising that the possibilities this technology offers are becoming increasingly important in market and social research. The figures for 2006 published by the Arbeitskreis Deutscher Markt- und Sozialforschungsinstitute e.V. (ADM) show, for example, that among the common survey modes the share of online interviews has risen to 21 percent, far above that of traditional mail surveys (eight percent). The share of face-to-face interviews, at 25 percent, is only slightly higher, while telephone surveys clearly dominate with a share of 46 percent. Even though surveys are still predominantly interviewer-administered (i.e., conducted by telephone or face-to-face), the short period in which online data collection reached a notable scale is remarkable: its share was a mere five percent in 2002, yet within only four years it rose to over 20 percent.


international conference on universal access in human computer interaction | 2007

Survey design for visually impaired and blind people

Lars Kaczmirek; Klaus G. Wolff

This paper presents guidelines for the design of self-administered surveys for visually impaired and blind people within a mixed-mode approach. The different needs of the target group are met by offering several modes of participation (paper-based, braille-based, Web-based). Reading aids have in common that they bring a specific piece of text or a single word into sharp focus. This advantage turns into a disadvantage when a clear overview and arrangement of the text elements on a page is required. Text therefore needs to be designed with cognitive processes and accessibility standards in mind. This is especially true for a survey questionnaire, where each question and answer item has to convey its own meaning independently of context. Design problems and their solutions are described and illustrated with experiences from pretesting and a case study.


Social Science Computer Review | 2014

Cognitive Probes in Web Surveys: On the Effect of Different Text Box Size and Probing Exposure on Response Quality

Dorothée Behr; Wolfgang Bandilla; Lars Kaczmirek; Michael Braun

In this study, we explore to what extent the visual presentation of open-ended probes, in connection with different prior probing exposure, affects response quality. We experiment with two text box sizes for a specific immigrant probe ("Which type of immigrants were you thinking of when you answered the question?"). On the one hand, we use a standard size equal to the other text boxes in the survey but oversized for this specific response task; on the other hand, we use a smaller text box that fits the response task. The other probes in the survey that use the standard text box are mainly category-selection probes, which ask for the reasoning behind the chosen answer value. Because of the randomized rotation of questions, respondents receive different numbers of category-selection probes before the immigrant probe, resulting in different degrees of prior exposure to category-selection probing. For the immigrant probe, we find that respondents who receive the standard text box and who have had high exposure to category-selection probing are more likely to provide mismatching answers: the mismatch consists of not answering the specific immigrant probe but instead providing a reasoning answer, as typically expected for a category-selection probe. Thus, previous experience with probing in the questionnaire can override the actual probe wording. This problem can be minimized by considering possible carryover effects of prior probes and using an appropriate survey design strategy.


Social Science Computer Review | 2014

Left Feels Right: A Usability Study on the Position of Answer Boxes in Web Surveys

Timo Lenzner; Lars Kaczmirek; Mirta Galesic

The literature on human-computer interaction consistently stresses the importance of reducing the cognitive effort required by users who interact with a computer in order to improve the experience and enhance usability and comprehension. Applying this perspective to web surveys, questionnaire designers are advised to strive for layouts that facilitate the response process and reduce the effort required to select an answer. In this article, we examine whether placing the answer boxes (i.e., radio buttons or check boxes) to the left or to the right of the answer options in closed questions with vertically arranged response categories enhances usability and facilitates responding. First, we discuss a set of opposing principles of how respondents may process these types of questions in web surveys, some suggesting placing the answer boxes to the left and others suggesting placing them to the right side of the answer options. Second, we report an eye-tracking experiment that examined whether web survey responding is best described by one or another of these principles, and consequently whether one of the three layouts is preferable in terms of usability: (1) answer boxes to the left of left-aligned answer options, (2) answer boxes to the right of left-aligned answer options, and (3) answer boxes to the right of right-aligned answer options. Our results indicate that the majority of respondents conform to a principle suggesting placing the answer boxes to the left of left-aligned answer options. Moreover, respondents require less cognitive effort (operationalized by response latencies, fixation times, fixation counts, and number of gaze switches between answer options and answer boxes) to select an answer in this layout.
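
As an aside on how one of these effort measures can be operationalized (a hypothetical sketch, not the study's analysis code), the number of gaze switches can be counted as the number of transitions between the two areas of interest in a respondent's fixation sequence:

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    aoi: str          # area of interest: "answer_options" or "answer_boxes" (assumed labels)
    duration_ms: int  # fixation duration in milliseconds

def gaze_switches(fixations):
    """Count transitions between the two areas of interest."""
    return sum(1 for prev, curr in zip(fixations, fixations[1:]) if prev.aoi != curr.aoi)

# Hypothetical fixation sequence for one question.
seq = [Fixation("answer_options", 210), Fixation("answer_boxes", 180),
       Fixation("answer_options", 250), Fixation("answer_boxes", 160)]
print(gaze_switches(seq))              # 3 switches
print(sum(f.duration_ms for f in seq)) # total fixation time: 800 ms
```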


Archive | 2016

Improving the Accuracy of Automated Occupation Coding at Any Production Rate

Hyukjun Gweon; Matthias Schonlau; Lars Kaczmirek; Michael Blohm; Stefan H. Steiner

Occupation coding, an important task in official statistics, refers to coding a respondent's text answer into one of many hundreds of occupation codes. To date, occupation coding is still at least partially conducted manually, at great expense. We propose two new methods for automatic coding: a hybrid method that combines a rule-based approach based on duplicates with a statistical learning algorithm, and a modified nearest-neighbor approach. Using data from the German General Social Survey (ALLBUS), we show that both methods improve on the coding accuracy of the underlying statistical learning algorithm as well as on the coding accuracy of duplicates where duplicates exist. We also find that statistical learning is improved by combining separate models for the detailed occupation codes and for aggregate occupation codes. Further, we find that defining duplicates based on n-gram variables (a concept from text mining) is preferable to a definition based on exact string matches.
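
As a rough illustration of the duplicate-based component of such a hybrid approach (a minimal sketch under assumed thresholds, codes, and answer texts, not the authors' implementation), a new text answer can be reduced to character n-grams and, if it is sufficiently similar to an already coded answer, inherit that answer's code; otherwise it would be passed to the statistical learning algorithm.

```python
# Hypothetical sketch of n-gram-based near-duplicate matching for occupation coding.
def char_ngrams(text, n=3):
    text = text.lower().strip()
    return {text[i:i + n] for i in range(max(len(text) - n + 1, 1))}

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def code_by_duplicate(answer, coded_answers, threshold=0.6):
    """Return the code of the closest previously coded answer, if similar enough."""
    grams = char_ngrams(answer)
    best_code, best_sim = None, 0.0
    for known_text, known_code in coded_answers.items():
        sim = jaccard(grams, char_ngrams(known_text))
        if sim > best_sim:
            best_code, best_sim = known_code, sim
    return best_code if best_sim >= threshold else None  # None -> fall back to the learner

# Illustrative coded answers (codes are placeholders, not actual classifications).
coded = {"software developer": "2512", "primary school teacher": "2341"}
print(code_by_duplicate("software develoepr", coded))  # "2512" despite the typo
print(code_by_duplicate("nurse", coded))               # None: no near-duplicate found
```

The similarity measure, threshold, and codes here are placeholders; the paper itself evaluates its coding variants against the ALLBUS data.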

Collaboration


Dive into Lars Kaczmirek's collaboration.

Top Co-Authors


Michael Bosnjak

Free University of Bozen-Bolzano
