
Publication


Featured research published by Aaron Halfaker.


American Behavioral Scientist | 2013

The Rise and Decline of an Open Collaboration System: How Wikipedia's Reaction to Popularity Is Causing Its Decline

Aaron Halfaker; R. Stuart Geiger; Jonathan T. Morgan; John Riedl

Open collaboration systems, such as Wikipedia, need to maintain a pool of volunteer contributors to remain relevant. Wikipedia was created through a tremendous number of contributions by millions of contributors. However, recent research has shown that the number of active contributors in Wikipedia has been declining steadily for years and suggests that a sharp decline in the retention of newcomers is the cause. This article presents data that show how several changes the Wikipedia community made to manage quality and consistency in the face of a massive growth in participation have ironically crippled the very growth they were designed to manage. Specifically, the restrictiveness of the encyclopedia’s primary quality control mechanism and the algorithmic tools used to reject contributions are implicated as key causes of decreased newcomer retention. Furthermore, the community’s formal mechanisms for norm articulation are shown to have calcified against changes—especially changes proposed by newer editors.


International Symposium on Wikis and Open Collaboration | 2011

Don't bite the newbies: how reverts affect the quantity and quality of Wikipedia work

Aaron Halfaker; Aniket Kittur; John Riedl

Reverts are important to maintaining the quality of Wikipedia. They fix mistakes, repair vandalism, and help enforce policy. However, reverts can also be damaging, especially to the aspiring editor whose work they destroy. In this research we analyze 400,000 Wikipedia revisions to understand the effect that reverts had on editors. We seek to understand the extent to which they demotivate users, reducing the workforce of contributors, versus the extent to which they help users improve as encyclopedia editors. Overall we find that reverts are powerfully demotivating, but that their net influence is that more quality work is done in Wikipedia as a result of reverts than is lost by chasing editors away. However, we identify key conditions -- most specifically new editors being reverted by much more experienced editors -- under which reverts are particularly damaging. We propose that reducing the damage from reverts might be one effective path for Wikipedia to solve the newcomer retention problem.


International Symposium on Wikis and Open Collaboration | 2009

A jury of your peers: quality, experience and ownership in Wikipedia

Aaron Halfaker; Aniket Kittur; Robert E. Kraut; John Riedl

Wikipedia is a highly successful example of what mass collaboration in an informal peer review system can accomplish. In this paper, we examine the role that the quality of the contributions, the experience of the contributors and the ownership of the content play in the decisions over which contributions become part of Wikipedia and which ones are rejected by the community. We introduce and justify a versatile metric for automatically measuring the quality of a contribution. We find little evidence that experience helps contributors avoid rejection. In fact, as they gain experience, contributors are even more likely to have their work rejected. We also find strong evidence of ownership behaviors in practice despite the fact that ownership of content is discouraged within Wikipedia.


Conference on Computer Supported Cooperative Work | 2013

Making peripheral participation legitimate: reader engagement experiments in Wikipedia

Aaron Halfaker; Oliver Keyes; Dario Taraborelli

Open collaboration communities thrive when participation is plentiful. Recent research has shown that the English Wikipedia community has constructed a vast and accurate information resource primarily through the monumental effort of a relatively small number of active, volunteer editors. Beyond Wikipedia's active editor community is a substantially larger pool of potential participants: readers. In this paper we describe a set of field experiments using the Article Feedback Tool, a system designed to elicit lightweight contributions from Wikipedia's readers. Through the lens of social learning theory and comparisons to related work in open bug tracking software, we evaluate the costs and benefits of the expanded participation model and show both qualitatively and quantitatively that peripheral contributors add value to an open collaboration community as long as the cost of identifying low quality contributions remains low.


International World Wide Web Conference | 2015

User Session Identification Based on Strong Regularities in Inter-activity Time

Aaron Halfaker; Oliver Keyes; Daniel Kluver; Jacob Thebault-Spieker; Tien T. Nguyen; Kenneth Shores; Anuradha Uduwage; Morten Warncke-Wang

Session identification is a common strategy used to develop metrics for web analytics and perform behavioral analyses of user-facing systems. Past work has argued that session identification strategies based on an inactivity threshold are inherently arbitrary, or has advocated that thresholds be set at about 30 minutes. In this work, we demonstrate a strong regularity in the temporal rhythms of user-initiated events across several different domains of online activity (incl. video gaming, search, page views, and volunteer contributions). We describe a methodology for identifying clusters of user activity and argue that the regularity with which these activity clusters appear implies a good rule-of-thumb inactivity threshold of about 1 hour. We conclude with implications that these temporal rhythms may have for system design based on our observations and theories of goal-directed human activity.
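The inactivity-threshold approach this abstract discusses can be sketched as a single pass over sorted event timestamps: any gap longer than the threshold (the paper's rule of thumb is about 1 hour) starts a new session. This is a minimal illustration, not code from the paper; the function name and sample timestamps are invented for the example.

```python
from datetime import datetime, timedelta

def sessionize(timestamps, threshold=timedelta(hours=1)):
    """Group event timestamps into sessions using an inactivity threshold.

    Events separated by more than `threshold` are placed in different
    sessions; everything closer together stays in the same session.
    """
    sessions = []
    for ts in sorted(timestamps):
        if sessions and ts - sessions[-1][-1] <= threshold:
            sessions[-1].append(ts)   # gap within threshold: same session
        else:
            sessions.append([ts])     # gap too large (or first event): new session
    return sessions

events = [
    datetime(2015, 5, 1, 9, 0),
    datetime(2015, 5, 1, 9, 20),   # 20 min after previous: same session
    datetime(2015, 5, 1, 11, 0),   # 1 h 40 min gap: new session
]
print(len(sessionize(events)))  # 2
```

Choosing the threshold is the substantive question the paper addresses: the clustering of inter-activity times it reports is what justifies ~1 hour over the conventional 30 minutes.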


International Symposium on Wikis and Open Collaboration | 2013

When the levee breaks: without bots, what happens to Wikipedia's quality control processes?

R. Stuart Geiger; Aaron Halfaker

In the first half of 2011, ClueBot NG -- one of the most prolific counter-vandalism bots in the English-language Wikipedia -- went down for four distinct periods, each period of downtime lasting from days to weeks. In this paper, we use these periods of breakdown as naturalistic experiments to study Wikipedia's heterogeneous quality control network, which we analyze as a multi-tiered system in which distinct classes of reviewers use various reviewing technologies to patrol for different kinds of damage at staggered time periods. Our analysis shows that the overall time-to-revert for edits almost doubled when this software agent was down. Yet while a significantly smaller proportion of edits made during the bot's downtime were reverted quickly, we found that those edits were eventually reverted. This suggests that other agents in Wikipedia took over this quality control work, but performed it at a far slower rate.


Proceedings of The International Symposium on Open Collaboration | 2014

Accept, decline, postpone: How newcomer productivity is reduced in English Wikipedia by pre-publication review

Jodi Schneider; Bluma Gelley; Aaron Halfaker

Wikipedia needs to attract and retain newcomers while also increasing the quality of its content. Yet new Wikipedia users are disproportionately affected by the quality assurance mechanisms designed to thwart spammers and promoters. English Wikipedia's Articles for Creation provides a protected space for drafting new articles, which are reviewed against minimum quality guidelines before they are published. In this study we explore how this drafting process has affected the productivity of newcomers in Wikipedia. Using a mixed qualitative and quantitative approach, we show how the process's pre-publication review, which is intended to improve the success of newcomers, in fact decreases newcomer productivity in English Wikipedia, and we offer recommendations for system designers.


Conference on Computer Supported Cooperative Work | 2013

Community, impact and credit: where should I submit my papers?

Aaron Halfaker; R. Stuart Geiger; Cliff Lampe; Loren G. Terveen; Amy Bruckman; Brian Keegan; Aniket Kittur; Geraldine Fitzpatrick

We (the authors of CSCW's program) have finite time and energy that can be invested into our publications and the research communities we value. While we want our work to have the most impact possible, we also want to grow and support productive research communities within which to have this impact. This panel discussion explores the costs and benefits of submitting papers to various tiers of conferences and journals surrounding CSCW and reflects on the value of investing hours into building up a research community.


Proceedings of the 2018 ACM Conference on Supporting Groupwork | 2018

Information Fortification: An Online Citation Behavior

Andrea Forte; Nazanin Andalibi; Tim Gorichanaz; Meen Chul Kim; Thomas H. Park; Aaron Halfaker

In this multi-method study, we examine citation activity on English-language Wikipedia to understand how information claims are supported in a non-scientific open collaboration context. We draw on three data sources -- edit logs, interview data, and document analysis -- to present an integrated interpretation of citation activity, and we find pervasive themes related to controversy and conflict. Based on this analysis, we present and discuss information fortification as a concept that explains online citation activity arising from both naturally occurring and manufactured forms of controversy. This analysis challenges a workshop position paper from Group 2005 by Forte and Bruckman, which draws on Latour's sociology of science and citation to explain citation in Wikipedia with a focus on credibility seeking. We discuss how information fortification differs from theories of citation that have arisen from bibliometrics scholarship and are based on scientific citation practices.


Proceedings of the 14th International Symposium on Open Collaboration | 2018

Evaluating the impact of the Wikipedia Teahouse on newcomer socialization and retention

Jonathan T. Morgan; Aaron Halfaker

Effective socialization of new contributors is vital for the long-term sustainability of open collaboration projects. Previous research has identified many common barriers to participation. However, few interventions employed to increase newcomer retention over the long term by improving aspects of the onboarding experience have demonstrated success. This study presents an evaluation of the impact of one such intervention, the Wikipedia Teahouse, on new editor survival. In a controlled experiment, we find that new editors invited to the Teahouse are retained at a higher rate than editors who do not receive an invitation. The effect is observed for both low- and high-activity newcomers, and for both short- and long-term survival.

Collaboration


Dive into Aaron Halfaker's collaborations.

Top Co-Authors

John Riedl

University of Minnesota

Aniket Kittur

Carnegie Mellon University

Robert E. Kraut

Carnegie Mellon University