Publication


Featured research published by Anbang Xu.


Human Factors in Computing Systems | 2014

Show me the money!: an analysis of project updates during crowdfunding campaigns

Anbang Xu; Xiao Yang; Huaming Rao; Wai Tat Fu; Shih Wen Huang; Brian P. Bailey

Hundreds of thousands of crowdfunding campaigns have been launched, but more than half of them have failed. To better understand the factors affecting campaign outcomes, this paper targets the content and usage patterns of project updates -- communications intended to keep potential funders aware of a campaign's progress. We analyzed the content and usage patterns of a large corpus of project updates on Kickstarter, one of the largest crowdfunding platforms. Using semantic analysis techniques, we derived a taxonomy of the types of project updates created during campaigns, and found discrepancies between the design intent of a project update and its various uses in practice (e.g., social promotion). The analysis also showed that specific uses of updates had stronger associations with campaign success than the project's description. Design implications were formulated from the results to help designers better support various uses of updates in crowdfunding campaigns.
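
The abstract does not spell out which semantic analysis techniques were used. As a purely illustrative sketch, and not the paper's method, update texts could be clustered on TF-IDF features and the resulting clusters inspected as raw material for a taxonomy of update types; the sample updates below are made up.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# A few made-up project updates standing in for the Kickstarter corpus.
updates = [
    "We hit our funding goal, thank you all for the support!",
    "New stretch goal unlocked: an extra color option for every backer.",
    "Production update: molds are finished and samples arrive next week.",
    "Please share the campaign with your friends on social media.",
    "Shipping update: rewards for early backers go out this Friday.",
    "Thanks for spreading the word, the retweets really help!",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(updates)

# Cluster updates and print the top terms per cluster as raw material for a taxonomy.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
terms = vectorizer.get_feature_names_out()
for c in range(km.n_clusters):
    top = km.cluster_centers_[c].argsort()[::-1][:4]
    print(f"cluster {c}:", ", ".join(terms[i] for i in top))
```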


Conference on Computer Supported Cooperative Work | 2014

Voyant: generating structured feedback on visual designs using a crowd of non-experts

Anbang Xu; Shih Wen Huang; Brian P. Bailey

Feedback on designs is critical for helping users iterate toward effective solutions. This paper presents Voyant, a novel system giving users access to a non-expert crowd to receive perception-oriented feedback on their designs from a selected audience. Based on a formative study, the system generates the elements seen in a design, the order in which elements are noticed, impressions formed when the design is first viewed, and interpretations of the design relative to guidelines in the domain and the user's stated goals. An evaluation of the system was conducted with users and their designs. Users reported that the feedback about impressions and interpretation of their goals was most helpful, though the other feedback types were also valued. Users found the coordinated views in Voyant useful for analyzing relations between the crowd's perception of a design and the visual elements within it. The cost of generating the feedback was considered a reasonable tradeoff for not having to organize critiques or interrupt peers.


Conference on Computer Supported Cooperative Work | 2012

What do you think?: a case study of benefit, expectation, and interaction in a large online critique community

Anbang Xu; Brian P. Bailey

Critique is an indispensable part of creative work, and many online communities have formed for this shared purpose. As design choices within these communities can impact the effectiveness of the critiques produced, it is important to study these communities and offer guidance for such decisions. In this paper, we report the results of a case study exploring one large online community dedicated to critique in the domain of digital photography. We analyzed a large corpus of interaction data to understand the benefit of participation, the response dynamics, factors predicting critique ratings, and patterns of reciprocal interaction. Interviews with users were also conducted to uncover motives for participation and expectations of the critiques within the community. The results and insights gained from this work were distilled into recommendations for improving the design of systems that support community-based critique of creative artifacts.


Conference on Computer Supported Cooperative Work | 2015

A Classroom Study of Using Crowd Feedback in the Iterative Design Process

Anbang Xu; Huaming Rao; Steven P. Dow; Brian P. Bailey

Crowd feedback systems offer designers an emerging approach for improving their designs, but there is little empirical evidence of the benefit of these systems. This paper reports the results of a study of using a crowd feedback system to iterate on visual designs. Users in an introductory visual design course created initial designs satisfying a design brief and received crowd feedback on the designs. Users revised the designs, and the system was used to generate feedback again. This format enabled us to detect the changes between the initial and revised designs and how the feedback related to those changes. Further, we analyzed the value of crowd feedback by comparing it with expert evaluation and with feedback generated via free-form prompts. Results showed that the crowd feedback system prompted both deep and cosmetic changes and led to improved designs, that the crowd recognized the design improvements, and that structured workflows generated more interpretative, diverse, and critical feedback than free-form prompts.


Conference on Computer Supported Cooperative Work | 2012

A reference-based scoring model for increasing the findability of promising ideas in innovation pipelines

Anbang Xu; Brian P. Bailey

Idea pipelines enable open innovation within organizations but require evaluation teams to assess large numbers of ideas. To help filter promising ideas, community voting is often included as part of the pipeline, but the outcome of the voting rarely aligns with the ideas selected by the team. To address this problem, we introduce a new scoring model for increasing the findability of promising ideas within idea pipelines. In the model, each participant need only score a subset of the ideas, ideas are scored independently, and the individual scores can be aggregated. We tested the model on an authentic data set and found that it filters ideas chosen by an evaluation team better than community votes.
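
The exact formulation of the scoring model is not given in the abstract. The sketch below only illustrates the general idea of aggregating sparse ratings: each rater scores a subset of ideas, per-rater scores are normalized so harsh and lenient raters are comparable, and the normalized scores are averaged per idea. The normalization choice, the data, and the ranking step are assumptions, not the published model.

```python
from collections import defaultdict
from statistics import mean, pstdev

# Sparse ratings: each rater scores only a subset of ideas.
# (idea_id, rater_id, raw_score) triples are made up for illustration.
ratings = [
    ("idea_a", "r1", 4), ("idea_a", "r2", 5),
    ("idea_b", "r1", 2), ("idea_b", "r3", 3),
    ("idea_c", "r2", 3), ("idea_c", "r3", 5),
]

# Collect each rater's scores so harsh and lenient raters can be normalized.
by_rater = defaultdict(list)
for idea, rater, score in ratings:
    by_rater[rater].append(score)

def znorm(rater, score):
    """Z-score a rating against the rater's own score distribution."""
    scores = by_rater[rater]
    sd = pstdev(scores)
    return 0.0 if sd == 0 else (score - mean(scores)) / sd

# Aggregate: average of normalized scores per idea, then rank ideas.
by_idea = defaultdict(list)
for idea, rater, score in ratings:
    by_idea[idea].append(znorm(rater, score))

ranked = sorted(by_idea, key=lambda i: mean(by_idea[i]), reverse=True)
print(ranked)  # ideas ordered from most to least promising under this sketch
```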


Human Factors in Computing Systems | 2011

A crowdsourcing model for receiving design critique

Anbang Xu; Brian P. Bailey

Designers in many domains are increasingly turning to online communities to receive critiques of early design ideas. However, members of these communities may not contribute an effective critique due to limited skills, motivation, or time, and therefore many critiques may not go beyond "I (don't) like it." We propose a new approach for designers to receive online critique. Our approach is novel because it adopts a theoretical framework for effective critique and implements the framework on a popular crowdsourcing platform. Preliminary results show that our approach allows designers to acquire, in a timely manner, quality critiques that compare favorably with critiques produced by a well-known online community.


Human Factors in Computing Systems | 2013

CommunityCompare: visually comparing communities for online community leaders in the enterprise

Anbang Xu; Jilin Chen; Tara Matthews; Michael Muller; Hernan Badenes

Online communities are important in enterprises, helping workers to build skills and collaborate. Despite their unique and critical role in fostering successful communities, community leaders have little direct support in existing technologies. We introduce CommunityCompare, an interactive visual analytic system that enables leaders to make sense of their community's activity through comparisons. Composed of a parallel coordinates plot, various control widgets, and a preview of example posts from communities, the system supports comparisons with hundreds of related communities on multiple metrics and the ability to learn by example. We motivate and inform the system design with formative interviews of community leaders. From additional interviews, a field deployment, and surveys of leaders, we show how the system enabled leaders to assess community performance in the context of other comparable communities, learn about community dynamics through data exploration, and identify examples of top-performing communities from which to learn. We conclude by discussing how our system and design lessons generalize.
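
CommunityCompare is an interactive system, but its central comparison view, one polyline per community across several activity metrics, corresponds to a standard parallel coordinates plot. The sketch below uses pandas and matplotlib with hypothetical metric names and values; it is not the system's code.

```python
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

# Hypothetical activity metrics for a handful of communities; the real system
# compares against hundreds of related communities on metrics drawn from
# formative interviews with community leaders.
df = pd.DataFrame(
    {
        "community": ["mine", "peer_1", "peer_2", "peer_3"],
        "members": [120, 340, 90, 500],
        "posts_per_week": [14, 30, 5, 42],
        "replies_per_post": [2.1, 3.4, 0.8, 4.0],
        "active_fraction": [0.22, 0.35, 0.10, 0.41],
    }
)

# Rescale each metric to 0-1 so axes with different units are comparable.
metrics = df.columns.drop("community")
df[metrics] = (df[metrics] - df[metrics].min()) / (df[metrics].max() - df[metrics].min())

# One polyline per community, one vertical axis per metric.
parallel_coordinates(df, class_column="community", colormap="tab10")
plt.title("Comparing one community against peers (illustrative data)")
plt.tight_layout()
plt.show()
```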


Human Factors in Computing Systems | 2012

Learning how to feel again: towards affective workplace presence and communication technologies

Anbang Xu; Jacob T. Biehl; Eleanor G. Rieffel; Thea Turner; William van Melle

Affect influences workplace collaboration and thereby impacts a workplace's productivity. Participants in face-to-face interactions have many cues to each other's affect, but work is increasingly carried out via computer-mediated channels that lack many of these cues. Current presence systems enable users to estimate the availability of other users, but not their affective states or communication preferences. This work demonstrates the feasibility of estimating affective state and communication preferences from a stream of presence states that are already being shared in a deployed presence system.
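
The abstract does not describe the estimation model itself. A minimal sketch of the general problem shape, assuming hand-picked features over a presence-state stream and self-reported affect labels, might look like the following; the states, features, labels, and data are all hypothetical.

```python
from collections import Counter
from sklearn.linear_model import LogisticRegression

STATES = ["available", "busy", "away", "in_meeting"]

def featurize(stream):
    """Fraction of time in each presence state plus the rate of state transitions."""
    counts = Counter(stream)
    total = len(stream)
    fractions = [counts[s] / total for s in STATES]
    transitions = sum(a != b for a, b in zip(stream, stream[1:]))
    return fractions + [transitions / max(total - 1, 1)]

# Toy presence streams paired with self-reported affect (1 = stressed, 0 = relaxed).
streams = [
    ["busy"] * 6 + ["in_meeting"] * 3 + ["busy"] * 3,
    ["available"] * 8 + ["away"] * 4,
    ["busy", "available"] * 6,
    ["available"] * 10 + ["busy"] * 2,
]
labels = [1, 0, 1, 0]

# Fit a simple classifier and estimate affect for an unseen stream.
model = LogisticRegression().fit([featurize(s) for s in streams], labels)
print(model.predict([featurize(["busy"] * 5 + ["in_meeting"] * 5)]))
```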


International Conference on Social Computing | 2014

Emerging Dynamics in Crowdfunding Campaigns

Huaming Rao; Anbang Xu; Xiao Yang; Wai Tat Fu

Crowdfunding platforms are becoming increasingly popular for fund-raising of entrepreneurial ventures, but the success rate of crowdfunding campaigns is found to be less than 50%. Recent research has shown that, in addition to the quality and representations of project ideas, the dynamics of investment during a crowdfunding campaign also play an important role in determining its success. To further understand the role of investment dynamics, we did an exploratory analysis of the time series of money pledges to campaigns on Kickstarter to investigate the extent to which simple inflows and first-order derivatives can predict the eventual success of campaigns. Using decision tree models, we found that there were discrete stages in money pledges that predicted the success of crowdfunding campaigns. Specifically, for the majority of projects that had the default campaign duration of one month on Kickstarter, money inflows occurring in the initial 10% and the 40-60% interval of the campaign duration, together with the first-order derivative of inflow at 95-100% of the duration, had the strongest impact on the success of campaigns. In addition, a predictor built using only the initial 15% of money inflows, which could be regarded as “seed money”, correctly predicted the success of 84% of campaigns. Implications of the current results for crowdfunding campaigns are also discussed.
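
The abstract names the kinds of signals that mattered (inflow in the initial 10% and the 40-60% interval, and the first-order derivative near the end), so a minimal sketch of such a pipeline might look like the following. The synthetic pledge series, feature definitions, and model settings are illustrative assumptions, not the study's actual analysis.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def features(pledges):
    """Features from a cumulative pledge series, normalized to the campaign duration."""
    n = len(pledges)
    cum = np.cumsum(pledges)
    total = cum[-1] if cum[-1] > 0 else 1.0
    early = cum[int(0.10 * n)] / total                           # inflow in the initial 10%
    middle = (cum[int(0.60 * n)] - cum[int(0.40 * n)]) / total   # inflow during 40-60%
    late_slope = (cum[-1] - cum[int(0.95 * n)]) / total          # derivative at 95-100%
    return [early, middle, late_slope]

# Synthetic campaigns: 30-day series of daily pledges plus a success label.
X, y = [], []
for _ in range(200):
    success = rng.integers(0, 2)
    daily = rng.exponential(100, size=30)
    if success:
        daily[:3] *= 4    # successful campaigns tend to attract early "seed money"
        daily[-2:] *= 3   # and a late surge
    X.append(features(daily))
    y.append(success)

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```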


Conference on Computer Supported Cooperative Work | 2014

A system for receiving crowd feedback on visual designs

Anbang Xu; Brian P. Bailey

This paper proposes a demonstration of Voyant, a novel system giving users access to a non-expert crowd to receive structured feedback on the perceptions of their designs from a target audience. Voyant generates the elements seen in a design, the order in which elements are noticed, impressions formed when the design is first viewed, and interpretations of the design relative to guidelines in the domain and the user's stated goals. The coordinated views in Voyant are designed to help users analyze relations between the crowd's perception of a design and the visual elements within it.

Collaboration


Dive into Anbang Xu's collaborations.

Top Co-Authors

Huaming Rao, Nanjing University of Science and Technology
Xiao Yang, Pennsylvania State University
Steven P. Dow, University of California
Thea Turner, FX Palo Alto Laboratory