Publication


Featured research published by Stefanie Haustein.


PLOS ONE | 2013

Do altmetrics work? Twitter and ten other social web services.

Mike Thelwall; Stefanie Haustein; Vincent Larivière; Cassidy R. Sugimoto

Altmetric measurements derived from the social web are increasingly advocated and used as early indicators of article impact and usefulness. Nevertheless, there is a lack of systematic scientific evidence that altmetrics are valid proxies of either impact or utility although a few case studies have reported medium correlations between specific altmetrics and citation rates for individual journals or fields. To fill this gap, this study compares 11 altmetrics with Web of Science citations for 76 to 208,739 PubMed articles with at least one altmetric mention in each case and up to 1,891 journals per metric. It also introduces a simple sign test to overcome biases caused by different citation and usage windows. Statistically significant associations were found between higher metric scores and higher citations for articles with positive altmetric scores in all cases with sufficient evidence (Twitter, Facebook wall posts, research highlights, blogs, mainstream media and forums) except perhaps for Google+ posts. Evidence was insufficient for LinkedIn, Pinterest, question and answer sites, and Reddit, and no conclusions should be drawn about articles with zero altmetric scores or the strength of any correlation between altmetrics and citations. Nevertheless, comparisons between citations and metric values for articles published at different times, even within the same year, can remove or reverse this association and so publishers and scientometricians should consider the effect of time when using altmetrics to rank articles. Finally, the coverage of all the altmetrics except for Twitter seems to be low and so it is not clear if they are prevalent enough to be useful in practice.
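The abstract does not spell out how the sign test is constructed. As an illustration only, the following Python sketch assumes that each article with at least one altmetric mention is compared against the mean citation count of the articles published immediately before and after it, so that differing citation and usage windows largely cancel out, and then applies an exact two-sided binomial test. This is an assumed reading, not necessarily the authors' exact procedure.

# Illustrative sketch (not the authors' exact implementation): a sign test that
# compares each mentioned article against temporally adjacent articles.
from math import comb
from typing import Sequence

def sign_test_vs_neighbours(citations: Sequence[int], mentioned: Sequence[bool]) -> float:
    """Two-sided exact binomial sign test.

    `citations` is assumed to be ordered by publication date; `mentioned[i]`
    is True if article i received at least one altmetric mention. Each
    mentioned article is compared with the mean citations of its immediate
    neighbours; ties are dropped.
    """
    wins = losses = 0
    for i, has_mention in enumerate(mentioned):
        if not has_mention or i == 0 or i == len(citations) - 1:
            continue
        baseline = (citations[i - 1] + citations[i + 1]) / 2
        if citations[i] > baseline:
            wins += 1
        elif citations[i] < baseline:
            losses += 1
    n = wins + losses
    if n == 0:
        return 1.0
    k = max(wins, losses)
    # exact binomial tail under H0: P(win) = 0.5, doubled for a two-sided test
    p = sum(comb(n, j) for j in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * p)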


Journal of the Association for Information Science and Technology | 2014

Tweeting biomedicine: An analysis of tweets and citations in the biomedical literature

Stefanie Haustein; Isabella Peters; Cassidy R. Sugimoto; Mike Thelwall; Vincent Larivière

Data collected by social media platforms have been introduced as new sources for indicators to help measure the impact of scholarly research in ways that are complementary to traditional citation analysis. Data generated from social media activities can be used to reflect broad types of impact. This article aims to provide systematic evidence about how often Twitter is used to disseminate information about journal articles in the biomedical sciences. The analysis is based on 1.4 million documents covered by both PubMed and Web of Science and published between 2010 and 2012. The number of tweets containing links to these documents was analyzed and compared to citations to evaluate the degree to which certain journals, disciplines, and specialties were represented on Twitter and how far tweets correlate with citation impact. With less than 10% of PubMed articles mentioned on Twitter, its uptake is low in general but differs between journals and specialties. Correlations between tweets and citations are low, implying that impact metrics based on tweets are different from those based on citations. A framework using the coverage of articles and the correlation between Twitter mentions and citations is proposed to facilitate the evaluation of novel social‐media‐based metrics.
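As a rough illustration of the two quantities such a framework would combine, the sketch below computes coverage (the share of articles with at least one tweet) and the Spearman rank correlation between tweet counts and citation counts for a set of articles. The function name and inputs are hypothetical and not taken from the paper.

# Hedged sketch: coverage and tweet-citation rank correlation for one journal
# or specialty, from parallel lists of per-article tweet and citation counts.
from scipy.stats import spearmanr

def coverage_and_correlation(tweets: list[int], citations: list[int]) -> tuple[float, float]:
    coverage = sum(t > 0 for t in tweets) / len(tweets)   # share of articles tweeted at least once
    rho, _ = spearmanr(tweets, citations)                 # rank correlation with citations
    return coverage, rho

# Example: low coverage and a weak correlation, as reported for most specialties.
print(coverage_and_correlation([0, 0, 1, 0, 3, 0, 0, 0, 0, 2],
                               [4, 0, 2, 7, 5, 1, 0, 3, 6, 2]))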


PLOS ONE | 2015

The Oligopoly of Academic Publishers in the Digital Era

Vincent Larivière; Stefanie Haustein; Philippe Mongeon

The consolidation of the scientific publishing industry has been the topic of much debate within and outside the scientific community, especially in relation to major publishers’ high profit margins. However, the share of scientific output published in the journals of these major publishers, as well as its evolution over time and across various disciplines, has not yet been analyzed. This paper provides such analysis, based on 45 million documents indexed in the Web of Science over the period 1973-2013. It shows that in both natural and medical sciences (NMS) and social sciences and humanities (SSH), Reed-Elsevier, Wiley-Blackwell, Springer, and Taylor & Francis increased their share of the published output, especially since the advent of the digital era (mid-1990s). Combined, the top five most prolific publishers account for more than 50% of all papers published in 2013. Disciplines of the social sciences have the highest level of concentration (70% of papers from the top five publishers), while the humanities have remained relatively independent (20% from top five publishers). NMS disciplines are in between, mainly because of the strength of their scientific societies, such as the ACS in chemistry or APS in physics. The paper also examines the migration of journals between small and big publishing houses and explores the effect of publisher change on citation impact. It concludes with a discussion on the economics of scholarly publishing.


Journal of the Association for Information Science and Technology | 2015

Who reads research articles? An altmetrics analysis of Mendeley user categories

Ehsan Mohammadi; Mike Thelwall; Stefanie Haustein; Vincent Larivière

Little detailed information is known about who reads research articles and the contexts in which research articles are read. Using data about people who register in Mendeley as readers of articles, this article explores different types of users of Clinical Medicine, Engineering and Technology, Social Science, Physics, and Chemistry articles inside and outside academia. The majority of readers for all disciplines were PhD students, postgraduates, and postdocs but other types of academics were also represented. In addition, many Clinical Medicine articles were read by medical professionals. The highest correlations between citations and Mendeley readership counts were found for types of users who often authored academic articles, except for associate professors in some sub‐disciplines. This suggests that Mendeley readership can reflect usage similar to traditional citation impact if the data are restricted to readers who are also authors without the delay of impact measured by citation counts. At the same time, Mendeley statistics can also reveal the hidden impact of some research articles, such as educational value for nonauthor users inside academia or the impact of research articles on practice for readers outside academia.


PLOS ONE | 2015

Characterizing Social Media Metrics of Scholarly Papers: The Effect of Document Properties and Collaboration Patterns

Stefanie Haustein; Rodrigo Costas; Vincent Larivière

A number of new metrics based on social media platforms—grouped under the term “altmetrics”—have recently been introduced as potential indicators of research impact. Despite their current popularity, there is a lack of information regarding the determinants of these metrics. Using publication and citation data from 1.3 million papers published in 2012 and covered in Thomson Reuters’ Web of Science as well as social media counts from Altmetric.com, this paper analyses the main patterns of five social media metrics as a function of document characteristics (i.e., discipline, document type, title length, number of pages and references) and collaborative practices and compares them to patterns known for citations. Results show that the presence of papers on social media is low, with 21.5% of papers receiving at least one tweet, 4.7% being shared on Facebook, 1.9% mentioned on blogs, 0.8% found on Google+ and 0.7% discussed in mainstream media. By contrast, 66.8% of papers have received at least one citation. Our findings show that both citations and social media metrics increase with the extent of collaboration and the length of the references list. On the other hand, while editorials and news items are seldom cited, these document types are the most popular on Twitter. Similarly, while longer papers typically attract more citations, an opposite trend is seen on social media platforms. Finally, contrary to what is observed for citations, papers in the social sciences and humanities are the most often found on social media platforms. On the whole, these findings suggest that the factors driving social media mentions and citations are different. Therefore, social media metrics cannot actually be seen as alternatives to citations; at most, they may function as complements to other types of indicators.


Journal of Informetrics | 2011

Applying social bookmarking data to evaluate journal usage

Stefanie Haustein; Tobias Siebenlist

Web 2.0 technologies are finding their way into academia: specialized social bookmarking services allow researchers to store and share scientific literature online. By bookmarking and tagging articles, academic prosumers generate new information about resources, i.e. usage statistics and content descriptions of scientific journals. Given the lack of global download statistics, the authors propose applying social bookmarking data to journal evaluation. For a set of 45 physics journals, all 13,608 bookmarks from CiteULike, Connotea and BibSonomy to documents published between 2004 and 2008 were analyzed. This article explores bookmarking data in STM and examines to what extent it can be used to describe the perception of periodicals by their readership. Four basic indicators are defined, which capture different aspects of usage: Usage Ratio, Usage Diffusion, Article Usage Intensity and Journal Usage Intensity. Tags are analyzed to describe a reader-specific view of journal content.
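The abstract names the four indicators but does not reproduce their formulas. The sketch below assumes plausible definitions (share of published articles bookmarked at least once, number of distinct bookmarking users, and bookmarks per bookmarked or published article); the authors' exact operationalizations may differ.

# Assumed definitions of the four journal usage indicators; not verified
# against the paper's formulas.
from dataclasses import dataclass

@dataclass
class JournalUsage:
    published_articles: int    # articles published by the journal in the window
    bookmarked_articles: int   # articles with at least one bookmark
    bookmarks: int             # total bookmarks to the journal's articles
    distinct_users: int        # unique users who bookmarked any of them

    def usage_ratio(self) -> float:
        return self.bookmarked_articles / self.published_articles

    def usage_diffusion(self) -> int:
        return self.distinct_users

    def article_usage_intensity(self) -> float:
        return self.bookmarks / self.bookmarked_articles if self.bookmarked_articles else 0.0

    def journal_usage_intensity(self) -> float:
        return self.bookmarks / self.published_articles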


Information Technology | 2014

Tweets vs. Mendeley readers: How do these two social media metrics differ?

Stefanie Haustein; Vincent Larivière; Mike Thelwall; Didier Amyot; Isabella Peters

A set of 1.4 million biomedical papers was analyzed with regard to how often articles are mentioned on Twitter or saved by users on Mendeley. While Twitter is a microblogging platform used by a general audience to distribute information, Mendeley is a reference manager targeted at an academic user group to organize scholarly literature. Both platforms are used as sources for so-called “altmetrics” to measure a new kind of research impact. This analysis shows to what extent the two metrics differ and how they compare to traditional citation impact metrics, based on a large set of PubMed papers.


Journal of the Association for Information Science and Technology | 2016

Tweets as impact indicators: Examining the implications of automated bot accounts on Twitter

Stefanie Haustein; Timothy D. Bowman; Kim Holmberg; Andrew Tsou; Cassidy R. Sugimoto; Vincent Larivière

This brief communication presents preliminary findings on automated Twitter accounts distributing links to scientific articles deposited on the preprint repository arXiv. It discusses the implications of the presence of such bots from the perspective of social media metrics (altmetrics), where mentions of scholarly documents on Twitter have been suggested as a means of measuring impact that is both broader and timelier than citations. Our results show that automated Twitter accounts create a considerable number of tweets linking to scientific articles and that they behave differently than common social bots, which has critical implications for the use of raw tweet counts in research evaluation and assessment. We discuss some definitions of Twitter cyborgs and bots in scholarly communication and propose distinguishing between different levels of engagement, that is, differentiating between tweeting only bibliographic information and discussing or commenting on the content of a scientific work.


Scientometrics | 2016

Grand challenges in altmetrics: heterogeneity, data quality and dependencies

Stefanie Haustein

With increasing uptake among researchers, social media are finding their way into scholarly communication and, under the umbrella term altmetrics, are starting to be utilized in research evaluation. Fueled by technological possibilities and an increasing demand to demonstrate impact beyond the scientific community, altmetrics have received great attention as potential democratizers of the scientific reward system and indicators of societal impact. This paper focuses on the current challenges for altmetrics. Heterogeneity, data quality and particular dependencies are identified as the three major issues and discussed in detail with an emphasis on past developments in bibliometrics. The heterogeneity of altmetrics reflects the diversity of the acts and online events, most of which take place on social media platforms. This heterogeneity has made it difficult to establish a common definition or conceptual framework. Data quality issues become apparent in the lack of accuracy, consistency and replicability of various altmetrics, which is largely affected by the dynamic nature of social media events. Furthermore, altmetrics are shaped by technical possibilities and are particularly dependent on the availability of APIs and DOIs, strongly dependent on data providers and aggregators, and potentially influenced by the technical affordances of underlying platforms.


Journal of the Association for Information Science and Technology | 2017

Scholarly use of social media and altmetrics: a review of the literature

Cassidy R. Sugimoto; Sam Work; Vincent Larivière; Stefanie Haustein

Social media has become integrated into the fabric of the scholarly communication system in fundamental ways, principally through scholarly use of social media platforms and the promotion of new indicators on the basis of interactions with these platforms. Research and scholarship in this area have accelerated since the coining of and subsequent advocacy for altmetrics—that is, research indicators based on social media activity. This review provides an extensive account of the state of the art in both scholarly use of social media and altmetrics. The review consists of two main parts: the first examines the use of social media in academia, reviewing the various functions these platforms have in the scholarly communication process and the factors that affect this use. The second part reviews empirical studies of altmetrics, discussing the various interpretations of altmetrics, data collection and methodological limitations, and differences according to platform. The review ends with a critical discussion of the implications of this transformation in the scholarly communication system.

Collaboration


Dive into Stefanie Haustein's collaborations.

Top Co-Authors

Timothy D. Bowman (Indiana University Bloomington)
Isabella Peters (University of Düsseldorf)
Mike Thelwall (University of Wolverhampton)
Andrew Tsou (Indiana University Bloomington)