Publication


Featured research published by Vincent Larivière.


PLOS ONE | 2013

Do altmetrics work? Twitter and ten other social web services.

Mike Thelwall; Stefanie Haustein; Vincent Larivière; Cassidy R. Sugimoto

Altmetric measurements derived from the social web are increasingly advocated and used as early indicators of article impact and usefulness. Nevertheless, there is a lack of systematic scientific evidence that altmetrics are valid proxies of either impact or utility although a few case studies have reported medium correlations between specific altmetrics and citation rates for individual journals or fields. To fill this gap, this study compares 11 altmetrics with Web of Science citations for 76 to 208,739 PubMed articles with at least one altmetric mention in each case and up to 1,891 journals per metric. It also introduces a simple sign test to overcome biases caused by different citation and usage windows. Statistically significant associations were found between higher metric scores and higher citations for articles with positive altmetric scores in all cases with sufficient evidence (Twitter, Facebook wall posts, research highlights, blogs, mainstream media and forums) except perhaps for Google+ posts. Evidence was insufficient for LinkedIn, Pinterest, question and answer sites, and Reddit, and no conclusions should be drawn about articles with zero altmetric scores or the strength of any correlation between altmetrics and citations. Nevertheless, comparisons between citations and metric values for articles published at different times, even within the same year, can remove or reverse this association and so publishers and scientometricians should consider the effect of time when using altmetrics to rank articles. Finally, the coverage of all the altmetrics except for Twitter seems to be low and so it is not clear if they are prevalent enough to be useful in practice.
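The sign test mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual code: the pairing scheme, function names, and example data are assumptions. For articles published at similar times, it counts how often the article with the higher altmetric score also has more citations, then applies a two-sided binomial test.

```python
# Hedged sketch of a sign test for altmetric/citation agreement.
# Pairing scheme and names are illustrative assumptions.
from math import comb

def sign_test_p_value(n_plus, n_minus):
    """Two-sided binomial sign test: probability of an imbalance at least
    this extreme between + and - signs under H0 (p = 0.5)."""
    n = n_plus + n_minus
    k = max(n_plus, n_minus)
    # P(X >= k) under Binomial(n, 0.5), doubled for the two-sided test
    tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

def compare_articles(articles):
    """articles: list of (altmetric_score, citations) for articles published
    at similar times. For each pair, record whether the article with the
    higher altmetric score also has more citations (+) or fewer (-);
    ties on either metric are dropped, as in a standard sign test."""
    n_plus = n_minus = 0
    for i in range(len(articles)):
        for j in range(i + 1, len(articles)):
            (a1, c1), (a2, c2) = articles[i], articles[j]
            if a1 == a2 or c1 == c2:
                continue  # tie: uninformative
            if (a1 > a2) == (c1 > c2):
                n_plus += 1
            else:
                n_minus += 1
    return n_plus, n_minus, sign_test_p_value(n_plus, n_minus)
```

Because the test only uses the sign of each pairwise comparison, it is insensitive to the differing citation and usage windows the authors mention.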


PLOS ONE | 2010

Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research

Yassine Gargouri; Chawki Hajjem; Vincent Larivière; Yves Gingras; Les Carr; Tim Brody; Stevan Harnad

Background: Articles whose authors have supplemented subscription-based access to the publisher's version by self-archiving their own final draft to make it accessible free for all on the web (“Open Access”, OA) are cited significantly more than articles in the same journal and year that have not been made OA. Some have suggested that this “OA Advantage” may not be causal but just a self-selection bias, because authors preferentially make higher-quality articles OA. To test this we compared self-selective self-archiving with mandatory self-archiving for a sample of 27,197 articles published 2002–2006 in 1,984 journals.

Methodology/Principal Findings: The OA Advantage proved just as high for both. Logistic regression analysis showed that the advantage is independent of other correlates of citations (article age; journal impact factor; number of co-authors, references or pages; field; article type; or country) and highest for the most highly cited articles. The OA Advantage is real, independent and causal, but skewed. Its size is indeed correlated with quality, just as citations themselves are (the top 20% of articles receive about 80% of all citations).

Conclusions/Significance: The OA advantage is greater for the more citable articles, not because of a quality bias from authors self-selecting what to make OA, but because of a quality advantage, from users self-selecting what to use and cite, freed by OA from the constraints of selective accessibility to subscribers only. It is hoped that these findings will help motivate the adoption of OA self-archiving mandates by universities, research institutions and research funders.
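The logistic-regression control described above can be sketched in miniature: predict a binary citation outcome from covariates such as OA status. The plain gradient-descent fit, the names, and the toy data below are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a logistic regression like the one the abstract
# describes; a real analysis would include the full covariate list.
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain gradient-descent logistic regression.
    Returns weights; the last entry is the intercept."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)
    for _ in range(epochs):
        grad = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w[:-1], xi)) + w[-1]
            p = 1 / (1 + math.exp(-z))  # predicted probability
            err = p - yi
            for j in range(d):
                grad[j] += err * xi[j]
            grad[-1] += err
        for j in range(d + 1):
            w[j] -= lr * grad[j] / n
    return w

def predict(w, xi):
    """Probability of the positive outcome for one observation."""
    z = sum(wj * xj for wj, xj in zip(w[:-1], xi)) + w[-1]
    return 1 / (1 + math.exp(-z))
```

With OA status coded 0/1 as the single feature, the fitted coefficient plays the role of the "OA Advantage" after the other covariates are added.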


Scientometrics | 2006

Benchmarking Scientific Output in the Social Sciences and Humanities: The Limits of Existing Databases

Éric Archambault; Etienne Vignola-Gagné; Grégoire Côté; Vincent Larivière; Yves Gingras

The goal of this paper is to examine the impact of the linguistic coverage of databases used by bibliometricians on the capacity to effectively benchmark the work of researchers in the social sciences and humanities. We examine the strong link between bibliometrics and the Thomson Scientific databases and review the differences in the production and diffusion of knowledge in the social sciences and humanities (SSH) and the natural sciences and engineering (NSE). This leads to a re-examination of the debate on the coverage of these databases, more specifically in the SSH. The methods section explains how we have compared the coverage of Thomson Scientific databases in the NSE and SSH to Ulrich's extensive database of journals. Our results show that there is a 20 to 25% overrepresentation of English-language journals in Thomson Scientific databases compared to the list of journals in Ulrich's. This paper concludes that, because of this bias, Thomson Scientific databases cannot be used in isolation to benchmark the output of countries in the SSH.


Journal of the Association for Information Science and Technology | 2014

Tweeting biomedicine: An analysis of tweets and citations in the biomedical literature

Stefanie Haustein; Isabella Peters; Cassidy R. Sugimoto; Mike Thelwall; Vincent Larivière

Data collected by social media platforms have been introduced as new sources for indicators to help measure the impact of scholarly research in ways that are complementary to traditional citation analysis. Data generated from social media activities can be used to reflect broad types of impact. This article aims to provide systematic evidence about how often Twitter is used to disseminate information about journal articles in the biomedical sciences. The analysis is based on 1.4 million documents covered by both PubMed and Web of Science and published between 2010 and 2012. The number of tweets containing links to these documents was analyzed and compared to citations to evaluate the degree to which certain journals, disciplines, and specialties were represented on Twitter and how far tweets correlate with citation impact. With less than 10% of PubMed articles mentioned on Twitter, its uptake is low in general but differs between journals and specialties. Correlations between tweets and citations are low, implying that impact metrics based on tweets are different from those based on citations. A framework using the coverage of articles and the correlation between Twitter mentions and citations is proposed to facilitate the evaluation of novel social‐media‐based metrics.
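The tweet-citation correlations the study reports can be illustrated with a rank correlation such as Spearman's rho. This is a sketch only; the study's exact statistic is not reproduced here, and the helper names and data are assumptions.

```python
# Hedged sketch: Spearman rank correlation between tweet counts and
# citation counts, the kind of (typically low) correlation the
# article examines. Data and names are illustrative.

def _ranks(values):
    """Ranks (1-based); tied values get the average rank of their group."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank for the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman's rho: Pearson correlation of the rank-transformed data."""
    rx, ry = _ranks(xs), _ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A rank correlation is the natural choice here because both tweet and citation counts are heavily skewed.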


Journal of the Association for Information Science and Technology | 2006

The place of serials in referencing practices: Comparing natural sciences and engineering with social sciences and humanities

Vincent Larivière; Éric Archambault; Yves Gingras; Etienne Vignola-Gagné

Journal articles constitute the core documents for the diffusion of knowledge in the natural sciences. It has been argued that the same is not true for the social sciences and humanities where knowledge is more often disseminated in monographs that are not indexed in the journal-based databases used for bibliometric analysis. Previous studies have made only partial assessments of the role played by both serials and other types of literature. The importance of journal literature in the various scientific fields has therefore not been systematically characterized. The authors address this issue by providing a systematic measurement of the role played by journal literature in the building of knowledge in both the natural sciences and engineering and the social sciences and humanities. Using citation data from the CD-ROM versions of the Science Citation Index (SCI), Social Science Citation Index (SSCI), and Arts and Humanities Citation Index (AHCI) databases from 1981 to 2000 (Thomson ISI, Philadelphia, PA), the authors quantify the share of citations to both serials and other types of literature. Variations in time and between fields are also analyzed. The results show that journal literature is increasingly important in the natural and social sciences, but that its role in the humanities is stagnant and has even tended to diminish slightly in the 1990s. Journal literature accounts for less than 50% of the citations in several disciplines of the social sciences and humanities; hence, special care should be used when using bibliometric indicators that rely only on journal literature.


Scientometrics | 2009

History of the journal impact factor: Contingencies and consequences

Éric Archambault; Vincent Larivière

This paper examines the genesis of journal impact measures and how their evolution culminated in the journal impact factor (JIF) produced by the Institute for Scientific Information. The paper shows how the various building blocks of the dominant JIF (published in the Journal Citation Reports - JCR) came into being. The paper argues that these building blocks were all constructed fairly arbitrarily, or for purposes different from those that govern the contemporary use of the JIF. The result is a faulty method, widely open to manipulation by journal editors and misuse by uncritical parties. The discussion examines some solutions offered to the bibliometric and scientific communities, considering the wide use of this indicator at present.


Scientometrics | 2006

Canadian collaboration networks: A comparative analysis of the natural sciences, social sciences and the humanities

Vincent Larivière; Yves Gingras; Éric Archambault

A basic dichotomy is generally made between publication practices in the natural sciences and engineering (NSE) on the one hand and the social sciences and humanities (SSH) on the other. However, while researchers in the NSE share some common practices with researchers in the SSH, the spectrum of practices is broader in the latter. Drawing on data from the CD-ROM versions of the Science Citation Index, Social Sciences Citation Index and Arts & Humanities Citation Index from 1980 to 2002, this paper compares collaboration patterns in the SSH to those in the NSE. We show that, contrary to a widely held belief, researchers in the social sciences and the humanities do not form a homogeneous category. In fact, the collaborative activities of researchers in the social sciences are more comparable to those of researchers in the NSE than to those in the humanities. We also see that language and geographical proximity influence the choice of collaborators in the SSH, but also in the NSE. This empirical analysis, which sheds new light on the collaborative activities of researchers in the NSE compared to those in the SSH, may have policy implications, as granting councils in these fields have a tendency to imitate programs developed for the NSE without always taking into account the specificity of the humanities.


PLOS ONE | 2015

The Oligopoly of Academic Publishers in the Digital Era

Vincent Larivière; Stefanie Haustein; Philippe Mongeon

The consolidation of the scientific publishing industry has been the topic of much debate within and outside the scientific community, especially in relation to major publishers’ high profit margins. However, the share of scientific output published in the journals of these major publishers, as well as its evolution over time and across various disciplines, has not yet been analyzed. This paper provides such analysis, based on 45 million documents indexed in the Web of Science over the period 1973-2013. It shows that in both natural and medical sciences (NMS) and social sciences and humanities (SSH), Reed-Elsevier, Wiley-Blackwell, Springer, and Taylor & Francis increased their share of the published output, especially since the advent of the digital era (mid-1990s). Combined, the top five most prolific publishers account for more than 50% of all papers published in 2013. Disciplines of the social sciences have the highest level of concentration (70% of papers from the top five publishers), while the humanities have remained relatively independent (20% from top five publishers). NMS disciplines are in between, mainly because of the strength of their scientific societies, such as the ACS in chemistry or APS in physics. The paper also examines the migration of journals between small and big publishing houses and explores the effect of publisher change on citation impact. It concludes with a discussion on the economics of scholarly publishing.


Journal of the Association for Information Science and Technology | 2015

Who reads research articles? An altmetrics analysis of Mendeley user categories

Ehsan Mohammadi; Mike Thelwall; Stefanie Haustein; Vincent Larivière

Little detailed information is available about who reads research articles and the contexts in which research articles are read. Using data about people who register in Mendeley as readers of articles, this article explores different types of users of Clinical Medicine, Engineering and Technology, Social Science, Physics, and Chemistry articles inside and outside academia. The majority of readers for all disciplines were PhD students, postgraduates, and postdocs, but other types of academics were also represented. In addition, many Clinical Medicine articles were read by medical professionals. The highest correlations between citations and Mendeley readership counts were found for types of users who often authored academic articles, except for associate professors in some sub-disciplines. This suggests that Mendeley readership can reflect usage similar to traditional citation impact, if the data are restricted to readers who are also authors, without the delay of impact measured by citation counts. At the same time, Mendeley statistics can also reveal the hidden impact of some research articles, such as educational value for nonauthor users inside academia or the impact of research articles on practice for readers outside academia.


PLOS ONE | 2015

Characterizing Social Media Metrics of Scholarly Papers: The Effect of Document Properties and Collaboration Patterns

Stefanie Haustein; Rodrigo Costas; Vincent Larivière

A number of new metrics based on social media platforms—grouped under the term “altmetrics”—have recently been introduced as potential indicators of research impact. Despite their current popularity, there is a lack of information regarding the determinants of these metrics. Using publication and citation data from 1.3 million papers published in 2012 and covered in Thomson Reuters’ Web of Science as well as social media counts from Altmetric.com, this paper analyses the main patterns of five social media metrics as a function of document characteristics (i.e., discipline, document type, title length, number of pages and references) and collaborative practices and compares them to patterns known for citations. Results show that the presence of papers on social media is low, with 21.5% of papers receiving at least one tweet, 4.7% being shared on Facebook, 1.9% mentioned on blogs, 0.8% found on Google+ and 0.7% discussed in mainstream media. By contrast, 66.8% of papers have received at least one citation. Our findings show that both citations and social media metrics increase with the extent of collaboration and the length of the references list. On the other hand, while editorials and news items are seldom cited, it is these types of document that are the most popular on Twitter. Similarly, while longer papers typically attract more citations, an opposite trend is seen on social media platforms. Finally, contrary to what is observed for citations, it is papers in the social sciences and humanities that are the most often found on social media platforms. On the whole, these findings suggest that the factors driving social media and citations are different. Therefore, social media metrics cannot actually be seen as alternatives to citations; at most, they may function as complements to other types of indicators.

Collaboration


Dive into Vincent Larivière's collaboration network.

Top Co-Authors

Yves Gingras
Université du Québec à Montréal

Éric Archambault
Université du Québec à Montréal

Benoit Macaluso
Université du Québec à Montréal

Andrew Tsou
Indiana University Bloomington

Mike Thelwall
University of Wolverhampton