Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Jason Priem is active.

Publication


Featured research published by Jason Priem.


PLOS ONE | 2012

The Altmetrics Collection

Jason Priem; Paul T. Groth; Dario Taraborelli

What paper should I read next? Who should I talk to at a conference? Which research group should get this grant? Researchers and funders alike must make daily judgments on how best to spend their limited time and money–judgments that are becoming increasingly difficult as the volume of scholarly communication increases. Not only does the number of scholarly papers continue to grow, it is joined by new forms of communication, from data publications to microblog posts. To deal with incoming information, scholars have always relied upon filters. At first these filters were manually compiled compendia and corpora of the literature. But by the mid-20th century, filters built on manual indexing began to break under the weight of booming postwar science production. Garfield [1] and others pioneered a solution: automated filters that leveraged scientists' own impact judgments, aggregating citations as "pellets of peer recognition" [2]. These citation-based filters have grown dramatically in importance and have become the foundation of how research impact is measured. But, like manual indexing 60 years ago, they may today be failing to keep up with the literature's growing volume, velocity, and diversity [3]. Citations are heavily gamed [4]–[6], are painfully slow to accumulate [7], and overlook increasingly important societal and clinical impacts [8]. Most importantly, they overlook new scholarly forms like datasets, software, and research blogs that fall outside the scope of citable research objects. In sum, citations reflect only formal acknowledgment and thus provide only a partial picture of the science system [9]. Scholars may discuss, annotate, recommend, refute, comment on, read, and teach a new finding before it ever appears in the formal citation registry. We need new mechanisms to create a subtler, higher-resolution picture of the science system.
The Quest for Better Filters

The scientometrics community has not been blind to the limitations of citation measures, and has collectively proposed methods to gather evidence of broader impacts and provide more detail about the science system: tracking acknowledgements [10], patents [11], mentorships [12], news articles [8], usage in syllabuses [13], and many others, separately and in various combinations [14]. The emergence of the Web, a "nutrient-rich space for scholars" [15], has held particular promise for new filters and lenses on scholarly output. Webometrics researchers have uncovered evidence of informal impact by examining networks of hyperlinks and mentions on the broader Web [16]–[18]. An important strand of webometrics has also examined the properties of article download data [7], [19], [20]. The last several years, however, have presented a promising new approach to gathering fine-grained impact data: tracking large-scale activity around scholarly products in online tools and environments. These tools and environments include, among others:

social media like Twitter and Facebook
online reference managers like CiteULike, Zotero, and Mendeley
collaborative encyclopedias like Wikipedia
blogs, both scholarly and general-audience
scholarly social networks, like ResearchGate or Academia.edu
conference organization sites like Lanyrd.com

Growing numbers of scholars are using these and similar tools to mediate their interaction with the literature. In doing so, they are leaving valuable tracks behind them–tracks with potential to show informal paths of influence with unprecedented speed and resolution. Many of these tools offer open APIs, supporting large-scale, automated mining of online activities and conversations around research objects [21]. Altmetrics [22], [23] is the study and use of scholarly impact measures based on activity in online tools and environments.
The term has also been used to describe the metrics themselves–one could propose, in the plural, a "set of new altmetrics." Altmetrics is in most cases a subset of both scientometrics and webometrics; it is a subset of the latter in that it focuses more narrowly on scholarly influence as measured in online tools and environments, rather than on the Web more generally. Altmetrics may support finer-grained maps of science, broader and more equitable evaluations, and improvements to the peer-review system [24]. On the other hand, the use and development of altmetrics should be pursued with appropriate scientific caution. Altmetrics may face attempts at manipulation similar to those Google must deal with in web search ranking. Addressing such manipulation may, in turn, reduce the transparency of altmetrics. New and complex measures may distort our picture of the science system if not rigorously assessed and correctly understood. Finally, altmetrics may promote an evaluation system for scholarship that many argue has already become overly focused on metrics.


PeerJ | 2018

The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles

Heather A. Piwowar; Jason Priem; Vincent Larivière; Juan Pablo Alperin; Lisa Matthias; Bree Norlander; Ashley Farley; Jevin D. West; Stefanie Haustein

Despite growing interest in Open Access (OA) to scholarly literature, there is an unmet need for large-scale, up-to-date, and reproducible studies assessing the prevalence and characteristics of OA. We address this need using oaDOI, an open online service that determines OA status for 67 million articles. We use three samples, each of 100,000 articles, to investigate OA in three populations: (1) all journal articles assigned a Crossref DOI, (2) recent journal articles indexed in Web of Science, and (3) articles viewed by users of Unpaywall, an open-source browser extension that lets users find OA articles using oaDOI. We estimate that at least 28% of the scholarly literature is OA (19M in total) and that this proportion is growing, driven particularly by growth in Gold and Hybrid. The most recent year analyzed (2015) also has the highest percentage of OA (45%). Because of this growth, and the fact that readers disproportionately access newer articles, we find that Unpaywall users encounter OA quite frequently: 47% of articles they view are OA. Notably, the most common mechanism for OA is not Gold, Green, or Hybrid OA, but rather an under-discussed category we dub Bronze: articles made free-to-read on the publisher website, without an explicit Open license. We also examine the citation impact of OA articles, corroborating the so-called open-access citation advantage: accounting for age and discipline, OA articles receive 18% more citations than average, an effect driven primarily by Green and Hybrid OA. We encourage further research using the free oaDOI service, as a way to inform OA policy and practice.


Insights: The UKSG Journal | 2013

Uncovering impacts: a case study in using altmetrics tools

Jason Priem; Cristhian Parra; Heather A. Piwowar; Paul Groth; Andra Waagmeester

Altmetrics were born from a desire to see and measure research impact differently. Complementing traditional citation analysis, altmetrics are intended to reflect broader views of research impact by taking into account the use of digital scholarly communication tools. Aggregating online attention paid to individual scholarly articles and data sets is the approach taken by Altmetric LLP, an altmetrics tool provider. Potential uses for article-level metrics collected by Altmetric include: 1) the assessment of an article's impact within a particular community, 2) the assessment of the overall impact of a body of scholarly work, and 3) the characterization of entire author and reader communities that engage with particular articles online. Although attention metrics are still being refined, qualitative altmetrics data are beginning to illustrate the rich new world of scholarly communication, and are emerging as ways to highlight the immediate societal impacts of research.


Journal of Electronic Publishing | 2014

The Imperative for Open Altmetrics

Stacy Konkiel; Heather A. Piwowar; Jason Priem

If scholarly communication is broken, how will we fix it? At Impactstory—a non-profit devoted to helping scholars gather and share evidence of their research impact by tracking online usage of scholarship via blogs, Wikipedia, Mendeley, and more—we believe that incentivizing web-native research via altmetrics is the place to start. In this article, we describe the current state of the art in altmetrics and its effects on publishing, we share Impactstory’s plan to build an open infrastructure for altmetrics, and describe our company’s ethos and actions. “Scholarly communication is broken.” We’ve heard this refrain for close to twenty years now, but what does it mean? Academic publishing is still mostly a slow, arduous, and closed process. Researchers have little incentive to experiment with new forms of scholarly communication or make their research freely available at the speed of science, since they’re mainly recognized for publishing journal articles and books: a narrow, very traditional form of scholarly impact. Most arguments attribute academic publishing’s problems to a system that benefits corporate interests or to perverse incentives for tenure and promotion. The solution? Open up research and update our incentive systems accordingly. For too long now, academic publishing has relied on a closed infrastructure that was architected to serve commercial interests. Researchers who attempt to practice open science can find it difficult to get recognition for the impact of open access (OA) publications and research products beyond the journal article, products that include scientific software, data, and so on. Some have already imagined a better future for scholarly communication, one where OA is the norm and a new, open infrastructure serves the diverse needs of scholars throughout the research lifecycle. 
The decoupled journal is slowly becoming a reality [1], OA publications continue to gain market share [2], and measuring the impact of a diverse set of scholarly outputs through altmetrics is becoming an increasingly common practice for scholars [3]. We founded Impactstory with this future in mind. Impactstory [http://impactstory.org] is a non-profit, open source web application that helps researchers gather, understand, and share with others the impact of all their scholarly outputs. We believe that Impactstory and other services that serve scholarly communication are essential to the future of academia. In this article, we'll describe the current state of the art in altmetrics and its effects on publishing, share our plan to build an open infrastructure for altmetrics, and describe our company's ethos and actions.

The current publishing ecosystem—and why it needs to be changed

Altmetrics—sometimes called "alternative metrics" and defined by Priem, Piwowar, & Hemminger as social media-based metrics for scholarly works [4]—are having a major effect on traditional scholarly publishing, but not for all of the reasons you might expect. Traditional academic publishers are masters of vertical integration. Once a manuscript is submitted to a traditional journal for publication, that journal coordinates peer review, copy-edits, publishes, markets, manages copyright for, and provides scores of other services [5] for the published article. In general, this system has done its job relatively well to date–publishing pay-to-read journals.
But it has also resulted in a publishing ecosystem that can be harmful to scholars and the public [6]: toll-access journals with exorbitant subscription fees (as the for-profit publishers seek to expand their ever-widening profit margins [7]) and journal impact factors being used as a proxy for the quality of a published article when evaluating scholars' work (not the fault of the publishers, to be sure, but they nonetheless contribute to the problem by promoting and sustaining JIF hype). What if we imagined a web-native publishing ecosystem that functioned in an open, networked manner, similar to how much research itself is conducted nowadays? What if we decoupled the services that many journals provide from the journal itself, and had scores of businesses that could provide many of the essential services that authors need, like peer review, copy editing, and marketing–with less overhead and greater transparency? Such a system has the opportunity to foster a scholarly communication environment that benefits scholars and the public: freeing the literature via Open Access publishing, improving the literature through open and post-publication peer review, and understanding the literature's impact through article-level metrics and altmetrics. Luckily, that new system is in the process of being built. Every day, game-changing publishing services like Publons [https://publons.com/] and Rubriq [http://www.rubriq.com/] (stand-alone peer-review services [8]), Annotum [http://annotum.org/] and PressForward [http://pressforward.org/] (publishing platforms), Dryad [http://datadryad.org/] and Figshare [http://figshare.com/] (data-sharing platforms), and Kudos [https://www.growkudos.com/] (an article marketing service) are debuting.
And altmetrics services like Impactstory [https://impactstory.org/], Altmetric [http://www.altmetric.com/], PlumX [https://plu.mx/], and PLOS ALMs [http://article-levelmetrics.plos.org/] are also starting to be widely adopted, by publishers and scholars alike.

The rise of altmetrics

Altmetrics are a solution to a problem that increasingly plagues scholars: even in situations where scholarship may be best served by publishing a dataset, blog post, or other web-native scholarly product, one's own career is often better served by instead putting that effort into traditional article-writing. If we want to move to a more efficient, web-native science, we must make that dilemma disappear: what is good for scholarship must become good for the scholar. Instead of assessing only paper-native articles, books, and proceedings, we must build a new system where all types of scholarly products are evaluated and rewarded. The key to this new reward system is altmetrics: a broad suite of online impact indicators that goes beyond traditional citations to measure impacts of diverse products, in diverse platforms, on diverse groups of people. [9] Altmetrics leverage the increasing centrality of the Web in scholarly communication, mining evidence of impact across a range of online tools and environments. [Figure: altmetrics sources across online tools and environments] These and other altmetrics promise to bridge the gap between the potential of web-native scholarship and the limitations of the paper-native scholarly reward system. A growing body of research supports the validity and potential usefulness of altmetrics. [10] [11] [12] [13] Eventually, these new metrics may power not only research evaluation, but also web-native filtering and recommendation tools.
[14] [15] [16] However, this vision of efficient, altmetrics-powered, web-native scholarship will not occur accidentally. It requires advocacy to promote the value of altmetrics and web-native scholarship, online tools to demonstrate the immediate value of altmetrics as an assessment approach today, and an open data infrastructure to support developers as they create a new, web-native scholarly ecosystem. This is where Impactstory comes in.


First Monday | 2010

Scientometrics 2.0: New metrics of scholarly impact on the social Web

Jason Priem; Bradley M. Hemminger


arXiv: Digital Libraries | 2012

Altmetrics in the wild: Using social media to explore scholarly impact

Jason Priem; Heather A. Piwowar; Bradley M. Hemminger


ASIS&T '10 Proceedings of the 73rd ASIS&T Annual Meeting on Navigating Streams in an Information Ecosystem - Volume 47 | 2010

How and why scholars cite on Twitter

Jason Priem; Kaitlin Light Costello


Scientometrics | 2014

Coverage and adoption of altmetrics sources in the bibliometric community

Stefanie Haustein; Isabella Peters; Judit Bar-Ilan; Jason Priem; Hadas Shema; Jens Terliesner


arXiv: Digital Libraries | 2012

Beyond citations: Scholars' visibility on the social Web

Judit Bar-Ilan; Stefanie Haustein; Isabella Peters; Jason Priem; Hadas Shema; Jens Terliesner


Association for Information Science and Technology | 2013

The power of altmetrics on a CV

Heather A. Piwowar; Jason Priem

Collaboration


Dive into Jason Priem's collaborations.

Top Co-Authors

Heather A. Piwowar
National Evolutionary Synthesis Center

Bradley M. Hemminger
University of North Carolina at Chapel Hill

Isabella Peters
University of Düsseldorf

Jens Terliesner
University of Düsseldorf

Alex Garnett
University of British Columbia