Publication


Featured research published by Liz Allen.


PLOS ONE | 2009

Looking for landmarks: the role of expert review and bibliometric analysis in evaluating scientific publication outputs.

Liz Allen; Ceri Jones; Kevin Dolby; David M. Lynn; Mark Walport

Objective To compare expert assessment with bibliometric indicators as tools to assess the quality and importance of scientific research papers. Methods and Materials Shortly after their publication in 2005, the quality and importance of a cohort of nearly 700 Wellcome Trust (WT) associated research papers were assessed by expert reviewers; each paper was reviewed by two WT expert reviewers. After 3 years, we compared this initial assessment with other measures of paper impact. Results Shortly after publication, 62 (9%) of the 687 research papers were determined to describe at least a ‘major addition to knowledge’; 6 were thought to be ‘landmark’ papers. At an aggregate level, after 3 years, there was a strong positive association between expert assessment and impact as measured by number of citations and F1000 rating. However, there were some important exceptions, indicating that bibliometric measures may not be sufficient in isolation as measures of research quality and importance, especially for assessing single papers or small groups of research publications. Conclusion When attempting to assess the quality and importance of research papers, we found that sole reliance on bibliometric indicators would have led us to miss papers containing important results as judged by expert review. In particular, some papers that were highly rated by experts were not highly cited during the first three years after publication. Tools that link expert peer reviews of research paper quality and importance to more quantitative indicators, such as citation analysis, would be valuable additions to the field of research assessment and evaluation.


PLOS Biology | 2014

Alternative Perspectives on Impact: The Potential of ALMs and Altmetrics to Inform Funders about Research Impact

Adam Dinsmore; Liz Allen; Kevin Dolby

More evidence of the meaning and validity of ALMs and altmetrics, coupled with greater consistency and transparency in their presentation, would enable research funders to explore their potential value and identify appropriate use cases.


Learned Publishing | 2015

Beyond authorship: attribution, contribution, collaboration, and credit

Amy Brand; Liz Allen; Micah Altman; Marjorie Hlava; Jo Scott

As the number of authors on scientific publications increases, ordered lists of author names are proving inadequate for the purposes of attribution and credit. A multi‐stakeholder group has produced a contributor role taxonomy for use in scientific publications. Identifying specific contributions to published research will lead to appropriate credit, fewer author disputes, and fewer disincentives to collaboration and the sharing of data and code.


Health Research Policy and Systems | 2012

Mapping global health research investments, time for new thinking - A Babel Fish for research data

Robert F Terry; Liz Allen; Charles Gardner; Javier Guzman; M. Moran; Roderik F Viergever

Today we have an incomplete picture of how much the world is spending on health and disease-related research and development (R&D). As such, it is difficult to align, or even begin to coordinate, health R&D investments with international public health priorities. Current efforts to track and map global health research investments are complex, resource-intensive, and caveat-laden. An ideal situation would be for all research funding to be classified using a set of common standards and definitions. However, the adoption of such a standard by everyone is not a realistic, pragmatic or even necessary goal. It is time for new thinking informed by the innovations in automated online translation - e.g. Yahoo's Babel Fish. We propose a feasibility study to develop a system that can translate and map the diverse research classification systems into a common standard, allowing the targeting of scarce research investments to where they are needed most.


BMJ Open | 2012

Tracking the impact of research on policy and practice: investigating the feasibility of using citations in clinical guidelines for research evaluation

David Kryl; Liz Allen; Kevin Dolby; Beverley Sherbon; Ian Viney

Objectives To investigate the feasibility of using research papers cited in clinical guidelines as a way to track the impact of particular funding streams or sources. Setting In recent years, medical research funders have made efforts to enhance the understanding of the impact of their funded research and to provide evidence of the ‘value’ of investments in particular areas of research. One of the most challenging areas of research evaluation is around impact on policy and practice. In the UK, the National Institute for Health and Clinical Excellence (NICE) provides clinical guidelines, which bring together current high-quality evidence on the diagnosis and treatment of clinical problems. Research referenced in these guidelines is an indication of its potential to have real impact on health policy and practice. Design This study is based on analysis of the authorship and funding attribution of research cited in two NICE clinical guidelines: dementia and chronic obstructive pulmonary disease. Results Analysis identified that around a third of papers cited in the two NICE guidelines had at least one author based in the UK. In both cases, about half of these UK-attributed papers contained acknowledgements which allowed the source of funding for the research to be identified. The research cited in these guidelines was found to have been supported by a diverse set of funders from different sectors. The study also investigated the contribution of research groups based in universities, industry and the public sector. Conclusions The study found that there is great potential for guidelines to be used as sources of information on the quality of the research used in their development, and that it is possible to track the source of the funding of the research. The challenge is in harnessing the relevant information to track this in an efficient way.


Research Evaluation | 1999

Evaluating high risk research: an assessment of the Wellcome Trust's Sir Henry Wellcome Commemorative Awards for Innovative Research

Jonathan Grant; Liz Allen

In 1996, the Wellcome Trust set up an annual competition to fund high-risk, non-obvious research in the biomedical sciences, called Showcase awards. The evaluation at the end of the first year involved an experiment to assess how innovative the Showcase awards were perceived to be, in comparison with a sample of standard project grants. Expert panel members were asked to assess how ‘risky’, ‘novel’, ‘speculative’, ‘adventurous’ and ‘innovative’ each of five Showcase and five project grants were. The results showed that Showcase is fulfilling its objective of supporting high-risk research, and also that it is possible to apply novel techniques to evaluate unusual schemes. By applying epidemiological methods, in the form of a masked randomised trial, as much systematic error was eliminated as possible, thus making the result more robust.


Bulletin of The World Health Organization | 2010

The art of evaluating the impact of medical science

Liz Allen

The medical research community has long considered research to be vital to the health and wealth of societies, supporting the view attributed to Mary Lasker, American philanthropist and ardent campaigner for medical research: “If you think research is expensive, try disease.” However, in recent years this community has come under increasing pressure to demonstrate the value and extent of the impacts of its labour.


Archive | 2015

The metric tide: report of the independent review of the role of metrics in research assessment and management

James Wilsdon; Liz Allen; Eleonora Belfiore; Philip Campbell; Stephen Curry; Steven A. Hill; Richard Jones; Roger Kain; Simon Kerridge; Mike Thelwall; Jane Tinkler; Ian Viney; Paul Wouters; Jude Hill; Ben Johnson


Nature | 2014

Publishing: Credit where credit is due

Liz Allen; Jo Scott; Amy Brand; Marjorie Hlava; Micah Altman


F1000Research | 2017

Assessment of Agreement Between Two Reviewers in the Open Post-publication Peer Review Process of F1000Research

Tiago Barros; Liz Allen

Collaboration


Dive into Liz Allen's collaborations.

Top Co-Authors

Amy Brand
Massachusetts Institute of Technology

Micah Altman
Massachusetts Institute of Technology

Ian Viney
Medical Research Council