
Publication


Featured research published by Michael Drummond.


Value in Health | 2013

Consolidated Health Economic Evaluation Reporting Standards (CHEERS)—Explanation and Elaboration: A Report of the ISPOR Health Economic Evaluation Publication Guidelines Good Reporting Practices Task Force

Don Husereau; Michael Drummond; Stavros Petrou; Chris Carswell; David Moher; Dan Greenberg; Federico Augustovski; Andrew Briggs; Josephine Mauskopf; Elizabeth Loder

BACKGROUND: Economic evaluations of health interventions pose a particular challenge for reporting because substantial information must be conveyed to allow scrutiny of study findings. Despite a growth in published reports, existing reporting guidelines are not widely adopted. There is also a need to consolidate and update existing guidelines and promote their use in a user-friendly manner. A checklist is one way to help authors, editors, and peer reviewers use guidelines to improve reporting.

OBJECTIVE: The task force's overall goal was to provide recommendations to optimize the reporting of health economic evaluations. The Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement is an attempt to consolidate and update previous health economic evaluation guidelines into one current, useful reporting guidance. The CHEERS Elaboration and Explanation Report of the ISPOR Health Economic Evaluation Publication Guidelines Good Reporting Practices Task Force facilitates the use of the CHEERS statement by providing examples and explanations for each recommendation. The primary audiences for the CHEERS statement are researchers reporting economic evaluations and the editors and peer reviewers assessing them for publication.

METHODS: The need for new reporting guidance was identified by a survey of medical editors. Previously published checklists or guidance documents related to reporting economic evaluations were identified from a systematic review and subsequent survey of task force members. A list of possible items from these efforts was created. A two-round, modified Delphi panel with representatives from academia, clinical practice, industry, and government, as well as the editorial community, was used to identify a minimum set of items important for reporting from the larger list.

RESULTS: Out of 44 candidate items, 24 items and accompanying recommendations were developed, with some specific recommendations for single study-based and model-based economic evaluations. The final recommendations are subdivided into six main categories: 1) title and abstract, 2) introduction, 3) methods, 4) results, 5) discussion, and 6) other. The recommendations are contained in the CHEERS statement, a user-friendly 24-item checklist. The task force report provides explanation and elaboration, as well as an example for each recommendation. The ISPOR CHEERS statement is available online via Value in Health or the ISPOR Health Economic Evaluation Publication Guidelines Good Reporting Practices - CHEERS Task Force webpage (http://www.ispor.org/TaskForces/EconomicPubGuidelines.asp).

CONCLUSIONS: We hope that the ISPOR CHEERS statement and the accompanying task force report guidance will lead to more consistent and transparent reporting, and ultimately, better health decisions. To facilitate wider dissemination and uptake of this guidance, we are copublishing the CHEERS statement across 10 health economics and medical journals. We encourage other journals and groups to consider endorsing the CHEERS statement. The author team plans to review the checklist for an update in 5 years.


Health Economics | 1997

Modelling in Economic Evaluation: An Unavoidable Fact of Life

Martin Buxton; Michael Drummond; Ben van Hout; Richard L. Prince; Trevor Sheldon; Thomas Szucs; Muriel Vray

The role of modelling in economic evaluation is explored by discussing, with examples, the uses of models. The expanded use of pragmatic clinical trials as an alternative to models is discussed. Some suggestions for good modelling practice are made.


The Lancet | 2002

A rational framework for decision making by the National Institute For Clinical Excellence (NICE)

Karl Claxton; Mark Sculpher; Michael Drummond

Regulatory and reimbursement authorities face uncertain choices when considering the adoption of health-care technologies. In this Viewpoint, we present an analytic framework that separates the issue of whether a technology should be adopted on the basis of existing evidence from whether more research should be demanded to support future decisions. We show the application of this framework to the assessment of health-care technologies using a published analysis of a new drug treatment for Alzheimer's disease. The results of the analysis show that the amount and type of evidence required to support the adoption of a health technology will differ substantially between technologies with different characteristics. Additionally, the analysis can be used to aid the efficient design of research. We discuss the implications of adoption of this new framework for regulatory and reimbursement decisions.
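The separation of "adopt now" from "demand more research" described above is typically operationalised through value-of-information analysis. As a rough illustration (not the authors' own method or code), the per-decision expected value of perfect information (EVPI) can be estimated from simulated net-benefit pairs; the two-option setup and the `nb_samples` input here are assumptions made for the sketch:

```python
def evpi(nb_samples):
    """Per-decision expected value of perfect information.

    nb_samples: (net_benefit_current, net_benefit_new) pairs, one per
    draw from the joint distribution of uncertain model parameters.
    With perfect information the best option could be chosen in every
    draw; with current information one option must be chosen up front.
    """
    n = len(nb_samples)
    e_with_perfect_info = sum(max(a, b) for a, b in nb_samples) / n
    e_current_info = max(sum(a for a, _ in nb_samples) / n,
                         sum(b for _, b in nb_samples) / n)
    return e_with_perfect_info - e_current_info
```

A positive EVPI indicates that further research could be worthwhile; when one option has the higher net benefit in every draw, EVPI is zero and adoption can proceed on existing evidence.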


Medical Care | 1994

In Search of Power and Significance: Issues in the Design and Analysis of Stochastic Cost-effectiveness Studies in Health Care

Bernie J. O'Brien; Michael Drummond; Roberta Labelle; Andrew R. Willan

Application of techniques such as cost-effectiveness analysis (CEA) is growing rapidly in health care. There are two general approaches to analysis: deterministic models based upon assumptions and secondary analysis of retrospective data, and prospective stochastic analyses in which the design of a clinical experiment such as a randomised controlled trial is adapted to collect patient-specific data on costs and effects. An important methodological difference between these two approaches is in the quantification and analysis of uncertainty. Whereas the traditional CEA model utilizes sensitivity analysis, the mean-variance data on costs and effects from a prospective trial present the opportunity to analyze cost-effectiveness using conventional inferential statistical methods. In this study we explored some of the implications of moving economic appraisal away from deterministic models and toward the experimental paradigm. Our specific focus was on the feasibility and desirability of constructing statistical tests of economic hypotheses and estimation of cost-effectiveness ratios with associated 95% confidence intervals. We show how relevant variances can be estimated for this task and discuss the implications for the design and analysis of prospective economic studies.
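One widely used way to attach a 95% confidence interval to a cost-effectiveness ratio from patient-level trial data is the nonparametric bootstrap. The sketch below is illustrative only (it is not the estimator derived in the paper): it resamples whole patients so the within-patient correlation between costs and effects is preserved, then takes percentile bounds of the resampled ICERs. All function names and inputs are assumptions for the example:

```python
import random

def mean(xs):
    return sum(xs) / len(xs)

def icer(ca, ea, cb, eb):
    """Incremental cost-effectiveness ratio of option B vs. option A."""
    return (mean(cb) - mean(ca)) / (mean(eb) - mean(ea))

def bootstrap_icer_ci(costs_a, effects_a, costs_b, effects_b,
                      n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI for the ICER. Patients are resampled
    as (cost, effect) pairs to keep their correlation intact."""
    rng = random.Random(seed)
    na, nb = len(costs_a), len(costs_b)
    reps = []
    for _ in range(n_boot):
        ia = [rng.randrange(na) for _ in range(na)]
        ib = [rng.randrange(nb) for _ in range(nb)]
        reps.append(icer([costs_a[i] for i in ia],
                         [effects_a[i] for i in ia],
                         [costs_b[i] for i in ib],
                         [effects_b[i] for i in ib]))
    reps.sort()
    return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]
```

Note that percentile intervals for ratio statistics can misbehave when the effect difference is near zero, one of the difficulties the paper discusses.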


International Journal of Technology Assessment in Health Care | 1991

Economic Analysis Alongside Clinical Trials: Revisiting the Methodological Issues

Michael Drummond; Linda Davies

Controlled clinical trials are recognized as the best source of data on the efficacy of health care interventions and technologies. Because economic evaluation is dependent on the quality of the underlying medical evidence, clinical trials have increasingly been viewed as a natural vehicle for economic analysis. However, the closer integration of economic and clinical research raises many methodological issues. This paper discusses these issues in trial design, collection of resource use data, collection of outcome data, and interpretation and extrapolation of results. Some guidelines are suggested for economic analysts wishing to undertake evaluations alongside clinical trials.


Value in Health | 2009

Transferability of economic evaluations across jurisdictions: ISPOR good research practices task force report

Michael Drummond; Marco Barbieri; John R. Cook; Henry A. Glick; Joanna Lis; Farzana Malik; Shelby D. Reed; Frans Rutten; Mark Sculpher; Johan L. Severens

A growing number of jurisdictions now request economic data in support of their decision-making procedures for the pricing and/or reimbursement of health technologies. Because more jurisdictions request economic data, the burden on study sponsors and researchers increases. There are many reasons why the cost-effectiveness of health technologies might vary from place to place. Therefore, this report of an ISPOR Good Practices Task Force reviews what national guidelines for economic evaluation say about transferability, discusses which elements of data could potentially vary from place to place, and recommends good research practices for dealing with aspects of transferability, including strategies based on the analysis of individual patient data and based on decision-analytic modeling.


Annals of Medicine | 2001

Introducing economic and quality of life measurements into clinical studies

Michael Drummond

Although the collection of cost and quality of life data alongside clinical studies generates detailed patient level data in a timely fashion, it also raises practical and methodological challenges. These include the fact that the settings and patients enrolled in trials may not be typical of those found in regular clinical practice, that costs and quality of life may be influenced by the trial protocol, that the clinical alternatives compared in trials may not be the most relevant for cost-effectiveness assessments, that the length of follow-up may be too short to observe changes in cost and quality of life, and that adding these data will increase the overall measurement burden in the trial. This paper discusses these challenges and the ways in which they might be overcome, focussing particularly on preference-based measures of quality of life. In particular, recommendations are given for choosing the range of quality of life instruments, sample size calculations for quality of life measurement and the measurement of quality of life in multinational studies.


International Journal of Technology Assessment in Health Care | 1993

Standardizing Methodologies for Economic Evaluation in Health Care: Practice, Problems, and Potential

Michael Drummond; Arno Brandt; Bryan R. Luce; Joan Rovira

There has been an exponential growth in the literature on economic evaluation in health care. As the range and quality of analytical work has improved, economic studies are becoming more influential with health care decision makers. The development of standards for economic evaluation methods would help maintain the scientific quality of studies, facilitate the comparison of economic evaluation results for different health care interventions, and assist in the interpretation of results from setting to setting. However, standardization might unnecessarily stifle methodological developments. This paper reviews the arguments for and against standardization, assesses attempts to date, outlines the main areas of agreement and disagreement on methods for economic evaluation, and makes recommendations for further work.


BMJ | 1993

Some guidelines on the use of cost effectiveness league tables.

James Mason; Michael Drummond; George W. Torrance

Decisions to allocate resources in health care are increasingly influenced by relative cost effectiveness. To warn decision makers of some of the pitfalls currently found in cost effectiveness league tables, and to suggest how meaningful comparisons may be made between health care technologies, a published league table was scrutinised by examining its sources. This showed some of the methodological problems surrounding such tables and how such difficulties could be reduced in future. The source studies in the table featured different years of origin, discount rates, health state evaluations, settings, and types of comparison programmes; all of these differences may raise problems for meaningful comparison. Decision makers need to assess the relative value for money of competing health care interventions. In the absence of systematic comparisons such assessments are likely to take place informally. This will probably have a worse risk-benefit trade-off than the formalized use of league tables.
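One of the comparability problems noted above, differing discount rates, can be made concrete with a small sketch (the programme figures and function name are hypothetical, not drawn from the paper):

```python
def present_value(annual_cost, years, rate):
    """Present value of a constant annual cost stream, discounting
    each year-end payment at the given annual rate."""
    return sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))
```

The same ten-year programme yields a noticeably smaller present-value cost at a 6% rate than at 3%, so cost-per-outcome ratios drawn from source studies that used different discount rates are not directly comparable in a single league table.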


PharmacoEconomics | 2001

Developing Guidance for Budget Impact Analysis

Paul Trueman; Michael Drummond; John Hutton

The role of economic evaluation in the efficient allocation of healthcare resources has been widely debated. Whilst economic evidence is undoubtedly useful to purchasers, it does not address the issue of affordability, which is an increasing concern. Healthcare purchasers are concerned not just with maximising efficiency but also with the more simplistic goal of remaining within their annual budgets. These two objectives are not necessarily consistent. This paper examines the issue of affordability and the relationship between affordability and efficiency, and builds the case for why there is a growing need for budget impact models to complement economic evaluation. Guidance currently available for such models is also examined, and it is concluded that this guidance is currently insufficient. Some of these insufficiencies are addressed and some thoughts on what constitutes best practice in budget impact modelling are suggested. These suggestions include consideration of transparency, clarity of perspective, reliability of data sources, the relationship between intermediate and final end-points, and rates of adoption of new therapies. They also include the impact of intervention by population subgroups or indications, reporting of results, probability of re-deploying resources, the time horizon, exploring uncertainty and sensitivity analysis, and decision-maker access to the model. Due to the nature of budget impact models, the paper does not deliver stringent methodological guidance on modelling. The intention was to provide some suggestions of best practice in addition to some foundations upon which future research can build.
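The distinction between efficiency and affordability can be illustrated with a minimal budget impact sketch. This is not the paper's own model; the eligible-population, uptake, and cost figures are invented assumptions, and real models would add subgroups, indication mix, and resource re-deployment as the paper recommends:

```python
def budget_impact(population, eligible_rate, uptake_by_year,
                  cost_new, cost_current):
    """Yearly budget impact of a new therapy gradually displacing the
    current one, relative to a baseline where everyone stays on the
    current therapy. Returns one incremental spend figure per year."""
    eligible = population * eligible_rate
    baseline = eligible * cost_current
    impact = []
    for uptake in uptake_by_year:
        mixed = eligible * (uptake * cost_new + (1 - uptake) * cost_current)
        impact.append(mixed - baseline)
    return impact
```

A therapy can be highly cost-effective per patient yet still produce an incremental spend the payer cannot absorb in a given year, which is exactly why the paper argues budget impact models should complement, not replace, economic evaluation.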

Collaboration

Michael Drummond's top co-authors:

Corinna Sorenson (London School of Economics and Political Science)
Linda Davies (University of Manchester)
Bengt Jönsson (Stockholm School of Economics)
Mark Sculpher (Brunel University London)