Publication


Featured research published by Ray Pawson.


Journal of Health Services Research & Policy | 2005

Realist review - a new method of systematic review designed for complex policy interventions

Ray Pawson; Trisha Greenhalgh; Gill Harvey; Kieran Walshe

Evidence-based policy is a dominant theme in contemporary public services, but the practical realities and challenges involved in using evidence in policy-making are formidable. Part of the problem is one of complexity. In health services and other public services, we are dealing with complex social interventions which act on complex social systems - things like league tables, performance measures, regulation and inspection, or funding reforms. These are not ‘magic bullets’ which will always hit their target, but programmes whose effects are crucially dependent on context and implementation. Traditional methods of review focus on measuring and reporting on programme effectiveness, often find that the evidence is mixed or conflicting, and provide little or no clue as to why the intervention worked or did not work when applied in different contexts or circumstances, deployed by different stakeholders, or used for different purposes. This paper offers a model of research synthesis which is designed to work with complex social interventions or programmes, and which is based on the emerging ‘realist’ approach to evaluation. It provides an explanatory analysis aimed at discerning what works for whom, in what circumstances, in what respects and how. The first step is to make explicit the programme theory (or theories) - the underlying assumptions about how an intervention is meant to work and what impacts it is expected to have. We then look for empirical evidence to populate this theoretical framework, supporting, contradicting or modifying the programme theories as we go. The results of the review combine theoretical understanding and empirical evidence, and focus on explaining the relationship between the context in which the intervention is applied, the mechanisms by which it works and the outcomes which are produced. The aim is to enable decision-makers to reach a deeper understanding of the intervention and how it can be made to work most effectively. Realist review does not provide simple answers to complex questions. It will not tell policy-makers or managers whether something works or not, but it will provide the policy and practice community with the kind of rich, detailed and highly practical understanding of complex social interventions which is likely to be of much more use to them when planning and implementing programmes at a national, regional or local level.


BMC Medicine | 2013

RAMESES publication standards: realist syntheses

Geoff Wong; Trish Greenhalgh; Gill Westhorp; Jeanette Buckingham; Ray Pawson

Background: There is growing interest in realist synthesis as an alternative systematic review method. This approach offers the potential to expand the knowledge base in policy-relevant areas - for example, by explaining the success, failure or mixed fortunes of complex interventions. No previous publication standards exist for reporting realist syntheses. This standard was developed as part of the RAMESES (Realist And MEta-narrative Evidence Syntheses: Evolving Standards) project. The project's aim is to produce preliminary publication standards for realist systematic reviews.

Methods: We (a) collated and summarized existing literature on the principles of good practice in realist syntheses; (b) considered the extent to which these principles had been followed by published syntheses, thereby identifying how rigor may be lost and how existing methods could be improved; (c) used a three-round online Delphi method with an interdisciplinary panel of national and international experts in evidence synthesis, realist research, policy and/or publishing to produce and iteratively refine a draft set of methodological steps and publication standards; (d) provided real-time support to ongoing realist syntheses and the open-access RAMESES online discussion list so as to capture problems and questions as they arose; and (e) synthesized expert input, evidence syntheses and real-time problem analysis into a definitive set of standards.

Results: We identified 35 published realist syntheses, provided real-time support to 9 ongoing syntheses and captured questions raised in the RAMESES discussion list. Through analysis and discussion within the project team, we summarized the published literature and common questions and challenges into briefing materials for the Delphi panel, comprising 37 members. Within three rounds this panel had reached consensus on 19 key publication standards, with an overall response rate of 91%.

Conclusion: This project used multiple sources to develop and draw together evidence and expertise in realist synthesis. For each item we have included an explanation of why it is important and guidance on how it might be reported. Realist synthesis is a relatively new method for evidence synthesis and, as experience accrues and methodological developments occur, we anticipate that these standards will evolve. We hope that these standards will act as a resource that will contribute to improving the reporting of realist syntheses. To encourage dissemination of the RAMESES publication standards, this article is co-published in the Journal of Advanced Nursing and is freely accessible on Wiley Online Library (http://www.wileyonlinelibrary.com/journal/jan). Please see the related articles at http://www.biomedcentral.com/1741-7015/11/20 and http://www.biomedcentral.com/1741-7015/11/22


Evaluation | 2003

Nothing as Practical as a Good Theory

Ray Pawson

This contribution is based on the second plenary address given at the 5th biennial meeting of the European Evaluation Society, held on 12 October 2002 in Seville, Spain.


Medical Education | 2012

Realist methods in medical education research: what are they and what can they contribute?

Geoff Wong; Trisha Greenhalgh; Gill Westhorp; Ray Pawson

Medical Education 2012: 46: 89–96


Evaluation | 2012

A realist diagnostic workshop

Ray Pawson; Ana Manzano-Santaella

The realist approach can now be said to be part of the repertoire of evaluation methods. There has been a corresponding shift in methodological focus. Polemical thrust and counter-thrust about the realist contribution, as compared to that of other evaluative approaches such as randomized trials and meta-analysis, have given way to closer examination of its practice ‘on the ground’. This article seeks to make a contribution to the literature on how to conduct realist inquiry through a constructive critique of recently published ‘realist evaluations’.


International Journal of Social Research Methodology | 2006

Digging for Nuggets: How ‘Bad’ Research Can Yield ‘Good’ Evidence

Ray Pawson

A good systematic review is often likened to the pre-flight instrument check - ensuring a plane is airworthy before take-off. By analogy, research synthesis follows a disciplined, formalized, transparent and highly routinized sequence of steps so that its findings can be considered trustworthy before being launched on the policy community. The most characteristic aspect of that schedule is the appraise-then-analyse sequence. The research quality of the primary studies is checked and only those deemed to be of high standard may enter the analysis, the remainder being discarded. This paper rejects this logic, arguing that the ‘study’ is not the appropriate unit of analysis for quality appraisal in research synthesis. There are often nuggets of wisdom in methodologically weak studies, and systematic review disregards them at its peril. Two evaluations of youth mentoring programmes are appraised at length. A catalogue of doubts is raised about their design and analysis. Their conclusions, which incidentally run counter to each other, are highly questionable. Yet there is a great deal to be learned about the efficacy of mentoring if one digs into the specifics of each study. ‘Bad’ research may yield ‘good’ evidence - but only if the reviewer reverses the conventional sequence and analyses the evidence before appraising it.


Archives Europeennes De Sociologie | 2000

Middle-range realism

Ray Pawson

This paper proposes a liaison - ‘middle-range realism’ - between two long-standing explanatory strategies in sociology: ‘middle-range theory’ and ‘realist social theory’. Each offers what the other lacks. Middle-range theory carries an acute sense of the function of theory within empirical inquiry but has left undeveloped any notion of its appropriate explanatory form. Realist social theory has propositional precision but has been unable, for the most part, to descend from a critical domain to the empirical plane. Middle-range realism thus offers a research strategy of the appropriate form and scope to lead and to federate empirical inquiry. Examples are provided of how middle-range realism can be applied to improve research using two different strategies (survey methods and evaluation research) in two contrasting substantive areas (voting behaviour and offender rehabilitation).


American Journal of Evaluation | 2011

Known knowns, known unknowns, unknown unknowns: The predicament of evidence-based policy

Ray Pawson; Geoff Wong; Lesley Owen

The authors present a case study examining the potential for policies to be “evidence-based.” To what extent is it possible to say that a decision to implement a complex social intervention is warranted on the basis of available empirical data? The case chosen is whether there is sufficient evidence to justify banning smoking in cars carrying children. The numerous assumptions underpinning such legislation are elicited, the weight and validity of evidence for each is appraised, and a mixed picture emerges. Certain propositions seem well supported; others are not yet proven and possibly unknowable. The authors argue that this is the standard predicament of evidence-based policy. Evidence does not come in finite chunks offering certainty and security to policy decisions. Rather, evidence-based policy is an accumulative process in which the data pursue but never quite capture unfolding policy problems. The whole point is the steady conversion of “unknowns” to “knowns.”


Evaluation | 1998

Caring communities, paradigm polemics, design debates

Ray Pawson; Nick Tilley

In a previous issue of this journal (Vol. 3, no. 2), David Farrington introduced - ‘in the interests of stimulating discussions’ - an outline research design for Evaluating a Community Crime Prevention Program. This article takes up the challenge, offers a critique of his ‘quasi-experimental’ approach and insists that such programs are better evaluated using designs based on ‘scientific realist’ principles. The paper also draws attention to the benefits of a ‘theories of change’ approach to evaluation, pioneered in the US in conjunction with such comprehensive community initiatives.


Journal of Social Policy | 2005

The Perilous Road from Evidence to Policy: Five Journeys Compared

Annette Boaz; Ray Pawson

Comprehensive reviews of the available research are generally considered to be the cornerstone of contemporary efforts to establish evidence-based policy. This article examines the potential of this stratagem using the case study of ‘mentoring’ programmes. Mentoring initiatives (and allied schemes such as ‘coaching’, ‘counselling’, ‘peer education’ and so on) are to be found in every corner of public policy. Researchers have been no less energetic, producing a huge body of evidence on the process and outcomes of such interventions. Reviewers accordingly have plenty to get their teeth into, and by now there are numerous reports offering review-based advice on the benefits of mentoring. The article asks whether the sum total of these efforts, as represented by five contemporary reviews, is a useful tool for guiding policy and practice. Our analysis is a cause for some pessimism. We note a propensity for delivering unequivocal policy verdicts on the basis of ambiguous evidence. Even more disconcertingly, the five reviews head off on different judgmental tangents, one set of recommendations appearing to gainsay the next. The article refrains from recommending the ejection of the evidence baby with the policy bathwater, but suggests that much closer attention needs to be paid to the explanatory scope of systematic reviews.

Collaboration


An overview of Ray Pawson's collaborations.

Top Co-Authors

Geoff Wong
Queen Mary University of London

Nick Tilley
University College London