
Publication


Featured research published by Gill Harvey.


Journal of Health Services Research & Policy | 2005

Realist review: a new method of systematic review designed for complex policy interventions

Ray Pawson; Trisha Greenhalgh; Gill Harvey; Kieran Walshe

Evidence-based policy is a dominant theme in contemporary public services, but the practical realities and challenges involved in using evidence in policy-making are formidable. Part of the problem is one of complexity. In health services and other public services, we are dealing with complex social interventions which act on complex social systems: things like league tables, performance measures, regulation and inspection, or funding reforms. These are not 'magic bullets' which will always hit their target, but programmes whose effects are crucially dependent on context and implementation. Traditional methods of review focus on measuring and reporting programme effectiveness, often find that the evidence is mixed or conflicting, and provide little or no clue as to why the intervention worked or did not work when applied in different contexts or circumstances, deployed by different stakeholders, or used for different purposes.

This paper offers a model of research synthesis which is designed to work with complex social interventions or programmes, and which is based on the emerging 'realist' approach to evaluation. It provides an explanatory analysis aimed at discerning what works for whom, in what circumstances, in what respects and how. The first step is to make explicit the programme theory (or theories): the underlying assumptions about how an intervention is meant to work and what impacts it is expected to have. The review then looks for empirical evidence to populate this theoretical framework, supporting, contradicting or modifying the programme theories as it proceeds. The results of the review combine theoretical understanding and empirical evidence, and focus on explaining the relationship between the context in which the intervention is applied, the mechanisms by which it works and the outcomes which are produced. The aim is to enable decision-makers to reach a deeper understanding of the intervention and how it can be made to work most effectively.

Realist review does not provide simple answers to complex questions. It will not tell policy-makers or managers whether something works or not, but it will provide the policy and practice community with the kind of rich, detailed and highly practical understanding of complex social interventions which is likely to be of much more use to them when planning and implementing programmes at a national, regional or local level.


BMJ Quality & Safety | 1998

Enabling the implementation of evidence based practice: a conceptual framework.

Alison Kitson; Gill Harvey; Brendan McCormack

The argument put forward in this paper is that successful implementation of research into practice is a function of the interplay of three core elements: the level and nature of the evidence, the context or environment into which the research is to be placed, and the method or way in which the process is facilitated. It also proposes that, because current research is inconclusive as to which of these elements is most important in successful implementation, all three should have equal standing. This is contrary to the often implicit assumptions currently being generated within the clinical effectiveness agenda, where the level and rigour of the evidence seem to be the most important considerations. The paper offers a conceptual framework that addresses this imbalance, showing how it might work in clarifying some of the theoretical positions and as a checklist for staff to assess what they need to do to implement research into practice successfully.


Implementation Science | 2008

Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges

Alison Kitson; Jo Rycroft-Malone; Gill Harvey; Brendan McCormack; Kate Seers; Angie Titchen

Background: The PARiHS framework (Promoting Action on Research Implementation in Health Services) has proved to be a useful practical and conceptual heuristic for many researchers and practitioners in framing their research or knowledge translation endeavours. However, as a conceptual framework it remains untested, and its contribution to the overall development and testing of theory in the field of implementation science is therefore largely unquantified.

Discussion: This being the case, the paper first provides an integrated summary of our conceptual and theoretical thinking so far, and introduces a typology (derived from social policy analysis) used to distinguish between the terms conceptual framework, theory and model, important definitional and conceptual issues in trying to refine theoretical and methodological approaches to knowledge translation. Second, the paper describes the next phase of our work, concentrating in particular on the conceptual thinking and mapping that has led to the hypothesis that the PARiHS framework is best utilised as a two-stage process: first as a preliminary (diagnostic and evaluative) measure of the elements and sub-elements of evidence (E) and context (C), then using the aggregated data from these measures to determine the most appropriate facilitation method. The exact nature of the intervention is thus determined by the specific actors in the specific context at a specific time and place. In refining this next phase of our work, we have had to consider the wider issues around the use of theories to inform and shape our research activity; the ongoing challenges of developing robust and sensitive measures; facilitation as an intervention for getting research into practice; and, finally, how the current debates around evidence into practice are adopting wider notions that fit innovations more generally.

Summary: The paper concludes by suggesting that the future direction of the work on the PARiHS framework is to develop a two-stage diagnostic and evaluative approach, where the intervention is shaped and moulded by the information gathered about the specific situation and from participating stakeholders. To expedite the generation of new evidence and the testing of emerging theories, we suggest the formation of an international research implementation science collaborative that can systematically collect and analyse experiences of using and testing the PARiHS framework and similar conceptual and theoretical approaches. We also recommend further refinement of the definitions around conceptual framework, theory, and model, suggesting a wider discussion that embraces multiple epistemological and ontological perspectives.


Quality & Safety in Health Care | 2002

Ingredients for change: revisiting a conceptual framework

Joanne Rycroft-Malone; Alison Kitson; Gill Harvey; Brendan McCormack; Kate Seers; Angie Titchen; Carole A. Estabrooks

Finding ways to deliver care based on the best possible evidence remains an ongoing challenge. Further theoretical developments are presented of a conceptual framework describing the factors that influence the uptake of evidence into practice. A concept analysis has been conducted on the key elements of the framework (evidence, context, and facilitation), leading to its refinement. While these three essential elements remain key to the process of implementation, changes have been made to their constituent sub-elements, enabling the detail of the framework to be revised. The concept analysis has shown that the relationships between the elements and sub-elements, and their relative importance, need to be better understood when implementing evidence based practice. Increased understanding of these relationships would help staff to plan more effective change strategies. Anecdotal reports suggest that the framework has a good level of validity. It is planned to develop it into a practical tool to aid those involved in planning, implementing, and evaluating the impact of changes in health care.


Implementation Science | 2012

FIRE (Facilitating Implementation of Research Evidence): a study protocol.

Kate Seers; Karen Cox; Nicola Crichton; Rhiannon Tudor Edwards; Ann Catrine Eldh; Carole A. Estabrooks; Gill Harvey; Claire Hawkes; Alison Kitson; Pat Linck; Geraldine McCarthy; Brendan McCormack; Carole Mockford; Jo Rycroft-Malone; Angie Titchen; Lars Wallin

Background: Research evidence underpins best practice, but is not always used in healthcare. The Promoting Action on Research Implementation in Health Services (PARIHS) framework suggests that the nature of evidence, the context in which it is used, and whether those trying to use evidence are helped (or facilitated) affect the use of evidence. Urinary incontinence has a major effect on the quality of life of older people, has a high prevalence, and is a key priority within European health and social care policy. Improving continence care has the potential to improve the quality of life for older people and reduce the costs associated with providing incontinence aids.

Objectives: This study aims to advance understanding about the contribution facilitation can make to implementing research findings into practice via: extending current knowledge of facilitation as a process for translating research evidence into practice; evaluating the feasibility, effectiveness, and cost-effectiveness of two different models of facilitation in promoting the uptake of research evidence on continence management; assessing the impact of contextual factors on the processes and outcomes of implementation; and implementing a pro-active knowledge transfer and dissemination strategy to diffuse study findings to a wide policy and practice community.

Setting and sample: Four European countries, each with six long-term nursing care sites (24 sites in total) for people aged 60 years and over with documented urinary incontinence.

Methods and design: Pragmatic randomised controlled trial with three arms (standard dissemination and two different programmes of facilitation), with embedded process and economic evaluation. The primary outcome is compliance with the continence recommendations. Secondary outcomes include the proportion of residents with incontinence, incidence of incontinence-related dermatitis, urinary tract infections, and quality of life. Outcomes are assessed at baseline, then at 6, 12, 18, and 24 months after the start of the facilitation interventions. Detailed contextual and process data are collected throughout, using interviews with staff, residents and next of kin, observations, assessment of context using the Alberta Context Tool, and documentary evidence. A realistic evaluation framework is used to develop explanatory theory about what works for whom in what circumstances.

Trial registration: Current Controlled Trials ISRCTN11598502.


Implementation Science | 2011

The NIHR collaboration for leadership in applied health research and care (CLAHRC) for Greater Manchester: combining empirical, theoretical and experiential evidence to design and evaluate a large-scale implementation strategy

Gill Harvey; Louise Fitzgerald; Sandra L. Fielden; Anne McBride; Heather Waterman; David Bamford; Roman Kislov; Ruth Boaden

Background: In response to policy recommendations, nine National Institute for Health Research (NIHR) Collaborations for Leadership in Applied Health Research and Care (CLAHRCs) were established in England in 2008, aiming to create closer working between the health service and higher education and to narrow the gap between research and its implementation in practice. The Greater Manchester (GM) CLAHRC is a partnership between the University of Manchester and twenty National Health Service (NHS) trusts, with a five-year mission to improve healthcare and reduce health inequalities for people with cardiovascular conditions. This paper outlines the GM CLAHRC approach to designing and evaluating a large-scale, evidence- and theory-informed, context-sensitive implementation programme.

Discussion: The paper makes a case for embedding evaluation within the design of the implementation strategy. Empirical, theoretical, and experiential evidence relating to implementation science and methods has been synthesised to formulate eight core principles of the GM CLAHRC implementation strategy, recognising the multi-faceted nature of evidence, the complexity of the implementation process, and the corresponding need to apply approaches that are situationally relevant, responsive, flexible, and collaborative. In turn, these core principles inform the selection of four interrelated building blocks upon which the GM CLAHRC approach to implementation is founded. These determine the organizational processes, structures, and roles utilised by specific GM CLAHRC implementation projects, as well as the approach to researching implementation, and comprise: the Promoting Action on Research Implementation in Health Services (PARIHS) framework; a modified version of the Model for Improvement; multiprofessional teams with designated roles to lead, facilitate, and support the implementation process; and embedded evaluation and learning.

Summary: Designing and evaluating a large-scale implementation strategy that can cope with and respond to the local complexities of implementing research evidence into practice is itself complex and challenging. We present an argument for adopting an integrative, co-production approach to planning and evaluating the implementation of research into practice, drawing on an eclectic range of evidence sources.


Implementation Science | 2015

PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice

Gill Harvey; Alison Kitson

Background: The Promoting Action on Research Implementation in Health Services, or PARIHS, framework was first published in 1998. Since then, work has been ongoing to further develop, refine and test it. Widely used as an organising or conceptual framework to help both explain and predict why the implementation of evidence into practice is or is not successful, PARIHS was one of the first frameworks to make explicit the multi-dimensional and complex nature of implementation, as well as highlighting the central importance of context. Several critiques of the framework have also pointed out its limitations and suggested areas for improvement.

Discussion: Building on the published critiques and a number of empirical studies, this paper introduces a revised version of the framework, called the integrated or i-PARIHS framework. The theoretical antecedents of the framework are described, as are the revised and new elements, notably the revision of how evidence is described, how individuals and teams are incorporated, and how context is further delineated. We describe how the framework can be operationalised and draw on case study data to demonstrate the preliminary testing of the face and content validity of the revised framework.

Summary: This paper is presented for deliberation and discussion within the implementation science community. Responding to a series of critiques and helpful feedback on the utility of the original PARIHS framework, we seek feedback on the proposed improvements. We believe that the i-PARIHS framework creates a more integrated approach to understanding the theoretical complexity from which implementation science draws its propositions and working hypotheses; that the new framework is more coherent and comprehensive while maintaining its intuitive appeal; and that the models of facilitation described enable its more effective operationalisation.


Quality & Safety in Health Care | 2003

Methods for evaluation of small scale quality improvement projects

Gill Harvey; M.J.P. Wensing

Evaluation is an integral component of quality improvement and there is much to be learned from the evaluation of small scale quality improvement initiatives at a local level. This type of evaluation is useful for a number of different reasons including monitoring the impact of local projects, identifying and dealing with issues as they arise within a project, comparing local projects to draw lessons, and collecting more detailed information as part of a bigger evaluation project. Focused audits and developmental studies can be used for evaluation within projects, while methods such as multiple case studies and process evaluations can be used to draw generalised lessons from local experiences and to provide examples of successful projects. Evaluations of small scale quality improvement projects help those involved in improvement initiatives to optimise their choice of interventions and use of resources. Important information to add to the knowledge base of quality improvement in health care can be derived by undertaking formal evaluation of local projects, particularly in relation to building theory around the processes of implementation and increasing understanding of the complex change processes involved.


Public Money & Management | 2004

Organizational Failure and Turnaround: Lessons for Public Services from the For-Profit Sector

Kieran Walshe; Gill Harvey; Paula Hyde; Naresh R. Pandit

As the performance of public services is increasingly scrutinized, it is now commonplace for some schools, hospitals, local authorities and other public organizations to be deemed ‘failing’ and for attempts to be made at creating a turnaround in their performance. This article explores the literature on failure and turnaround in for-profit organizations, presents a number of models or frameworks for describing and categorizing failure and turnaround, and examines the relevance and transferability of theoretical and empirical studies in the for-profit sector to the emerging field of failure and turnaround in public services.


Journal of Interprofessional Care | 2000

The use of care pathways as tools to support the implementation of evidence-based practice

V. L. Currie; Gill Harvey

This article presents some of the data from a study exploring the experiences and views of a range of professional staff using care pathways in their everyday practice. It focuses on the views of doctors, nurses, and therapists. Within the context of delivering integrated care, several themes are explored in relation to the successful implementation of evidence-based care pathways.

Collaboration


Dive into Gill Harvey's collaborations.

Top Co-Authors

Kieran Walshe

University of Manchester

Sue Llewellyn

University of Southampton
