
Publications


Featured research published by Joanne Greenhalgh.


Quality of Life Research | 2012

Implementing patient-reported outcomes assessment in clinical practice: a review of the options and considerations

Claire F. Snyder; Neil K. Aaronson; Ali K. Choucair; Thomas Elliott; Joanne Greenhalgh; Michele Y. Halyard; Rachel Hess; Deborah Miller; Bryce B. Reeve; Maria Santana

Purpose: While clinical care is frequently directed at making patients “feel better,” patients’ reports on their functioning and well-being (patient-reported outcomes [PROs]) are rarely collected in routine clinical practice. The International Society for Quality of Life Research (ISOQOL) has developed a User’s Guide for Implementing Patient-Reported Outcomes Assessment in Clinical Practice. This paper summarizes the key issues from the User’s Guide.

Methods: Using the literature, an ISOQOL team outlined considerations for using PROs in clinical practice; options for designing the intervention; and strengths, weaknesses, and resource requirements associated with each option.

Results: Implementing routine PRO assessment involves a number of methodological and practical decisions, including (1) identifying the goals for collecting PROs in clinical practice, (2) selecting the patients, setting, and timing of assessments, (3) determining which questionnaire(s) to use, (4) choosing a mode for administering and scoring the questionnaire, (5) designing processes for reporting results, (6) identifying aids to facilitate score interpretation, (7) developing strategies for responding to issues identified by the questionnaires, and (8) evaluating the impact of the PRO intervention on the practice.

Conclusions: Integrating PROs in clinical practice has the potential to enhance patient-centered care. The online version of the User’s Guide will be updated periodically.


Quality of Life Research | 2009

The applications of PROs in clinical practice: what are they, do they work, and why?

Joanne Greenhalgh

Background: Precisely defining the different applications of patient-reported outcome measures (PROs) in clinical practice can be difficult. This is because the intervention is complex and varies amongst different studies in terms of the type of PRO used, how the PRO is fed back, and to whom it is fed back.

Methods: A theory-driven approach is used to describe six different applications of PROs in clinical practice. The evidence for the impact of these applications on the process and outcomes of care is summarised. Possible explanations for the limited impact of PROs on patient management are then discussed and directions for future research are highlighted.

Results: The applications of PROs in clinical practice include screening tools, monitoring tools, a method of promoting patient-centred care, a decision aid, a method of facilitating communication amongst multidisciplinary teams (MDTs), and a means of monitoring the quality of patient care. Evidence from randomised controlled trials suggests that the use of PROs in clinical practice is valuable in improving the discussion and detection of HRQoL problems but has less of an impact on how clinicians manage patient problems or on subsequent patient outcomes. Many of the reasons for this may lie in the ways in which PROs fit (or do not fit) into the routine ways in which patients and clinicians communicate with each other, how clinicians make decisions, and how healthcare as a whole is organised.

Conclusions: Future research needs to identify ways in which PROs can be better incorporated into the routine care of patients by combining qualitative and quantitative methods and adopting appropriate trial designs.


BMJ Quality & Safety | 2014

The experiences of professionals with using information from patient-reported outcome measures to improve the quality of healthcare: a systematic review of qualitative research

Maria B Boyce; John Browne; Joanne Greenhalgh

Objectives: To synthesise qualitative studies that investigated the experiences of healthcare professionals with using information from patient-reported outcome measures (PROMs) to improve the quality of care.

Design: A qualitative systematic review was conducted by searching PubMed, PsycINFO and CINAHL with no time restrictions. Hand searching was also performed. Eligible studies were evaluated using the Critical Appraisal Skills Programme toolkit for qualitative studies. A thematic synthesis identified common themes across studies. Study characteristics were examined to explain differences in findings.

Setting: All healthcare settings.

Participants: Healthcare professionals.

Outcomes: Professionals’ views of PROMs after receiving PROMs feedback about individual patients or groups of patients.

Results: Sixteen studies met the inclusion criteria. Barriers and facilitators to the use of PROMs emerged within four main themes: collecting and incorporating the data (practical), valuing the data (attitudinal), making sense of the data (methodological) and using the data to make changes to patient care (impact).

Conclusions: Professionals value PROMs when they are useful for the clinical decision-making process. Practical barriers to the routine use of PROMs are prominent when the correct infrastructure is not in place before commencing data collection and when their use is disruptive to normal work routines. Technology can play a greater role in processing the information in the most efficient manner. Improvements to the interpretability of PROMs should increase their use. Attitudes to the use of PROMs may be improved by engaging professionals in the planning stage of the intervention and by ensuring a high level of transparency around the rationale for data collection.


Implementation Science | 2015

What's in a mechanism? Development of a key concept in realist evaluation

Sonia Dalkin; Joanne Greenhalgh; Diana Jones; Bill Cunningham; Monique Lhussier

Background: The idea that underlying, generative mechanisms give rise to causal regularities has become a guiding principle across many social and natural science disciplines. A specific form of this enquiry, realist evaluation, is gaining momentum in the evaluation of complex social interventions. It focuses on ‘what works, how, in which conditions and for whom’ using context, mechanism and outcome configurations, as opposed to asking whether an intervention ‘works’. Realist evaluation can be difficult to codify and requires considerable researcher reflection and creativity. As such, there is often confusion when operationalising the method in practice. This article aims to clarify and further develop the concept of mechanism in realist evaluation and, in doing so, aid the learning of those operationalising the methodology.

Discussion: Using a social science illustration, we argue that disaggregating the concept of mechanism into its constituent parts helps to understand the difference between the resources offered by the intervention and the ways in which these change the reasoning of participants. This in turn helps to distinguish between a context and a mechanism. The notion of mechanisms ‘firing’ in social science research is explored, with discussion of how this may stifle researchers’ realist thinking. We underline the importance of conceptualising mechanisms as operating on a continuum, rather than as an ‘on/off’ switch.

Summary: The discussions in this article will hopefully progress and operationalise realist methods. This development is likely given the infancy of the methodology and its recently increased profile and use in social science research. The arguments we present have been tested and are explained throughout the article using a social science illustration, evidencing their usability and value.
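The distinction the abstract draws, between the resources an intervention introduces and the change in participants' reasoning those resources trigger, with mechanism activation on a continuum rather than an on/off switch, can be sketched as a simple data structure. The sketch below is purely illustrative; the class name, fields and the PROMs example are hypothetical and not taken from the paper:

```python
from dataclasses import dataclass


@dataclass
class CMOConfiguration:
    """Illustrative context-mechanism-outcome (CMO) configuration.

    Disaggregates 'mechanism' into the resources an intervention offers
    and the resulting change in participants' reasoning.
    """
    context: str             # pre-existing conditions the intervention enters
    resource: str            # what the intervention introduces
    reasoning_change: float  # degree of change in participant reasoning, 0.0-1.0
    outcome: str

    def mechanism_activation(self) -> float:
        """Mechanisms operate on a continuum, not an on/off switch."""
        return max(0.0, min(1.0, self.reasoning_change))


# Hypothetical example: feedback of PROMs data in a clinic
cmo = CMOConfiguration(
    context="clinic with time-pressured consultations",
    resource="PROMs feedback report given to the clinician",
    reasoning_change=0.4,  # partial shift in how the clinician weighs concerns
    outcome="some HRQoL problems discussed, few management changes",
)
print(cmo.mechanism_activation())  # 0.4
```

Treating activation as a bounded value rather than a boolean is one way to express the paper's point that a mechanism can fire partially under some contexts.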


Sociological Research Online | 2007

Accessing Socially Excluded People — Trust and the Gatekeeper in the Researcher-Participant Relationship

Nick Emmel; Kahryn Hughes; Joanne Greenhalgh; Adam Sales

This paper describes methodological findings from research recruiting and studying hard-to-reach, socially excluded people. We review the ways in which researchers have used particular strategies to access hard-to-reach individuals and groups, and note that little attention has been given to understanding the implications of the nature of the trust relationship between researcher and participant. Gatekeepers invariably play a role in accessing socially excluded people in research, yet discussion to date typically focuses on the instrumental role gatekeepers play in facilitating researchers’ access. In this paper we explore the possibilities for analysing relationships of trust and distrust between gatekeeper and socially excluded participant. Our analysis considers the different kinds of relationships that exist between gatekeepers and socially excluded people and, in particular, the relationships of power between them. Insights into the nature of trust among socially excluded people are also considered. Finally, we discuss how the size and use of social networks among socially excluded groups, and perceptions of risk in interactions with gatekeepers, are important to understanding the possibilities for trustful relationships, and for meaningful and successful access for researchers to socially excluded individuals and groups.


Quality of Life Research | 2013

How do doctors refer to patient-reported outcome measures (PROMs) in oncology consultations?

Joanne Greenhalgh; Purva Abhyankar; Serena McCluskey; Elena Takeuchi; Galina Velikova

Purpose: We conducted a secondary qualitative analysis of consultations between oncologists and their patients to explore how patient-reported outcome measures (PROMs) data were referred to in the process of (1) eliciting and exploring patients’ concerns; (2) making decisions about supportive treatment; and (3) making decisions about chemotherapy and other systemic treatments.

Methods: We purposively sampled audio recordings of 18 consultations from the intervention arm and 4 from the attention control arm of a previous UK randomised controlled trial of the feedback of PROMs data to doctors (Velikova et al. in J Clin Oncol 22(4):714–724 [1]). We used a combination of content and conversation analysis to examine how opportunities for discussion of health-related quality of life issues are opened up or closed down within the consultation, and to explore why this may or may not lead to changes in patient management.

Findings: Explicit reference to the PROMs data provided an opportunity for the patient to clarify and further elaborate on the side effects of chemotherapy. High scores on the PROMs data were not explored further if the patient indicated they were not a problem or were not related to the cancer or chemotherapy. Symptomatic treatment was more often offered for problems like nausea, constipation, pain and depression, but much less so for fatigue. Doctors discussed fatigue by providing a cause for it (e.g. the chemotherapy), presenting it as ‘something to be expected’, minimising its impact or moving on to another topic. Chemotherapy regimens were not changed on the basis of the PROMs data alone, but PROMs data were sometimes used to legitimise changes.

Conclusions: Explicit mention of PROMs data in the consultation may strengthen opportunities for patients to elaborate on their problems, but doctors may not always know how to do this. Our findings have informed the development of a training package to enable doctors to optimise their use of PROMs data within the consultation.


BMJ Quality & Safety | 1998

Searching for information on outcomes: do you need to be comprehensive?

Alison Brettle; Andrew F. Long; Maria J. Grant; Joanne Greenhalgh

The concepts of evidence-based practice and clinical effectiveness rely on up-to-date, accurate, high-quality, and relevant information. Although this information can be obtained from a range of sources, computerised databases such as MEDLINE offer a fast, effective means of bringing up-to-date information to clinicians, as well as to health service and information professionals. Common problems when searching databases include missing important relevant papers or retrieving too much information. Effective search strategies are therefore necessary to retrieve a manageable amount of relevant information. This paper presents a range of strategies which can be used to locate information on MEDLINE efficiently and effectively.
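One widely taught strategy of the kind the paper describes is the "building block" approach: synonyms for each concept are combined with OR to widen recall, then the concept blocks are combined with AND to sharpen precision. A minimal sketch, where the function name and the example search terms are illustrative rather than a published strategy:

```python
def build_block_search(concept_blocks):
    """Combine synonym blocks into a single Boolean query string.

    Terms within a block are OR'd (widens recall); the blocks
    themselves are AND'd together (sharpens precision).
    """
    blocks = ["(" + " OR ".join(terms) + ")" for terms in concept_blocks]
    return " AND ".join(blocks)


# Hypothetical strategy for locating papers on outcome measurement in practice
query = build_block_search([
    ['"patient-reported outcome"', "PROM*", '"quality of life"'],
    ["clinical practice", "routine care"],
])
print(query)
# ("patient-reported outcome" OR PROM* OR "quality of life") AND (clinical practice OR routine care)
```

The trade-off the abstract names is visible here: adding synonyms to a block retrieves more (at the cost of precision), while adding another AND'd block retrieves less (at the cost of possibly missing relevant papers).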


BMJ Open | 2015

Protocol—the RAMESES II study: developing guidance and reporting standards for realist evaluation

Trish Greenhalgh; Geoff Wong; Justin Jagosh; Joanne Greenhalgh; Ana Manzano; Gillian Westhorp; Ray Pawson

Introduction: Realist evaluation is an increasingly popular methodology in health services research. For realist evaluations (REs), this project aims to: develop quality and reporting standards and training materials; build capacity for undertaking and critically evaluating them; and produce resources and training materials for lay participants and those seeking to involve them.

Methods: To achieve our aims, we will: (1) establish management and governance infrastructure; (2) recruit an interdisciplinary Delphi panel of 35 participants with diverse relevant experience of RE; (3) summarise current literature and expert opinion on best practice in RE; (4) run an online Delphi panel to generate and refine items for quality and reporting standards; (5) capture ‘real world’ experiences and challenges of RE, for example by providing ongoing support to realist evaluations, hosting the RAMESES JISCmail list on realist research, and feeding problems and insights from these into the deliberations of the Delphi panel; (6) produce quality and reporting standards; (7) collate examples of the learning and training needs of researchers, students, reviewers and lay members in relation to RE; (8) develop, deliver and evaluate training materials for RE and deliver training workshops; (9) develop and evaluate information and resources for patients and other lay participants in RE (e.g. draft template information sheets and model consent forms); and (10) disseminate training materials and other resources.

Planned outputs: (1) quality and reporting standards and training materials for RE; (2) methodological support for RE; (3) increased capacity to support and evaluate RE; (4) accessible, plain-English resources for patients and the public participating in RE.

Discussion: Realist evaluation is a relatively new approach to evaluation and its overall place is not yet fully established. As with all primary research approaches, guidance on quality assurance and uniform reporting is an important step towards improving quality and consistency.


BMJ Open | 2014

Functionality and feedback: a protocol for a realist synthesis of the collation, interpretation and utilisation of PROMs data to improve patient care

Joanne Greenhalgh; Ray Pawson; Judy Wright; Nick Black; Jose M. Valderas; David M Meads; Elizabeth Gibbons; Laurence Wood; Charlotte Wood; Chris Mills; Sonia Dalkin

Introduction: The feedback and public reporting of PROMs data aims to improve the quality of care provided to patients. Existing systematic reviews have found it difficult to draw overall conclusions about the effectiveness of PROMs feedback. We aim to execute a realist synthesis of the evidence to understand by what means and in what circumstances the feedback of PROMs data leads to the intended service improvements.

Methods and analysis: Realist synthesis involves (stage 1) identifying the ideas, assumptions or ‘programme theories’ which explain how PROMs feedback is supposed to work and in what circumstances, and then (stage 2) reviewing the evidence to determine the extent to which these expectations are met in practice. For stage 1, six provisional ‘functions’ of PROMs feedback have been identified to structure our review (screening, monitoring, patient involvement, demand management, quality improvement and patient choice). For each function, we will identify the different programme theories that underlie these different goals and develop a logical map of the respective implementation processes. In stage 2, we will identify studies that provide empirical tests of each component of the programme theories to evaluate the circumstances in which the potential obstacles can be overcome, and whether and how the unintended consequences of PROMs feedback arise. We will synthesise this evidence to (1) identify the implementation processes which support or constrain the successful collation, interpretation and utilisation of PROMs data; and (2) identify the implementation processes through which the unintended consequences of PROMs data arise and those where they can be avoided.

Ethics and dissemination: The study will not require NHS ethics approval. We have secured ethical approval for the study from the University of Leeds (LTSSP-019). We will disseminate the findings of the review through a briefing paper and dissemination event for National Health Service stakeholders, conferences and peer-reviewed publications.


Journal of Comparative Effectiveness Research | 2016

Framework and guidance for implementing patient-reported outcomes in clinical practice: evidence, challenges and opportunities.

Ian Porter; Daniela Gonçalves-Bradley; Ignacio Ricci-Cabello; Chris Gibbons; Jaheeda Gangannagaripalli; Ray Fitzpatrick; Nick Black; Joanne Greenhalgh; Jose M. Valderas

Patient-reported outcomes (PROs) are reports of the status of a patient’s health condition that come directly from the patient. While PRO measures are a well-developed technology with robust standards in research, their use for informing healthcare decisions is still poorly understood. We review relevant examples of their application in the provision of healthcare and examine the challenges associated with implementing PROs in clinical settings. We evaluate evidence for their use, examine barriers to their uptake, and present an evidence-based framework for the successful implementation of PROs in clinical practice. We discuss current and future developments for the use of PROs in clinical practice, such as individualized measurement and computer-adaptive testing.
