Publication


Featured research published by Heather Munthe-Kaas.


PLOS Medicine | 2015

Using qualitative evidence in decision making for health and social interventions: an approach to assess confidence in findings from qualitative evidence syntheses (GRADE-CERQual).

Simon Lewin; Claire Glenton; Heather Munthe-Kaas; Benedicte Carlsen; Christopher J. Colvin; Metin Gülmezoglu; Jane Noyes; Andrew Booth; Ruth Garside; Arash Rashidian

Simon Lewin and colleagues present a methodology for increasing transparency and confidence in qualitative research synthesis.


Reproductive Health | 2014

Facilitators and barriers to facility-based delivery in low- and middle-income countries: a qualitative evidence synthesis

Meghan A. Bohren; Erin C. Hunter; Heather Munthe-Kaas; João Paulo Souza; Joshua P. Vogel; A Metin Gülmezoglu

High-quality obstetric delivery in a health facility reduces maternal and perinatal morbidity and mortality. This systematic review synthesizes qualitative evidence related to the facilitators and barriers to delivering at health facilities in low- and middle-income countries. We aim to provide a useful framework for better understanding how various factors influence the decision-making process and the ultimate location of delivery at a facility or elsewhere.

We conducted a qualitative evidence synthesis using a thematic analysis. Searches were conducted in PubMed, CINAHL and gray literature databases. Study quality was evaluated using the CASP checklist. The confidence in the findings was assessed using the CERQual method.

Thirty-four studies from 17 countries were included. Findings were organized under four broad themes: (1) perceptions of pregnancy and childbirth; (2) influence of sociocultural context and care experiences; (3) resource availability and access; (4) perceptions of quality of care. Key barriers to facility-based delivery include traditional and familial influences, distance to the facility, cost of delivery, and low perceived quality of care and fear of discrimination during facility-based delivery. The emphasis placed on increasing facility-based deliveries by public health entities has led women and their families to believe that childbirth has become medicalized and dehumanized. When faced with the prospect of facility birth, women in low- and middle-income countries may fear various undesirable procedures, and may prefer to deliver at home with a traditional birth attendant.

Given the abundant reports of disrespectful and abusive obstetric care highlighted by this synthesis, future research should focus on achieving respectful, non-abusive, and high-quality obstetric care for all women.
Funding for this project was provided by The United States Agency for International Development (USAID) and the UNDP/UNFPA/UNICEF/WHO/World Bank Special Programme of Research, Development and Research Training in Human Reproduction, Department of Reproductive Health and Research, World Health Organization.


Implementation Science | 2018

Applying GRADE-CERQual to qualitative evidence synthesis findings: introduction to the series

Simon Lewin; Andrew Booth; Claire Glenton; Heather Munthe-Kaas; Arash Rashidian; Megan Wainwright; Meghan A. Bohren; Özge Tunçalp; Christopher J. Colvin; Ruth Garside; Benedicte Carlsen; Etienne V. Langlois; Jane Noyes

The GRADE-CERQual (‘Confidence in the Evidence from Reviews of Qualitative research’) approach provides guidance for assessing how much confidence to place in findings from systematic reviews of qualitative research (or qualitative evidence syntheses). The approach has been developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. Confidence in the evidence from qualitative evidence syntheses is an assessment of the extent to which a review finding is a reasonable representation of the phenomenon of interest. CERQual provides a systematic and transparent framework for assessing confidence in individual review findings, based on consideration of four components: (1) methodological limitations, (2) coherence, (3) adequacy of data, and (4) relevance. A fifth component, dissemination (or publication) bias, may also be important and is being explored. As with the GRADE (Grading of Recommendations Assessment, Development, and Evaluation) approach for effectiveness evidence, CERQual suggests summarising evidence in succinct, transparent, and informative Summary of Qualitative Findings tables. These tables are designed to communicate the review findings and the CERQual assessment of confidence in each finding. This article is the first of a seven-part series providing guidance on how to apply the CERQual approach. In this paper, we describe the rationale and conceptual basis for CERQual, the aims of the approach, how the approach was developed, and its main components. We also outline the purpose and structure of this series and discuss the growing role for qualitative evidence in decision-making. Papers 3, 4, 5, 6, and 7 in this series discuss each CERQual component, including the rationale for including the component in the approach, how the component is conceptualised, and how it should be assessed. 
Paper 2 discusses how to make an overall assessment of confidence in a review finding and how to create a Summary of Qualitative Findings table. The series is intended primarily for those undertaking qualitative evidence syntheses or using their findings in decision-making processes but is also relevant to guideline development agencies, primary qualitative researchers, and implementation scientists and practitioners.


Journal of Homosexuality | 2016

Internalized Homonegativity: A Systematic Mapping Review of Empirical Research

Rigmor C. Berg; Heather Munthe-Kaas; Michael W. Ross

Internalized homonegativity (IH) is an important variable affecting the wellbeing of lesbian, gay, and bisexual (LGB) persons. We included 201 studies in a systematic mapping review of IH. Most studies were conducted in North America and examined IH as a predictor of poor health. The primary focus of 14 studies was IH scale measurement, and, in total, these studies detailed nine distinct scales. Eighteen studies compared levels of IH in LGB populations, four described prevention programs, and one investigated IH using qualitative methods. Our review indicates that further research is needed, particularly qualitative research and ways to ameliorate IH.


Implementation Science | 2018

Applying GRADE-CERQual to qualitative evidence synthesis findings—paper 2: how to make an overall CERQual assessment of confidence and create a Summary of Qualitative Findings table

Simon Lewin; Meghan A. Bohren; Arash Rashidian; Heather Munthe-Kaas; Claire Glenton; Christopher J. Colvin; Ruth Garside; Jane Noyes; Andrew Booth; Özge Tunçalp; Megan Wainwright; Signe Flottorp; Joseph D. Tucker; Benedicte Carlsen

Background: The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group to support the use of findings from qualitative evidence syntheses in decision making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on making an overall assessment of confidence in a review finding and creating a CERQual Evidence Profile and a CERQual Summary of Qualitative Findings table.

Methods: We developed this guidance by examining the methods used by other GRADE approaches, gathering feedback from relevant research communities and developing consensus through project group meetings. We then piloted the guidance on several qualitative evidence syntheses before agreeing on the approach.

Results: Confidence in the evidence is an assessment of the extent to which a review finding is a reasonable representation of the phenomenon of interest. Creating a summary of each review finding and deciding whether or not CERQual should be used are important steps prior to assessing confidence. Confidence should be assessed for each review finding individually, based on the judgements made for each of the four CERQual components. Four levels are used to describe the overall assessment of confidence: high, moderate, low or very low. The overall CERQual assessment for each review finding should be explained in a CERQual Evidence Profile and Summary of Qualitative Findings table.

Conclusions: Structuring and summarising review findings, assessing confidence in those findings using CERQual and creating a CERQual Evidence Profile and Summary of Qualitative Findings table should be essential components of undertaking qualitative evidence syntheses. This paper describes the end point of a CERQual assessment and should be read in conjunction with the other papers in the series that provide information on assessing individual CERQual components.


Implementation Science | 2018

Applying GRADE-CERQual to qualitative evidence synthesis findings-paper 6: how to assess relevance of the data

Jane Noyes; Andrew Booth; Simon Lewin; Benedicte Carlsen; Claire Glenton; Christopher J. Colvin; Ruth Garside; Meghan A. Bohren; Arash Rashidian; Megan Wainwright; Özge Tunçalp; Jacqueline Chandler; Signe Flottorp; Tomas Pantoja; Joseph D. Tucker; Heather Munthe-Kaas

Background: The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's relevance component.

Methods: We developed the relevance component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual relevance component within several qualitative evidence syntheses before agreeing on the current definition and principles for application.

Results: When applying CERQual, we define relevance as the extent to which the body of data from the primary studies supporting a review finding is applicable to the context (perspective or population, phenomenon of interest, setting) specified in the review question. In this paper, we describe the relevance component and its rationale and offer guidance on how to assess relevance in the context of a review finding. This guidance outlines the information required to assess relevance, the steps that need to be taken to assess relevance and examples of relevance assessments.

Conclusions: This paper provides guidance for review authors and others on undertaking an assessment of relevance in the context of the CERQual approach. Assessing the relevance component requires consideration of potentially important contextual factors at an early stage in the review process. We expect the CERQual approach, and its individual components, to develop further as our experiences with the practical implementation of the approach increase.


PLOS ONE | 2016

Extent, Awareness and Perception of Dissemination Bias in Qualitative Research: An Explorative Survey

Ingrid Toews; Claire Glenton; Simon Lewin; Rigmor C. Berg; Jane Noyes; Andrew Booth; Ana Marušić; Mario Malički; Heather Munthe-Kaas; Joerg J. Meerpohl

Background: Qualitative research findings are increasingly used to inform decision-making. Research has indicated that not all quantitative research on the effects of interventions is disseminated or published. The extent to which qualitative researchers also systematically underreport or fail to publish certain types of research findings, and the impact this may have, has received little attention.

Methods: A survey was delivered online to gather data regarding non-dissemination and dissemination bias in qualitative research. We invited relevant stakeholders through our professional networks, authors of qualitative research identified through a systematic literature search, and further via snowball sampling.

Results: 1032 people took part in the survey, of whom 859 participants identified as researchers, 133 as editors and 682 as peer reviewers. 68.1% of the researchers said that they had conducted at least one qualitative study that they had not published in a peer-reviewed journal. The main reasons for non-dissemination were that a publication was still intended (35.7%), resource constraints (35.4%), and that the authors gave up after the paper was rejected by one or more journals (32.5%). A majority of the editors and peer reviewers "(strongly) agreed" that the main reasons for rejecting a manuscript of a qualitative study were inadequate study quality (59.5%; 68.5%) and inadequate reporting quality (59.1%; 57.5%). Of 800 respondents, 83.1% "(strongly) agreed" that non-dissemination and possible resulting dissemination bias might undermine the willingness of funders to support qualitative research. 72.6% and 71.2%, respectively, "(strongly) agreed" that non-dissemination might lead to inappropriate health policy and health care.

Conclusions: The proportion of non-dissemination in qualitative research is substantial. Researchers, editors and peer reviewers play an important role in this. Non-dissemination and resulting dissemination bias may impact on health care research, practice and policy. More detailed investigations on patterns and causes of the non-dissemination of qualitative research are needed.


Implementation Science | 2018

Applying GRADE-CERQual to qualitative evidence synthesis findings-paper 4: how to assess coherence

Christopher J. Colvin; Ruth Garside; Megan Wainwright; Heather Munthe-Kaas; Claire Glenton; Meghan A. Bohren; Benedicte Carlsen; Özge Tunçalp; Jane Noyes; Andrew Booth; Arash Rashidian; Signe Flottorp; Simon Lewin

Background: The GRADE-CERQual (Grading of Recommendations Assessment, Development and Evaluation-Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE working group to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) relevance, (3) coherence and (4) adequacy of data. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's coherence component.

Methods: We developed the coherence component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual coherence component within several qualitative evidence syntheses before agreeing on the current definition and principles for application.

Results: When applying CERQual, we define coherence as how clear and cogent the fit is between the data from the primary studies and a review finding that synthesises that data. In this paper, we describe the coherence component and its rationale and offer guidance on how to assess coherence in the context of a review finding as part of the CERQual approach. This guidance outlines the information required to assess coherence, the steps that need to be taken to assess coherence and examples of coherence assessments.

Conclusions: This paper provides guidance for review authors and others on undertaking an assessment of coherence in the context of the CERQual approach. We suggest that threats to coherence may arise when the data supporting a review finding are contradictory, ambiguous or incomplete or where competing theories exist that could be used to synthesise the data. We expect the CERQual approach, and its individual components, to develop further as our experiences with the practical implementation of the approach increase.


Implementation Science | 2018

Applying GRADE-CERQual to qualitative evidence synthesis findings — paper 5: how to assess adequacy of data.

Claire Glenton; Benedicte Carlsen; Simon Lewin; Heather Munthe-Kaas; Christopher J. Colvin; Özge Tunçalp; Meghan A. Bohren; Jane Noyes; Andrew Booth; Ruth Garside; Arash Rashidian; Signe Flottorp; Megan Wainwright

Background: The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) working group to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations; (2) coherence; (3) adequacy of data; and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's adequacy of data component.

Methods: We developed the adequacy of data component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual adequacy of data component within several qualitative evidence syntheses before agreeing on the current definition and principles for application.

Results: When applying CERQual, we define adequacy of data as an overall determination of the degree of richness and the quantity of data supporting a review finding. In this paper, we describe the adequacy component and its rationale and offer guidance on how to assess data adequacy in the context of a review finding as part of the CERQual approach. This guidance outlines the information required to assess data adequacy, the steps that need to be taken to assess data adequacy, and examples of adequacy assessments.

Conclusions: This paper provides guidance for review authors and others on undertaking an assessment of adequacy in the context of the CERQual approach. We approach assessments of data adequacy in terms of the richness and quantity of the data supporting each review finding, but do not offer fixed rules regarding what constitutes sufficiently rich data or an adequate quantity of data. Instead, we recommend that this assessment is made in relation to the nature of the finding. We expect the CERQual approach, and its individual components, to develop further as our experiences with the practical implementation of the approach increase.


Implementation Science | 2018

Applying GRADE-CERQual to qualitative evidence synthesis findings—paper 3: how to assess methodological limitations

Heather Munthe-Kaas; Meghan A. Bohren; Claire Glenton; Simon Lewin; Jane Noyes; Özge Tunçalp; Andrew Booth; Ruth Garside; Christopher J. Colvin; Megan Wainwright; Arash Rashidian; Signe Flottorp; Benedicte Carlsen

Background: The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's methodological limitations component.

Methods: We developed the methodological limitations component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual methodological limitations component within several qualitative evidence syntheses before agreeing on the current definition and principles for application.

Results: When applying CERQual, we define methodological limitations as the extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding. In this paper, we describe the methodological limitations component and its rationale and offer guidance on how to assess methodological limitations of a review finding as part of the CERQual approach. This guidance outlines the information required to assess the methodological limitations component, the steps that need to be taken to assess methodological limitations of data contributing to a review finding and examples of methodological limitations assessments.

Conclusions: This paper provides guidance for review authors and others on undertaking an assessment of methodological limitations in the context of the CERQual approach. More work is needed to determine which criteria critical appraisal tools should include when assessing methodological limitations. We currently recommend that whichever tool is used, review authors provide a transparent description of their assessments of methodological limitations in a review finding. We expect the CERQual approach and its individual components to develop further as our experiences with the practical implementation of the approach increase.

Collaboration

Top co-authors of Heather Munthe-Kaas:

Claire Glenton, Norwegian Institute of Public Health
Simon Lewin, South African Medical Research Council
Andrew Booth, University of Sheffield
Rigmor C. Berg, Norwegian Institute of Public Health
Arash Rashidian, World Health Organization