Network

Latest external collaborations at the country level.

Hotspot

Research topics in which Astrid Austvoll-Dahlgren is active.

Publication


Featured research published by Astrid Austvoll-Dahlgren.


Journal of Evidence-based Medicine | 2015

Key concepts that people need to understand to assess claims about treatment effects.

Astrid Austvoll-Dahlgren; Andrew D Oxman; Iain Chalmers; Allen Nsangi; Claire Glenton; Simon Lewin; Angela Morelli; Sarah Rosenbaum; Daniel Semakula; Nelson Sewankambo

People are confronted with claims about the effects of treatments and health policies daily. Our objective was to develop a list of concepts that may be important for people to understand when assessing claims about treatment effects.


Health Information and Libraries Journal | 2013

Development of a complex intervention to improve health literacy skills.

Astrid Austvoll-Dahlgren; Stein Ove Danielsen; Elin Opheim; Arild Bjørndal; Liv Merete Reinar; Signe Flottorp; Andrew D Oxman; Sølvi Helseth

Background: Providing insight into the developmental processes involved in building interventions is an important way to ensure methodological transparency and inform future research efforts. The objective of this study was to describe the development of a web portal designed to improve health literacy skills among the public.

Methods: The web portal was tailored to address three key barriers to obtaining information, using the conceptual frameworks of shared decision-making and evidence-based practice, and based on explicit criteria for selecting the content and form of the intervention.

Results: The web portal targeted the general public and took the form of structured sets of tools. Content included an introduction to research methods; help on how to find evidence-based health information efficiently, based on the steps of evidence-based practice; an introduction to critical appraisal; information about patient participation rights in decision-making; and a decision aid for consultations.

Conclusions: The web portal was designed in a systematic and transparent way and addresses key barriers to obtaining and acting upon reliable health information. It provides open access to the tools and can be used independently by health care users or during consultations with health professionals.


PLOS ONE | 2012

Evaluation of a web portal for improving public access to evidence-based health information and health literacy skills: a pragmatic trial.

Astrid Austvoll-Dahlgren; Arild Bjørndal; Jan Odgaard-Jensen; Sølvi Helseth

Background: Using the conceptual framework of shared decision-making and evidence-based practice, a web portal was developed to serve as a generic (non-disease-specific) tailored intervention to improve the lay public's health literacy skills.

Objective: To evaluate the effects of the web portal compared to no intervention in a real-life setting.

Methods: A pragmatic randomised controlled parallel trial using simple randomisation of 96 parents of children aged under 4 years. Parents were allocated to receive either access to the portal or no intervention, and were assigned three tasks to perform over a three-week period: a searching task, a critical appraisal task, and reporting on perceptions about participation. Data were collected from March through June 2011.

Results: Use of the web portal was found to improve attitudes towards searching for health information, the variable identified as the most important predictor of intention to search in both samples. Participants considered the web portal to have good usability, usefulness, and credibility. The intervention group showed slight increases in the use of evidence-based information, critical appraisal skills, and participation compared to the group receiving no intervention, but these differences were not statistically significant.

Conclusion: Although the study was underpowered, we found that the web portal may have a positive effect on attitudes towards searching for health information. Furthermore, participants considered the web portal to be a relevant tool. It is important to continue experimenting with web-based resources in order to increase user participation in health care decision-making.

Trial Registration: ClinicalTrials.gov NCT01266798


BMJ Open | 2017

Measuring ability to assess claims about treatment effects: the development of the ‘Claim Evaluation Tools’

Astrid Austvoll-Dahlgren; Daniel Semakula; Allen Nsangi; Andrew D Oxman; Iain Chalmers; Sarah Rosenbaum; Øystein Guttersrud

Objectives: To describe the development of the Claim Evaluation Tools, a set of flexible items to measure people's ability to assess claims about treatment effects.

Setting: Methodologists and members of the community (including children) in Uganda, Rwanda, Kenya, Norway, the UK and Australia.

Participants: In the iterative development of the items, we used purposeful sampling of people with training in research methodology, such as teachers of evidence-based medicine, as well as patients and members of the public from low-income and high-income countries. Development consisted of four processes: (1) determining the scope of the Claim Evaluation Tools and development of items; (2) expert item review and feedback (n=63); (3) cognitive interviews with children and adult end-users (n=109); and (4) piloting and administrative tests (n=956).

Results: The Claim Evaluation Tools database currently includes a battery of multiple-choice items. Each item begins with a scenario intended to be relevant across contexts, and the items can be used with children (aged 10 and above), adult members of the public and health professionals. People with expertise in research methods judged the items to have face validity, and end-users judged them relevant and acceptable in their settings. In response to feedback from methodologists and end-users, we simplified some text, explained terms where needed, and redesigned formats and instructions.

Conclusions: The Claim Evaluation Tools database is a flexible resource from which researchers, teachers and others can design measurement instruments to meet their own requirements. These evaluation tools are being managed and made freely available for non-commercial use (on request) through Testing Treatments interactive (testingtreatments.org).

Trial registration numbers: PACTR201606001679337 and PACTR201606001676150; Pre-results.


PLOS ONE | 2017

Establishing a library of resources to help people understand key concepts in assessing treatment claims: The "Critical thinking and Appraisal Resource Library" (CARL).

John Castle; Iain Chalmers; Patricia Atkinson; Douglas Badenoch; Andrew D Oxman; Astrid Austvoll-Dahlgren; Lena Nordheim; L Kendall Krause; Lisa M. Schwartz; Steven Woloshin; Amanda Burls; Paola Mosconi; Tammy Hoffmann; Leila Cusack; Loai Albarqouni; Paul Glasziou

Background: People are frequently confronted with untrustworthy claims about the effects of treatments. Uncritical acceptance of these claims can lead to poor, and sometimes dangerous, treatment decisions, and wasted time and money. Resources to help people learn to think critically about treatment claims are scarce and widely scattered. Furthermore, very few learning resources have been assessed to see if they improve knowledge and behavior.

Objectives: Our objective was to develop the Critical thinking and Appraisal Resource Library (CARL): a database of learning resources for those who are responsible for encouraging critical thinking about treatment claims, to be made available online. We wished to include resources for groups we identified as 'intermediaries' of knowledge, i.e. teachers of schoolchildren, undergraduates and graduates, for example those teaching evidence-based medicine, or those communicating treatment claims to the public. In selecting resources, we wished to draw particular attention to those that had been formally evaluated, for example by the creators of the resource or by independent research groups.

Methods: CARL was populated with learning resources identified from a variety of sources: two previously developed but unmaintained inventories; systematic reviews of learning interventions; online and database searches; and recommendations by members of the project group and its advisors. The learning resources in CARL were organised by the 'Key Concepts' needed to judge the trustworthiness of treatment claims, and were made available online by the James Lind Initiative in Testing Treatments interactive (TTi) English (www.testingtreatments.org/category/learning-resources). TTi English also incorporated the database of Key Concepts and the Claim Evaluation Tools developed through the Informed Health Choices (IHC) project (informedhealthchoices.org).

Results: We have created a database of resources called CARL, which currently contains over 500 open-access learning resources in a variety of formats: text, audio, video, webpages, cartoons, and lesson materials. These are aimed primarily at 'intermediaries', that is, teachers, communicators, advisors and researchers, as well as at independent learners. The resources included in CARL are currently accessible at www.testingtreatments.org/category/learning-resources.

Conclusions: We hope that ready access to CARL will help to promote the critical thinking about treatment claims needed to improve healthcare choices.


BMJ Open | 2017

Measuring ability to assess claims about treatment effects: a latent trait analysis of items from the 'Claim Evaluation Tools' database using Rasch modelling

Astrid Austvoll-Dahlgren; Øystein Guttersrud; Allen Nsangi; Daniel Semakula; Andrew D Oxman

Background: The Claim Evaluation Tools database contains multiple-choice items for measuring people's ability to apply the key concepts they need to know to be able to assess treatment claims. We assessed items from the database using Rasch analysis to develop an outcome measure for use in two randomised trials in Uganda. Rasch analysis is a form of psychometric testing relying on item response theory. It is a dynamic way of developing outcome measures that are valid and reliable.

Objectives: To assess the validity, reliability and responsiveness of 88 items addressing 22 key concepts using Rasch analysis.

Participants: We administered four sets of multiple-choice items in English to 1114 people in Uganda and Norway, of whom 685 were children and 429 were adults (including 171 health professionals). We scored all items dichotomously. We explored summary and individual fit statistics using the RUMM2030 analysis package. We used SPSS to perform distractor analysis.

Results: Most items conformed well to the Rasch model, but some items needed revision. Overall, the four item sets had satisfactory reliability. We did not identify significant response dependence between any pairs of items and, overall, the magnitude of multidimensionality in the data was acceptable. The items had a high level of difficulty.

Conclusion: Most of the items conformed well to the Rasch model's expectations. Following revision of some items, we concluded that most of the items were suitable for use in an outcome measure for evaluating the ability of children or adults to assess treatment claims.
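For context, the standard dichotomous Rasch model referred to in this abstract (this is general psychometric background, not a formula reported in the paper) gives the probability that person n answers item i correctly as a function of the person's ability θ_n and the item's difficulty b_i:

```latex
P(X_{ni} = 1 \mid \theta_n, b_i) = \frac{e^{\,\theta_n - b_i}}{1 + e^{\,\theta_n - b_i}}
```

Items "fit" the model when observed response patterns match these expected probabilities across ability levels; the reported "high level of difficulty" means the estimated item difficulties b_i sat above most respondents' ability estimates θ_n.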


Evidence-based Medicine | 2018

Key Concepts for Informed Health Choices: a framework for helping people learn how to assess treatment claims and make informed choices

Iain Chalmers; Andrew D Oxman; Astrid Austvoll-Dahlgren; Selena Ryan-Vig; Sarah Pannell; Nelson Sewankambo; Daniel Semakula; Allen Nsangi; Loai Albarqouni; Paul Glasziou; Kamal R Mahtani; David Nunan; Carl Heneghan; Douglas Badenoch

Many claims about the effects of treatments, though well intentioned, are wrong. Indeed, they are sometimes deliberately misleading to serve interests other than the well-being of patients and the public. People need to know how to spot unreliable treatment claims so that they can protect themselves and others from harm. The ability to assess the trustworthiness of treatment claims is often lacking. Acquiring this ability depends on being familiar with, and correctly applying, some key concepts, for example, that 'association is not the same as causation'. The Informed Health Choices (IHC) Project has identified 36 such concepts and shown that people can be taught to use them in decision making. A randomised trial in Uganda, for example, showed that primary school children with poor reading skills could be taught to apply 12 of the IHC Key Concepts. The list of IHC Key Concepts has proven to be effective in providing a framework for developing and evaluating IHC resources to help children think critically about treatment claims. The list also provides a framework for retrieving, coding and organising other teaching and learning materials for learners of any age. It should help teachers, researchers, clinicians, and patients to structure critical thinking about the trustworthiness of claims about treatment effects.


Trials | 2017

Can an educational podcast improve the ability of parents of primary school children to assess the reliability of claims made about the benefits and harms of treatments: study protocol for a randomised controlled trial

Daniel Semakula; Allen Nsangi; Matthew Prescott Oxman; Astrid Austvoll-Dahlgren; Sarah Rosenbaum; Margaret Kaseje; Laetitia Nyirazinyoye; Atle Fretheim; Iain Chalmers; Andrew D Oxman; Nelson Sewankambo

Background: Claims made about the effects of treatments are very common in the media and in the population more generally. The ability of individuals to understand and assess such claims can affect their decisions and health outcomes. Many people in both low- and high-income countries have an inadequate ability to assess information about the effects of treatments. As part of the Informed Health Choices project, we have prepared a series of podcast episodes to help improve people's ability to assess claims made about treatment effects. We will evaluate the effect of the Informed Health Choices podcast on people's ability to assess claims made about the benefits and harms of treatments. Our study population will be parents of primary school children in schools with limited educational and financial resources in Uganda.

Methods: This will be a two-arm, parallel-group, individually randomised trial. We will randomly allocate consenting participants who meet the inclusion criteria to either listen to nine episodes of the Informed Health Choices podcast (intervention) or to listen to nine typical public service announcements about health issues (control). Each podcast episode includes a story about a treatment claim, a message about one key concept that we believe is important for people to understand in order to assess treatment claims, an explanation of how that concept applies to the claim, and a second example illustrating the concept. We designed the Claim Evaluation Tools to measure people's ability to apply key concepts related to assessing claims made about the effects of treatments and making informed health care choices. The Claim Evaluation Tools that we will use include multiple-choice questions addressing each of the nine concepts covered by the podcast. Using the Claim Evaluation Tools, we will measure two primary outcomes: (1) the proportion of participants who 'pass', based on an absolute standard, and (2) the average score.

Discussion: As far as we are aware, this is the first randomised trial to assess the use of mass media to promote understanding of the key concepts needed to judge claims made about the effects of treatments.

Trial registration: Pan African Clinical Trials Registry, PACTR201606001676150. Registered on 12 June 2016. http://www.pactr.org/ATMWeb/appmanager/atm/atmregistry?dar=true&tNo=PACTR201606001676150


BMJ | 2017

Critical thinking in healthcare and education

Jonathan M Sharples; Andrew D Oxman; Kamal R Mahtani; Iain Chalmers; Sandy Oliver; Kevan Collins; Astrid Austvoll-Dahlgren; Tammy Hoffmann

Critical thinking is just one skill crucial to evidence-based practice in healthcare and education, write Jonathan Sharples and colleagues, who see exciting opportunities for cross-sector collaboration.


Journal of Evidence-based Medicine | 2018

“A waste of time without patients”: The views of patient representatives attending a workshop in evidence‐based practice

Astrid Austvoll-Dahlgren; Marit Johansen

Shared decision-making is a central element of evidence-based practice (EBP). Training in EBP has traditionally focused on providers, but there is increasing interest in developing such educational resources for patients. The aim of this study was to explore the views of patient representatives attending a workshop in EBP.

Collaboration


Dive into Astrid Austvoll-Dahlgren's collaborations.

Top Co-Authors

Andrew D Oxman
Norwegian Institute of Public Health

Gyri Synnøve Hval Straumann
Norwegian Institute of Public Health

Sarah Rosenbaum
Norwegian Institute of Public Health

Atle Fretheim
Norwegian Institute of Public Health

Claire Glenton
Norwegian Institute of Public Health

Gunn Elisabeth Vist
Norwegian Institute of Public Health

Marit Johansen
Norwegian Institute of Public Health