Publication


Featured research published by David A. Chambers.


Administration and Policy in Mental Health | 2009

Implementation Research in Mental Health Services: An Emerging Science with Conceptual, Methodological, and Training Challenges

Enola K. Proctor; John Landsverk; Gregory A. Aarons; David A. Chambers; Charles Glisson; Brian S. Mittman

One of the most critical issues in mental health services research is the gap between what is known about effective treatment and what is provided to consumers in routine care. Concerted efforts are required to advance implementation science and produce skilled implementation researchers. This paper seeks to advance implementation science in mental health services by providing an overview of the emergence of implementation as an issue for research, by addressing key issues of language and conceptualization, by presenting a heuristic skeletal model for the study of implementation processes, and by identifying the implications for research and training in this emerging field.


American Journal of Preventive Medicine | 2012

Bridging Research and Practice: Models for Dissemination and Implementation Research

Rachel G. Tabak; Elaine C. Khoong; David A. Chambers; Ross C. Brownson

Context: Theories and frameworks (hereafter called models) enhance dissemination and implementation (D&I) research by making the spread of evidence-based interventions more likely. This work organizes and synthesizes these models by (1) developing an inventory of models used in D&I research; (2) synthesizing this information; and (3) providing guidance on how to select a model to inform study design and execution.

Evidence acquisition: This review began with commonly cited models and model developers and used snowball sampling to collect models developed in any year from journal articles, presentations, and books. All models were analyzed and categorized in 2011 based on three author-defined variables: construct flexibility, focus on dissemination and/or implementation activities (D/I), and the socioecologic framework (SEF) level. Five-point scales were used to rate construct flexibility from broad to operational and D/I activities from dissemination-focused to implementation-focused. All SEF levels (system, community, organization, and individual) applicable to a model were also extracted. Models that addressed policy activities were noted.

Evidence synthesis: Sixty-one models were included in this review. Each of the five categories in the construct flexibility and D/I scales had at least four models. Models were distributed across all levels of the SEF; the fewest models (n=8) addressed policy activities. To assist researchers in selecting and utilizing a model throughout the research process, the authors present and explain examples of how models have been used.

Conclusions: These findings may enable researchers to better identify and select models to inform their D&I work.


American Journal of Public Health | 2012

National Institutes of Health Approaches to Dissemination and Implementation Science: Current and Future Directions

Russell E. Glasgow; Cynthia Vinson; David A. Chambers; Muin J. Khoury; Robert M. Kaplan; Christine Hunter

To address the vast gap between current knowledge and practice in the area of dissemination and implementation research, we address terminology, provide examples of successful applications of this research, discuss key sources of support, and highlight directions and opportunities for future advances. There is a need for research testing approaches to scaling up and sustaining effective interventions, and we propose that further advances in the field will be achieved by focusing dissemination and implementation research on 5 core values: rigor and relevance, efficiency, collaboration, improved capacity, and cumulative knowledge.


Clinical and Translational Science | 2012

Developing Robust, Sustainable, Implementation Systems Using Rigorous, Rapid and Relevant Science

Russell E. Glasgow; David A. Chambers

Background: Current approaches to medical science generally have not resulted in rapid, robust integration into feasible, sustainable real world healthcare programs and policies. Implementation science risks falling short of expectations if it aligns with historical norms. Fundamentally different scientific approaches are needed to accelerate such integration.


Implementation Science | 2013

The U.S. Training Institute for Dissemination and Implementation Research in Health

Helen I. Meissner; Russell E. Glasgow; Cynthia Vinson; David A. Chambers; Ross C. Brownson; Lawrence W. Green; Alice S. Ammerman; Bryan J. Weiner; Brian S. Mittman

Background: The science of dissemination and implementation (D&I) is advancing the knowledge base for how best to integrate evidence-based interventions within clinical and community settings and how to recast the nature or conduct of the research itself to make it more relevant and actionable in those settings. While the field is growing, there are only a few training programs for D&I research; these are an important avenue to help build the field's capacity. To improve the United States' capacity for D&I research, the National Institutes of Health and Veterans Health Administration collaborated to develop a five-day training institute for postdoctoral-level applicants aspiring to advance this science.

Methods: We describe the background, goals, structure, curriculum, application process, trainee evaluation, and future plans for the Training in Dissemination and Implementation Research in Health (TIDIRH).

Results: The TIDIRH used a five-day residential immersion to maximize opportunities for trainees and faculty to interact. The train-the-trainer-like approach was intended to equip participants with materials that they could readily take back to their home institutions to increase interest and further investment in D&I. The TIDIRH curriculum included a balance of structured large-group discussions and interactive small-group sessions. Thirty-five of 266 applicants for the first annual training institute were accepted from a variety of disciplines, including psychology (12 trainees), medicine (6 trainees), epidemiology (5 trainees), health behavior/health education (4 trainees), and 1 trainee each from education and human development, health policy and management, health services research, public health studies, public policy, and social work, with a maximum of two individuals from any one institution. The institute was rated as very helpful by attendees, and by six months after the institute a follow-up survey (97% return rate) revealed that 72% had initiated a new grant proposal in D&I research; 28% had received funding; and 77% had used skills from TIDIRH to influence their peers from different disciplines about D&I research through building local research networks, organizing formal presentations and symposia, teaching, and leading interdisciplinary teams to conduct D&I research.

Conclusions: The initial TIDIRH training was judged successful by trainee evaluation at the conclusion of the week's training and at six-month follow-up, and plans are to continue and possibly expand the TIDIRH in coming years. Strengths are seen as the residential format, the quality of the faculty and their flexibility in adjusting content to meet trainee needs, and the highlighting of concrete D&I examples by the local host institution, which rotates annually. Lessons learned and plans for future TIDIRH trainings are summarized.


JAMA | 2016

Convergence of Implementation Science, Precision Medicine, and the Learning Health Care System: A New Model for Biomedical Research

David A. Chambers; W. Gregory Feero; Muin J. Khoury

This Viewpoint discusses the integration of precision medicine discoveries with the learning health care system via implementation science.


Clinical and Translational Science | 2014

Big Data and Large Sample Size: A Cautionary Note on the Potential for Bias

Robert M. Kaplan; David A. Chambers; Russell E. Glasgow

A number of commentaries have suggested that large studies are more reliable than smaller studies, and there is growing interest in the analysis of “big data” that integrates information from many thousands of persons and/or different data sources. We consider a variety of biases that are likely in the era of big data, including sampling error, measurement error, multiple comparisons errors, aggregation error, and errors associated with the systematic exclusion of information. Using examples from epidemiology, health services research, studies on determinants of health, and clinical trials, we conclude that greater caution is needed to ensure that big sample size does not lead to big inferential errors. Despite the advantages of big studies, large sample size can magnify the bias associated with error resulting from sampling or study design.


Implementation Science | 2013

Designing a valid randomized pragmatic primary care implementation trial: The My Own Health Report (MOHR) project

Alex H. Krist; Beth A. Glenn; Russell E. Glasgow; Bijal A. Balasubramanian; David A. Chambers; Maria E. Fernandez; Suzanne Heurtin-Roberts; Rodger Kessler; Marcia G. Ory; Siobhan M. Phillips; Debra P. Ritzwoller; Dylan H. Roby; Hector P. Rodriguez; Roy T. Sabo; Sherri Sheinfeld Gorin; Kurt C. Stange

Background: There is a pressing need for greater attention to patient-centered health behavior and psychosocial issues in primary care, and for practical tools, study designs, and results of clinical and policy relevance. Our goal is to design a scientifically rigorous and valid pragmatic trial to test whether primary care practices can systematically implement the collection of patient-reported information and provide patients needed advice, goal setting, and counseling in response.

Methods: This manuscript reports on the iterative design of the My Own Health Report (MOHR) study, a cluster randomized delayed intervention trial. Nine pairs of diverse primary care practices will be randomized to early intervention or delayed intervention four months later. The intervention consists of fielding the MOHR assessment, which addresses 10 domains of health behaviors and psychosocial issues, and subsequently providing needed counseling and support for patients presenting for wellness or chronic care. As a pragmatic participatory trial, stakeholder groups including practice partners and patients have been engaged throughout the study design to account for local resources and characteristics. Participatory tasks include identifying MOHR assessment content, refining the study design, providing input on outcome measures, and designing the implementation workflow. Study outcomes include intervention reach (percent of patients offered and completing the MOHR assessment), effectiveness (patients reporting being asked about topics, setting change goals, and receiving assistance in early versus delayed intervention practices), contextual factors influencing outcomes, and intervention costs.

Discussion: The MOHR study shows how a participatory design can be used to promote the consistent collection and use of patient-reported health behavior and psychosocial assessments in a broad range of primary care settings. While pragmatic in nature, the study design will allow valid comparisons to answer the posed research question, and findings will be broadly generalizable to a range of primary care settings. Per the pragmatic explanatory continuum indicator summary (PRECIS) framework, the study design is substantially more pragmatic than other published trials. The methods and findings should be of interest to researchers, practitioners, and policy makers attempting to make healthcare more patient-centered and relevant.

Trial registration: ClinicalTrials.gov NCT01825746


Psychiatric Services | 2013

Research and Services Partnerships: Partnership: A Fundamental Component of Dissemination and Implementation Research

David A. Chambers; Susan T. Azrin

This column describes the essential role of partnerships in the conduct of dissemination and implementation (D&I) research. This research field, which develops knowledge to support the integration of health information and evidence-based practices, has thrived in recent years through research initiatives by federal agencies, states, foundations, and other funders. The authors describe three ongoing studies anchored in research partnerships to improve the implementation of effective practices within various service systems. Inherent in the challenge of introducing evidence-based practices in clinical and community settings is the participation of a wide range of stakeholders who may influence D&I efforts. Opportunities to enhance partnerships in D&I research are described, specifically in light of recent initiatives led by the National Institutes of Health. Partnerships remain a crucial component of successful D&I research. The future of the field depends on the ability to utilize partnerships to conduct more rigorous and robust research.


Implementation Science | 2013

The Implementation Research Institute: Training Mental Health Implementation Researchers in the United States

Enola K. Proctor; John Landsverk; Ana A. Baumann; Brian S. Mittman; Gregory A. Aarons; Ross C. Brownson; Charles Glisson; David A. Chambers

Background: The Implementation Research Institute (IRI) provides two years of training in mental health implementation science for 10 new fellows each year. The IRI is supported by a National Institute of Mental Health (NIMH) R25 grant and the Department of Veterans Affairs (VA). Fellows attend two annual week-long trainings at Washington University in St. Louis. Training is provided through a rigorous curriculum, local and national mentoring, a ‘learning site visit’ to a federally funded implementation research project, pilot research, and grant writing.

Methods: This paper describes the rationale, components, outcomes to date, and participant experiences with the IRI.

Results: IRI outcomes include 31 newly trained implementation researchers, their new grant proposals, contributions to other national dissemination and implementation research training, and publications in implementation science authored by the core faculty and fellows. Former fellows have obtained independent research funding in implementation science and are beginning to serve as mentors for more junior investigators.

Conclusions: Based on the number of implementation research grant proposals and papers produced by fellows to date, the IRI is proving successful in preparing new researchers who can inform the process of making evidence-based mental healthcare more available in real-world settings of care and who are advancing the field of implementation science.

Collaboration


David A. Chambers's most frequent collaborators and their affiliations.

Top Co-Authors

Ross C. Brownson
Washington University in St. Louis

Russell E. Glasgow
University of Colorado Denver

Enola K. Proctor
Washington University in St. Louis

Cynthia Vinson
National Institutes of Health

Gila Neta
National Institutes of Health

Muin J. Khoury
Office of Public Health Genomics

Wynne E. Norton
University of Alabama at Birmingham

Robert M. Kaplan
National Institutes of Health

Kurt C. Stange
Case Western Reserve University