Publication


Featured research published by Brian K. Bumbarger.


Journal of Children's Services | 2008

After randomised trials: issues related to dissemination of evidence‐based interventions

Brian K. Bumbarger; Daniel F. Perkins

Demonstrating the efficacy and effectiveness of prevention programmes in rigorous randomised trials is only the beginning of a process that may lead to better public health outcomes. Although a growing number of programmes have been shown to be effective at reducing drug use and delinquency among young people under carefully controlled conditions, we are now faced with a new set of obstacles. First, these evidence‐based programmes are still under‐utilised compared to prevention strategies with no empirical support. Second, when effective programmes are used the evidence suggests they are not being implemented with quality and fidelity. Third, effective programmes are often initiated with short‐term grant funding, creating a challenge for sustainability beyond seed funding. We discuss each of these challenges, and present lessons learned from a large‐scale dissemination effort involving over 140 evidence‐based programme replications in one state in the US.


The Journal of Primary Prevention | 2013

Examining Adaptations of Evidence-Based Programs in Natural Contexts

Julia E. Moore; Brian K. Bumbarger; Brittany Rhoades Cooper

When evidence-based programs (EBPs) are scaled up in natural, or non-research, settings, adaptations are commonly made. Given the fidelity-versus-adaptation debate, theoretical rationales have been provided for the pros and cons of adaptations. Yet the basis of this debate is theoretical; thus, empirical evidence is needed to understand the types of adaptations made in natural settings. In the present study, we introduce a taxonomy for understanding adaptations. This taxonomy addresses several aspects of adaptations made to programs, including the fit (philosophical or logistical), timing (proactive or reactive), and valence, or the degree to which the adaptations align with the program’s goals and theory (positive, negative, or neutral). Self-reported qualitative data from communities delivering one of ten state-funded EBPs were coded based on the taxonomy constructs; additionally, quantitative data were used to examine the types of, and reasons for, adaptations made under natural conditions. Forty-four percent of respondents reported making adaptations. Adaptations to the procedures, dosage, and content were cited most often. Lack of time, limited resources, and difficulty retaining participants were listed as the most common reasons for making adaptations. Most adaptations were made reactively, as a result of issues of logistical fit, and were not aligned with, or deviated from, the program’s goals and theory.


Prevention Science | 2010

Sustaining Evidence-based Interventions Under Real-world Conditions: Results from a Large-scale Diffusion Project

Melissa Tibbits; Brian K. Bumbarger; Sandee J. Kyler; Daniel F. Perkins

This study examined factors associated with the predicted and actual post-funding sustainability of evidence-based interventions implemented as part of the Pennsylvania Commission on Crime and Delinquency’s Research-Based Delinquency and Violence Prevention Initiative. Correlates of predicted post-funding sustainability included program staff, overall school support, and school administrator support. Additionally, predicted post-funding sustainability was strongly associated with actual post-funding sustainability. Other correlates of actual post-funding sustainability included financial sustainability planning and aligning the intervention with the goals of the agency/school. Five years post-funding, 33% of the interventions were no longer operating, 22% were operating at a reduced level, and 45% were operating at the same level or a higher level than in the final year of funding. These findings are discussed in terms of implications for increasing intervention sustainability, as well as implications for future research on intervention sustainability.


American Journal of Community Psychology | 2012

The Role of a State-Level Prevention Support System in Promoting High-Quality Implementation and Sustainability of Evidence-Based Programs

Brittany L. Rhoades; Brian K. Bumbarger; Julia E. Moore

Although numerous evidence-based programs (EBPs) have been proven effective in research trials and are being widely promoted through federal, state, and philanthropic dollars, few have been “scaled up” in a manner likely to have a measurable impact on today’s critical social problems. The Interactive Systems Framework for Dissemination and Implementation (ISF) explicates three systems that are critical in addressing the barriers that prevent these programs from having their intended public health impact. In this article we describe the relevance of these systems in a real-world context with a specific focus on the Prevention Support System (PSS). We expand on the ISF model by presenting funders and policy-makers as active and engaged stakeholders, and demonstrate how a state-level PSS has used empirical evidence to inform general and program-specific capacity-building and support interactions among researchers, funders, and practitioners in Pennsylvania. By embracing this expanded ISF framework as a conceptual model for the wide-scale dissemination and support of EBPs, and recognizing the need for a distinct state-level PSS, Pennsylvania has created an infrastructure to effectively address the primary barriers to moving from lists of EBPs to achieving population-level public health improvement.


Administration and Policy in Mental Health | 2012

A State Agency-University Partnership for Translational Research and the Dissemination of Evidence-Based Prevention and Intervention

Brian K. Bumbarger; Elizabeth Morey Campbell

This article describes a decade-long partnership between the Prevention Research Center at Penn State and the Pennsylvania Commission on Crime and Delinquency. This partnership has evolved into a multi-agency initiative supporting the implementation of nearly 200 replications of evidence-based prevention and intervention programs, and a series of studies indicating a significant and sustained impact on youth outcomes and more efficient utilization of system resources. We describe how the collaboration has developed into a sophisticated prevention support infrastructure, discuss the partnership and policy lessons learned throughout this journey, and identify remaining issues in promoting this type of research–policy partnership.


Evaluation and Program Planning | 2015

Achieving successful evidence-based practice implementation in juvenile justice: the importance of diagnostic and evaluative capacity

Sarah Cusworth Walker; Brian K. Bumbarger; Stephen Phillippi

Evidence-based programs (EBPs) are an increasingly visible aspect of the treatment landscape in juvenile justice. Research demonstrates that such programs yield positive returns on investment and are replacing more expensive, less effective options. However, programs are unlikely to produce expected benefits when they are not well-matched to community needs, are not sustained, and do not achieve sufficient reach and scale. We argue that achieving these benchmarks for successful implementation will require state and county governments to invest in data-driven decision infrastructure in order to respond in a rigorous and flexible way to shifting political and funding climates. We conceptualize this infrastructure as diagnostic capacity and evaluative capacity: diagnostic capacity is defined as the ability to select appropriate programming, and evaluative capacity is defined as the ability to monitor and evaluate progress. Policy analyses of program implementation successes in Washington State, Pennsylvania, and Louisiana are used to illustrate the benefits of diagnostic and evaluative capacity as a critical element of EBP implementation.


Prevention Science | 2015

Readiness Assessment to Improve Program Implementation: Shifting the Lens to Optimizing Intervention Design.

Brian K. Bumbarger

The focus of this special issue is on identifying factors that predict a school’s likelihood of high implementation of social–emotional learning (SEL) interventions. In this case, these factors are interpreted as readiness to implement. Implementation readiness is defined by the editors as the capacity to implement an evidence-based intervention (EBI) effectively. Though not stated explicitly, readiness in this definition seems to be a characteristic of the implementers (i.e., teachers or schools). The model proposed by the editors, and reflected to varying degrees in each of the seven studies, involves documenting variables of teachers, classrooms, and schools that are predictive of high-quality implementation, and using those variables to create readiness profiles and tailor implementation supports. SEL programs have demonstrated convincing efficacy for improving the social and academic development of children (Durlak et al. 2011). As practitioners and policy makers become more convinced of the fundamental importance of SEL as a foundation for quality education and child development, the challenge of effectively scaling SEL programs and practices is becoming more critical and timely. This special issue addresses an important empirical question: Can we identify factors that represent “readiness” of a school (and its teachers and classrooms) to adopt an SEL program and deliver it with sufficient quality and fidelity to reproduce the improvements in social and academic outcomes demonstrated in controlled trials? The seven SEL implementation studies presented in this special issue depict a complex picture of delivering SEL programs in schools and assessing both implementation and impact. As a result, across these seven studies and other similar studies, we arrive at a long list of variables that may influence implementation quality, fidelity, and reach (which may in turn affect program effects and sustainment).
So what can be made of the complex readiness model characterized across these seven studies? The collective body of SEL implementation research, exemplified in the articles of this special issue and more broadly, serves both academic ends (i.e., increasing our generalizable knowledge) and utilitarian ends (i.e., the practical advancement of scaling SEL practice in schools). Though these goals often overlap and are advanced simultaneously, for the sake of clarity and space this commentary will primarily address the practical and pragmatic value and lessons of this special issue, with admittedly less attention paid to issues of methodology, analytic techniques, or study designs. Considering the lessons we can draw across these seven SEL implementation studies, we might start with the end in mind: what could we do if we arrived at a clear list of the most important predictors of high- (or low-) quality implementation? It is a herculean task to elucidate the characteristics of social–emotional development and subsequently use that knowledge to craft an intervention intended to promote such development. To further demonstrate, in the context of a rigorous experimental trial, that such an intervention can produce (both statistically and practically) significant improvements relative to a control condition is equally challenging, and is not accomplished without a commitment to sound theory and a solid understanding of the school and classroom context (Flay et al. 2005). So the achievement of each of these programs in demonstrating efficacy must be recognized.


Archive | 2017

Sustaining Crime Prevention at Scale: Transforming Delivery Systems Through Prevention Science

Ross Homel; Brian K. Bumbarger; Kate Freiberg; Sara Branch

In this chapter, we argue that to achieve sustained reductions in crime, violence, or injuries on a large scale, it is necessary to transform prevention delivery systems so that they conform in their practices, on a continuing basis, with scientific evidence. We explain and defend our proposition, drawing on examples and case studies from our own research and that of others. Although we highlight two very successful innovations (random breath testing in Australia and the Evidence-based Prevention and Intervention Support Center (EPISCenter) in Pennsylvania), we take the view that failures are as instructive as successes and include some brief examples of the former in our discussion. We also touch upon some of the lessons about Type 2 Translation from crime prevention initiatives that have been designed and implemented on the basis of contrasting theoretical models and empirical methods, including criminal justice approaches, situational initiatives, and community-based developmental interventions.


Prevention & Treatment | 2001

The Prevention of Mental Disorders in School-Aged Children: Current State of the Field

Mark T. Greenberg; Celene E. Domitrovich; Brian K. Bumbarger


Journal of Community Psychology | 2008

How do implementation efforts relate to program adherence? Examining the role of organizational, implementer, and program factors

Jacinda K. Dariotis; Brian K. Bumbarger; Larissa G. Duncan; Mark T. Greenberg

Collaboration


Dive into Brian K. Bumbarger's collaborations.

Top Co-Authors

Mark T. Greenberg (Pennsylvania State University)
Sandee J. Kyler (Pennsylvania State University)
Daniel F. Perkins (Pennsylvania State University)
Brittany L. Rhoades (Pennsylvania State University)
Celene Domitrovich (University of Illinois at Chicago)
Celene E. Domitrovich (Pennsylvania State University)
Damon E. Jones (Pennsylvania State University)