
Publications


Featured research published by Lisa M. Hagermoser Sanetti.


School Psychology Quarterly | 2013

Implementation Science and School Psychology

Susan G. Forman; Edward S. Shapiro; Robin S. Codding; Jorge E. Gonzales; Linda A. Reddy; Sylvia Rosenfield; Lisa M. Hagermoser Sanetti; Karen Callan Stoiber

The APA Division 16 Working Group on Translating Science to Practice contends that implementation science is essential to the process of translating evidence-based interventions (EBIs) into the unique context of the schools, and that increasing attention to implementation will lead to the improvement of school psychological services and school learning environments. Key elements of implementation and implementation science are described. Four critical issues for implementation science in school psychology are presented: barriers to implementation, improving intervention fidelity and identifying core intervention components, implementation with diverse client populations, and implementation in diverse settings. What is known and what researchers need to investigate for each set of issues is addressed. A discussion of implementation science methods and measures is included. Finally, implications for research, training and practice are presented.


Exceptional Children | 2015

Is Performance Feedback for Educators an Evidence-Based Practice? A Systematic Review and Evaluation Based on Single-Case Research

Lindsay M. Fallon; Melissa A. Collier-Meek; Daniel M. Maggin; Lisa M. Hagermoser Sanetti; Austin H. Johnson

Optimal levels of treatment fidelity, a critical moderator of intervention effectiveness, are often difficult to sustain in applied settings. It is unknown whether performance feedback, a widely researched method for increasing educators’ treatment fidelity, is an evidence-based practice. The purpose of this review was to evaluate the current research on performance feedback as a strategy to promote the implementation of school-based practices. Studies were evaluated according to What Works Clearinghouse (WWC; Kratochwill et al., 2010) technical guidelines for single-case design, utilizing both the design and evidence standards to determine whether studies provided sufficient evidence for the effectiveness of performance feedback. Results indicate that performance feedback can be termed an evidence-based intervention based on criteria set by the WWC. Implications for future research are described.


Journal of Educational and Psychological Consultation | 2011

Treatment Integrity Assessment: How Estimates of Adherence, Quality, and Exposure Influence Interpretation of Implementation

Lisa M. Hagermoser Sanetti; Lindsay M. Fallon

The assessment of treatment integrity is important for determining the effectiveness of an intervention. Treatment integrity, a multidimensional construct, can be evaluated in two ways: by session or component. In this study, adherence, quality, and exposure data are estimated using permanent product data from implementation of the PAX Good Behavior Game to demonstrate how varied assessment of treatment integrity can influence interpretation of implementation. Implications for implementers and school-based decisions are discussed.


Journal of Positive Behavior Interventions | 2012

Treatment Integrity of Interventions With Children in the Journal of Positive Behavior Interventions From 1999 to 2009

Lisa M. Hagermoser Sanetti; Lisa M. Dobey; Katie L. Gritter

For more than 10 years, the Journal of Positive Behavior Interventions has published, among other types of articles, behavioral intervention outcome studies related to positive behavior support. Operationally defining interventions is important to facilitating replication studies and the adoption of interventions in applied settings. Furthermore, treatment integrity data are necessary to make valid claims that changes in outcomes resulted from intervention implementation and are thus essential to the internal validity of intervention outcome research. Reviews of treatment outcome research in related fields (e.g., applied behavior analysis) indicate that although many researchers operationally define interventions, a majority fail to report treatment integrity data. The purpose of this study was to review the treatment integrity data reported in all experimental intervention studies published in the Journal of Positive Behavior Interventions between 1999 and 2009. Results indicate that in recent years, a majority of published studies include a definition of the independent variable but do not provide quantitative treatment integrity data.


Assessment for Effective Intervention | 2012

Barriers to Implementing Treatment Integrity Procedures in School Psychology Research: Survey of Treatment Outcome Researchers

Lisa M. Hagermoser Sanetti; Florence D. DiGennaro Reed

Treatment integrity data are essential to drawing valid conclusions in treatment outcome studies. Such data, however, are not always included in peer-reviewed research articles in school psychology or related fields. To gain a better understanding of why treatment integrity data are lacking in school psychology research, we surveyed the authors of the 210 treatment outcome articles published in four school psychology journals from 1995 through 2008 regarding their perceptions of barriers to implementing treatment integrity procedures. Results indicated that (a) lack of theory and specific guidelines on treatment integrity procedures; (b) lack of general knowledge about treatment integrity; (c) time, cost, and labor demands; and (d) lack of editorial requirement were broadly perceived by school psychology researchers as barriers to implementing treatment integrity procedures. Implications for future research are discussed.


Assessment for Effective Intervention | 2009

Extending Use of Direct Behavior Rating beyond Student Assessment: Applications to Treatment Integrity Assessment within a Multi-Tiered Model of School-Based Intervention Delivery

Lisa M. Hagermoser Sanetti; Sandra M. Chafouleas; Theodore J. Christ; Katie L. Gritter

To make valid decisions about intervention effectiveness in a tiered intervention system, it is essential to formatively assess treatment integrity along with student outcomes. Despite significant advances in technologies for ongoing assessment of student outcomes, research regarding treatment integrity assessment has not shared the same progress in that most available methods lack adequate psychometric evidence and require significant resources. Direct behavior rating has been discussed and evaluated as a tool for assessing student behavior, yet the technology could be extended to result in an efficient treatment integrity assessment method with utility in a multi-tiered model of school-based intervention delivery. This article reviews options in treatment integrity assessment, with emphasis on how direct behavior rating technology might be incorporated within a multi-tiered model of intervention delivery. Implications for both practice and research are addressed.


Education and Treatment of Children | 2014

School-Wide Positive Behavior Support (SWPBS) in the Classroom: Assessing Perceived Challenges to Consistent Implementation in Connecticut Schools

Lindsay M. Fallon; Scott R. McCarthy; Lisa M. Hagermoser Sanetti

The number of schools implementing school-wide positive behavior support (SWPBS) practices nationwide is increasing, but still little is known about the fidelity with which teachers implement SWPBS practices in the classroom. Specifically, data are needed that reflect the consistency with which classroom-based SWPBS practices are implemented, as well as challenges to implementation faced by school personnel, to ensure the best possible behavioral and academic outcomes for students. In this study, personnel in Connecticut schools implementing SWPBS (N = 171) were surveyed, and results indicate that although classroom-based SWPBS practices are implemented very consistently by the majority of respondents, certain practices are somewhat challenging to implement. Implications for improving practice and training are offered.


Journal of School Psychology | 2016

An exploratory investigation of teachers' intervention planning and perceived implementation barriers.

Anna C. J. Long; Lisa M. Hagermoser Sanetti; Melissa A. Collier-Meek; Jennifer Gallucci; Margaret Altschaefl; Thomas R. Kratochwill

Increasingly, teachers are the primary implementers responsible for providing evidence-based interventions to students. However, there is little knowledge regarding the extent to which teachers plan for intervention implementation, receive implementation support, or identify and address implementation barriers. This study explores survey data from over 1200 preschool through grade 12 teachers from 46 public school districts in a Northeastern state. Results indicate that teachers spend significant time engaging in intervention-related behavior and may be a primary source responsible for selecting student interventions. However, the current extent to which they plan for implementation and present levels of implementation support are inadequate to produce high levels of sustained intervention implementation. In addition, almost 60% of reported implementation barriers related to aspects of the intervention itself. Findings from this study provide guidance for future research and preliminary recommendations for ameliorating implementation barriers and proactively supporting treatment integrity in schools.


Behavioral Disorders | 2012

Training Paraeducators to Implement a Group Contingency Protocol: Direct and Collateral Effects

Daniel M. Maggin; Lindsay M. Fallon; Lisa M. Hagermoser Sanetti; Laura M. Ruberto

The present study investigated the effects of an intensive training protocol on levels of paraeducator fidelity to a group contingency intervention used to manage the classroom behavior of students with emotional and behavioral disorders (EBD). A multiple baseline design across classrooms was used to determine whether the training was associated with initial and sustained increases in treatment fidelity. Data were also collected on the effects of paraeducator use of the group contingency program on rates of paraeducator, teacher, and student behavior. Results indicated that the training package was associated with immediate increases in paraeducator fidelity, which were subsequently sustained following the removal of systematic performance feedback on paraeducator adherence to the protocol. The implementation of the group contingency program by paraeducators also led to increases in the rates of interactions between paraeducators and students, increases in the rates of teacher instruction, and decreases in the rates of aggressive behavior by students. Findings of the study are discussed within the context of developing effective training methods for paraeducators working alongside students with EBD.


Journal of Educational and Psychological Consultation | 2014

Assessment of Consultation and Intervention Implementation: A Review of Conjoint Behavioral Consultation Studies

Melissa A. Collier-Meek; Lisa M. Hagermoser Sanetti

Reviews of treatment outcome literature indicate treatment integrity is not regularly assessed. In consultation, two levels of treatment integrity (i.e., consultant procedural integrity [CPI] and intervention treatment integrity [ITI]) provide relevant implementation data. Specifically, assessment of CPI and ITI are necessary to conclude (a) consultation is functionally related to consultee implementation behavior and (b) intervention implementation is functionally related to student outcomes. In this article, study characteristics and the presence of treatment integrity at both levels are examined in 21 studies utilizing Conjoint Behavioral Consultation, a model of consultation that includes multiple consultees. Results indicate that in approximately half of studies, CPI, ITI, or both are assessed and, when reported, treatment integrity is moderately high across both levels. However, there are distinct differences in the assessment and reporting of these levels of treatment integrity. Limitations and implications for consultation research and treatment integrity reporting are discussed.

Collaboration


Top co-authors of Lisa M. Hagermoser Sanetti:

Melissa A. Collier-Meek

University of Massachusetts Amherst

Lindsay M. Fallon

Bridgewater State University

Thomas R. Kratochwill

University of Wisconsin-Madison

Anna C. J. Long

Louisiana State University

Daniel M. Maggin

University of Illinois at Chicago
