Publication


Featured research published by Stephanie M. Peterson.


Journal of Organizational Behavior Management | 2017

Evaluating the Temporal Location of Feedback: Providing Feedback Following Performance vs. Prior to Performance

Elian Aljadeff-Abergel; Stephanie M. Peterson; Rebecca R. Wiskirchen; Kristin K. Hagen; Mariah L. Cole

ABSTRACT In order to make feedback as effective as possible, it is important to understand its function in the three-term contingency (TTC) and the impact of various factors involved in delivering feedback. Timing of feedback is one factor that can affect its impact on a learner’s behavior. An analysis of the timing of feedback may help us better understand how feedback functions in the TTC. The purpose of this study was to evaluate the effects of feedback delivered at different temporal locations on the teaching performance of undergraduate psychology students. Specifically, feedback was provided either (a) immediately after a teaching session or (b) immediately before the following teaching session. The results indicated that feedback provided before the teaching session was more effective in improving teaching skills than feedback provided after the session. These findings suggest that feedback may function primarily as an antecedent to future performance and not necessarily as a consequence for past performance. However, the behavioral mechanism that explains these results is not yet clear. Future studies should investigate this further by manipulating the content of feedback delivered prior to the teaching performance.


Archive | 2012

Ethical Issues and Considerations

Alan Poling; Jennifer L. Austin; Stephanie M. Peterson; Amanda Mahoney; Marc Weeden

Functional behavioral assessment (FBA) refers to a range of methods designed to identify the environmental variables that control problematic behaviors. Methods for revealing these variables include indirect measures, such as interviews and questionnaires, and direct methods, such as narrative recording of the antecedents that precede responses of interest and the consequences that follow them. Many behavior analysts believe that the “gold standard” of FBA is experimental functional analysis (FA) (Iwata, Dorsey, Slifer, Bauman, & Richman, 1982/1994), which systematically arranges consequences for problem behaviors to identify their functions, that is, the reinforcers that maintain those behaviors. FBA is one of several ways of collecting information about clients, and professional organizations such as the American Psychological Association (APA) and the Behavior Analyst Certification Board (BACB) have established general ethical guidelines regarding how assessments should be conducted and interpreted. For example, Standard 9 of the Ethical Principles of Psychologists and Code of Conduct promulgated by the APA (2010) is devoted entirely to assessment. The same is true of Standard 3.0 of the Behavior Analyst Certification Board Guidelines for Responsible Conduct (BACB Guidelines, BACB 2011). That standard is presented in Table 13.1. Any practitioner who abides by the standards established there and elsewhere in the Guidelines is therefore behaving ethically, regardless of whether he or she is involved in functional assessment or another professional activity.


Journal of Organizational Behavior Management | 2015

Assessing Observer Effects on the Fidelity of Implementation of Functional Analysis Procedures

Sean Field; Jessica E. Frieder; Heather M. McGee; Stephanie M. Peterson; Arielle Duinkerken

Instructing and training others to implement functional analysis (FA) procedures can be a cumbersome and time-consuming task. Students and practitioners are required to learn all of the various components needed to establish conditions and analyze results while also learning to conduct the sessions. The current study assessed the fidelity of individuals implementing FA conditions after observing and rating the fidelity of video models, using a multiple-baseline design across FA conditions. Video models of each condition were provided throughout; however, participants were asked to provide fidelity ratings for only one video. Results demonstrated the intervention was successful in increasing participant performance above baseline levels for 16 of 17 participants, with 7 participants requiring an additional intervention in which they were asked to observe their own performance. Further research should evaluate the degree to which this procedure may effectively prepare individuals working with non-confederate children and carry over to other responses and settings.


Journal of the Experimental Analysis of Behavior | 2018

Assessing the repeatability of resurgence in humans: Implications for the use of within-subject designs

Kathryn M. Kestner; Claudia C. Diaz-Salvat; Claire C. St. Peter; Stephanie M. Peterson

Resurgence refers to the recurrence of a previously reinforced response following the worsening of reinforcement conditions (e.g., extinction) for an alternative response. Because of the implications for treatment relapse, researchers have become particularly interested in mitigating resurgence of human behavior. Some studies have employed reversal designs and varied parameters across replications (e.g., ABCADC) to compare effects of second-phase variables. Although resurgence is generally repeatable within and between subjects, the extent to which similar levels of resurgence occur across replications is less clear. To assess the repeatability of resurgence, we conducted a secondary analysis of 62 human-operant data sets using ABCABC reversal designs from two laboratories in the United States. We found significant reductions in the magnitude of resurgence during the second exposure to extinction relative to the first exposure when all other phase variables were held constant. These results suggest that researchers should exercise caution when using within-subject, across-phase replications to compare resurgence between variable manipulations with human participants.


Behavior Analysis in Practice | 2018

Using Microsoft Excel® to Build a Customized Partial-Interval Data Collection System

Cody A. Morris; Neil Deochand; Stephanie M. Peterson

Using data to inform treatment decisions is a hallmark of behavior analysis. However, collecting the type of data that behavior analysts often require can be a labor-intensive and time-consuming task. Electronic data collection systems have been identified as a tool to alleviate some of the issues related to data collection, but many obstacles still exist. Current limitations of electronic data collection systems include cost, adaptability, ease of use, and compliance with privacy and security guidelines. The purpose of this article is to offer practitioners an alternative to buying an electronic data collection system by providing a task analysis on how to build customized electronic data collection systems using Microsoft Excel®. This task analysis is written for individuals with limited or no experience working with Excel® but may also be of utility to individuals fluent in Excel®. This task analysis is organized into three sections: (a) creating a basic electronic data collection table with dropdown menus and autofill features, (b) creating a timestamp for all data entered, and (c) creating automatically graphing displays of data.
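The scoring rule that a partial-interval system like the one described above automates can be sketched in plain Python. This is a hypothetical illustration of the general partial-interval recording technique, not the authors' Excel task analysis: a session is divided into fixed-length intervals, and an interval is scored as an occurrence if the target behavior was observed at any point within it.

```python
def score_partial_interval(event_times, session_length, interval_length):
    """Score a session with partial-interval recording.

    An interval is marked True (occurrence) if at least one event
    falls anywhere within it, regardless of how many events occur
    or how long each one lasts.
    """
    n_intervals = -(-session_length // interval_length)  # ceiling division
    scores = [False] * n_intervals
    for t in event_times:
        if 0 <= t < session_length:
            scores[int(t // interval_length)] = True
    return scores

# Example: a 60-s session scored in 10-s intervals,
# with the behavior observed at 3 s, 7 s, and 42 s.
print(score_partial_interval([3, 7, 42], 60, 10))
# → [True, False, False, False, True, False]
```

Note that partial-interval recording tends to overestimate the duration of behavior (one brief event scores the whole interval), which is one reason practitioners pair it with the timestamping the article describes.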


Behavior Analysis: Research and Practice | 2017

Functional analysis: A need for clinical decision support tools to weigh risks and benefits.

Rebecca R. Wiskirchen; Neil Deochand; Stephanie M. Peterson

Appropriate safety precautions should be used when conducting a functional analysis, both to protect the client and to ensure ethical practice. There are few, if any, explicit protocols offered to behavioral practitioners that consolidate the safety recommendations documented in the literature. Furthermore, beyond collecting a list of safety considerations, guidelines as to when to apply these considerations are also lacking. What is needed is a more formal risk-benefit assessment procedure for practitioners. The term risk assessment can be used to describe both the safety considerations and the decision-making process used to ensure that a functional analysis is conducted in a safe, ethical, and empirically sound manner. We define the risk assessment, clarify the need to formalize this assessment, and discuss 4 specific domains that should be included in a risk assessment to identify potential risk to clients.


Journal of Developmental and Physical Disabilities | 2013

Behavioral Treatment of Bedwetting in an Adolescent with Autism

Nicole Henriksen; Stephanie M. Peterson


Research in Autism Spectrum Disorders | 2015

The effectiveness of self-management interventions for children with autism—A literature review

Elian Aljadeff-Abergel; Yannick A. Schenk; Christopher Walmsley; Stephanie M. Peterson; Jessica E. Frieder; Nicholas Acker


Behavior Analyst | 2018

A Review of SAFMEDS: Evidence for Procedures, Outcomes and Directions for Future Research

Shawn P. Quigley; Stephanie M. Peterson; Jessica E. Frieder; Kimberly M. Peck


Behavior Analysis: Research and Practice | 2018

Best practices and considerations for effective service provision via remote technology.

Denice Rios; Ellie Kazemi; Stephanie M. Peterson

Collaboration

Top co-authors of Stephanie M. Peterson:

Jessica E. Frieder, Western Michigan University
Neil Deochand, Western Michigan University
Yannick A. Schenk, Western Michigan University
Alan Poling, Western Michigan University
Amanda Mahoney, Western Michigan University
Arielle Duinkerken, Western Michigan University