Publication


Featured research published by Joseph A. Taylor.


International Journal of Science Education | 2017

Teacher pedagogical content knowledge, practice, and student achievement

Julie Gess-Newsome; Joseph A. Taylor; Janet Carlson; April L. Gardner; Christopher D. Wilson; Molly Stuhlsatz

In this exploratory study, we attempted to measure potential changes in teacher knowledge and practice as a result of an intervention, as well as trace such changes through a theoretical path of influence that could inform a model of teacher professional knowledge. We created an instrument to measure pedagogical content knowledge (PCK), studied the impact of a two-year professional development intervention, explored the relationships among teacher variables to attempt to validate a model of teacher professional knowledge, and examined the relationship of teacher professional knowledge and classroom practice on student achievement. Teacher professional knowledge and skill was measured in terms of academic content knowledge (ACK), general pedagogical knowledge (GenPK), PCK and teacher practice. Our PCK instrument identified two factors within PCK: PCK-content knowledge and PCK-pedagogical knowledge. Teacher gains existed for all variables. Only GenPK had a significant relationship to teacher practice. ACK was the only variable that explained a substantial portion of student achievement. Our findings provide empirical evidence that we interpret through the lens of the model of teacher professional knowledge and skill, including PCK [Gess-Newsome, J. (2015). A model of teacher professional knowledge and skill including PCK: Results of the thinking from the PCK summit. In A. Berry, P. Friedrichsen, & J. Loughran (Eds.), Re-examining pedagogical content knowledge in science education (pp. 28–42). London: Routledge Press], highlighting the complexity of measuring teacher professional knowledge and skill.


American Educational Research Journal | 2015

An Efficacy Trial of Research-Based Curriculum Materials With Curriculum-Based Professional Development

Joseph A. Taylor; Stephen R. Getty; Susan Kowalski; Christopher D. Wilson; Janet Carlson; Pamela Van Scotter

This study examined the efficacy of a curriculum-based intervention for high school science students. Specifically, the intervention was two years of research-based, multidisciplinary curriculum materials for science supported by comprehensive professional development for teachers that focused on those materials. A modest positive effect was detected when comparing outcomes from this intervention to those of business-as-usual materials and professional development. However, this effect was typical for interventions at this grade span that are tested using a state achievement test. Tests of mediation suggest a large treatment effect on teachers and in turn a strong effect of teacher practice on student achievement—reinforcing the hypothesized key role of teacher practice. Tests of moderation indicate no significant treatment by demographic interactions.


AERA Open | 2016

Design Parameters for Impact Research in Science Education

Jessaca Spybrook; Carl D. Westine; Joseph A. Taylor

The Common Guidelines for Education Research and Development were created jointly by the Institute of Education Sciences and the National Science Foundation to streamline education research and contribute to an accumulation of knowledge that will lead to improved student outcomes. One type of research that emerged in the guidelines is impact research. To achieve the level of rigor expected for an impact study, a research team will commonly employ a cluster randomized trial (CRT). This article provides empirical estimates of design parameters necessary for planning adequately powered CRTs focused on science achievement. Examples of how to use these parameters to improve the design of science impact studies are discussed.
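Among the design parameters such planning requires is the intraclass correlation (ICC), the share of outcome variance that lies between clusters. As an illustrative sketch (not taken from the article), the standard one-way ANOVA estimator of the ICC for balanced clusters can be computed as follows:

```python
from statistics import mean

def anova_icc(clusters):
    """One-way ANOVA estimate of the intraclass correlation (ICC)
    for balanced clusters (equal-length lists of student scores)."""
    k = len(clusters)      # number of clusters (e.g., schools)
    n = len(clusters[0])   # students per cluster
    grand = mean(x for c in clusters for x in c)
    # Between-cluster and within-cluster mean squares
    msb = n * sum((mean(c) - grand) ** 2 for c in clusters) / (k - 1)
    msw = sum((x - mean(c)) ** 2 for c in clusters for x in c) / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)
```

When all variation lies between clusters the estimate approaches 1; when cluster means are identical it can even dip below zero, a known property of the ANOVA estimator.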


International Journal of STEM Education | 2017

Design Principles for Effective Video-Based Professional Development

Kathleen Roth; Jody Bintz; Nicole I. Z. Wickler; Connie Hvidsten; Joseph A. Taylor; Paul M. Beardsley; Arlo Caine; Christopher D. Wilson

Background: Most studies of teacher professional development (PD) do not rigorously test impact on teaching practice and student learning. This makes it difficult to define what is truly “effective.” The Science Teachers Learning from Lesson Analysis (STeLLA) PD program, in contrast, was studied in a cluster randomized experimental design that examined impact on teaching practice and student learning. The STeLLA video-based PD (VbPD) program demonstrated significant impact, with high effect sizes, on elementary teachers’ science teaching practice and their students’ learning. Previously published reports provide details about research methods and findings but only broad sketches of the STeLLA program design and implementation. Deeper explorations of the STeLLA design principles can contribute evidence-based knowledge about the features of effective PD and enrich the existing but limited consensus model of effective PD. This article addresses the following questions: What design principles guided the development, implementation, leadership, and scaling up of a video-based PD program that had significant impact on student learning? What do the STeLLA design principles contribute to the existing knowledge base about effective video-based PD?

Results: Results from rigorous studies of the STeLLA program are summarized in this paper; details are reported elsewhere and included here as supplementary materials. This article is not a standard research results paper but instead describes the design principles guiding the development, implementation, leadership, and scaling up of the STeLLA VbPD program.

Conclusions: The authors argue that this set of design principles is powerful for four reasons: 1) its demonstrated impact on teaching practice and student learning, 2) its strong theoretical and research foundations, 3) the stability and usefulness of the design principles as implemented in changing contexts over a 10-year period, and 4) the coherence and interconnectedness of the principles. The STeLLA VbPD design principles contribute to the field by empirically supporting and advancing the existing consensus model of effective PD. Further study can build on this effort to strengthen our understanding of effective PD based on evidence of impact on teaching practice and student learning.


AERA Open | 2018

Investigating Science Education Effect Sizes: Implications for Power Analyses and Programmatic Decisions

Joseph A. Taylor; Susan Kowalski; Joshua R. Polanin; Karen Askinas; Molly Stuhlsatz; Christopher D. Wilson; Elizabeth Tipton; Sandra Jo Wilson

A priori power analyses allow researchers to estimate the number of participants needed to detect the effects of an intervention. However, power analyses are only as valid as the parameter estimates used. One such parameter, the expected effect size, can vary greatly depending on several study characteristics, including the nature of the intervention, developer of the outcome measure, and age of the participants. Researchers should understand this variation when designing studies. Our meta-analysis examines the relationship between science education intervention effect sizes and a host of study characteristics, allowing primary researchers to access better estimates of effect sizes for a priori power analyses. The results of this meta-analysis also support programmatic decisions by setting realistic expectations about the typical magnitude of impacts for science education interventions.
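To illustrate how such effect-size estimates feed into an a priori power analysis, here is a minimal sketch using the standard normal-approximation minimum detectable effect size for a balanced two-arm cluster randomized trial; the parameter values are hypothetical, not drawn from the meta-analysis:

```python
from statistics import NormalDist
from math import sqrt

def crt_mdes(j_per_arm, n, icc, alpha=0.05, power=0.80):
    """Minimum detectable effect size (in SD units) for a balanced
    two-arm cluster randomized trial: j_per_arm clusters per arm,
    n students per cluster, intraclass correlation icc.
    Normal approximation; ignores covariate adjustment."""
    z = NormalDist()
    multiplier = z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)
    se = sqrt(2 * (icc + (1 - icc) / n) / j_per_arm)
    return multiplier * se

# With 20 schools per arm, 25 students each, and ICC = 0.15,
# only effects of roughly 0.38 SD or larger are detectable.
```

If the expected effect size plugged into such a calculation is too optimistic, the resulting study is underpowered, which is exactly the risk the meta-analysis aims to reduce.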


Archive | 2006

The BSCS 5E Instructional Model: Origins and Effectiveness

Rodger W. Bybee; Joseph A. Taylor; April L. Gardner; Pamela Van Scotter


Journal of Research in Science Teaching | 2009

The relative effects and equity of inquiry‐based and commonplace science teaching on students' knowledge, reasoning, and argumentation

Christopher D. Wilson; Joseph A. Taylor; Susan Kowalski; Janet Carlson


Educational Leadership | 2006

Task, Text, and Talk: Literacy for All Subjects

Stephanie M. McConachie; Megan W. Hall; Lauren B. Resnick; Anita K. Ravi; Victoria Bill; Jody Bintz; Joseph A. Taylor


Journal of Research in Science Teaching | 2003

Secondary school physics teachers' conceptions of scientific evidence: An exploratory case study

Joseph A. Taylor; Thomas M. Dana


Science Educator | 2007

Bridging Research on Learning and Student Achievement: The Role of Instructional Materials

Joseph A. Taylor; Pamela Van Scotter; Doug Coulson

Collaboration


Top co-authors of Joseph A. Taylor:

Christopher D. Wilson (Biological Sciences Curriculum Study)
Susan Kowalski (Biological Sciences Curriculum Study)
Molly Stuhlsatz (Biological Sciences Curriculum Study)
Thomas M. Dana (Pennsylvania State University)
Jessaca Spybrook (Western Michigan University)
Jody Bintz (Biological Sciences Curriculum Study)
Pamela Van Scotter (Biological Sciences Curriculum Study)