Karen Peterman
Durham University
Publications
Featured research published by Karen Peterman.
Visitor Studies | 2015
Karen Peterman; Denise Young
This article documents the development and use of a mystery shopper protocol for observing scientists’ interactions with the public within the context of science festival expo booths. A team of field researchers was trained to act as members of the general public and approach scientists participating in a science festival. After visiting each booth, the mystery shoppers documented their experiences; 192 booths were observed and data were compared with those from 186 intercept surveys collected from members of the public attending the expo. Descriptive statistics document the range of booth logistics and scientist interactions captured, and inferential statistics explore the relation between protocol items and event ratings. The results indicate that mystery shopping is an effective way to document scientists’ interactions with the public, providing a unique perspective that serves as a measure of quality control when compared with best practices. The utility and implications of mystery shopper data are discussed in relation to evaluation of and research on public science events in particular and informal learning institutions overall.
International Journal of Environmental and Science Education | 2018
Jane Robertson Evia; Karen Peterman; Emily Cloyd; John C. Besley
Self-efficacy, or the beliefs people hold about their ability to succeed in certain pursuits, is a long-established construct. Self-efficacy for science communication distinguishes scientists who engage with the public and relates to scientists’ attitudes about the public. As such, self-efficacy for public engagement has the potential to serve as a key indicator in the evaluation of scientist training and public outreach programs. To date, most evaluation scales have been designed for public audiences, rather than scientists. This study used think-aloud methods and Item Response Theory to develop a scale to measure scientists’ Self-Efficacy for Public Engagement with Science. The results from this study support the use of a 13-item self-efficacy scale, and provide initial validation evidence to support its use with scientists who engage with the public. The findings are presented in relation to the continued study of public engagement through both research and evaluation.
International Journal of Science Education | 2017
Karen Peterman; Jenny Daugherty; Rodney L. Custer; Julia M. Ross
Science teachers are being called on to incorporate engineering practices into their classrooms. This study explores whether the Engineering-Infused Lesson Rubric, a new rubric designed to target best practices in engineering education, could be used to evaluate the extent to which engineering is infused into online science lessons. Eighty lessons were selected at random from three online repositories and coded with the rubric. Overall results documented the strengths of existing lessons, as well as many components that teachers might strengthen. In addition, a subset of characteristics was found to distinguish lessons with the highest level of engineering infusion. Findings are discussed in relation to the potential of the rubric to help teachers adopt evidence-informed practice generally, and in relation to the new content demands of the U.S. Next Generation Science Standards in particular.
International Journal of Science Education | 2015
Karen Peterman; Kayla A. Cranston; Marie Pryor; Ruth Kermish-Allen
This case study was conducted within the context of a place-based education project that was implemented with primary school students in the USA. The authors and participating teachers created a performance assessment of standards-aligned tasks to examine 6–10-year-old students’ graph interpretation skills as part of an exploratory research project. Fifty-five students participated in a performance assessment interview at the beginning and end of a place-based investigation. Two forms of the assessment were created and counterbalanced within classes at pre- and post-assessment. In situ scoring was conducted such that responses were scored as correct versus incorrect during the assessment’s administration. Criterion validity analysis demonstrated an age-level progression in student scores. Tests of discriminant validity showed that the instrument detected variability in interpretation skills across each of three graph types (line, bar, dot plot). Convergent validity was established by correlating in situ scores with those from the Graph Interpretation Scoring Rubric. Students’ proficiency with interpreting different types of graphs matched expectations based on age and the standards-based progression of graphs across primary school grades. The assessment tasks were also effective at detecting pre–post gains in students’ interpretation of line graphs and dot plots after the place-based project. The results of the case study are discussed in relation to the common challenges associated with performance assessment. Implications are presented in relation to the need for authentic and performance-based instructional and assessment tasks to respond to the Common Core State Standards and the Next Generation Science Standards.
CBE-Life Sciences Education | 2018
Karen Peterman; Kelley Withy; Rachel Boulay
A common challenge in the evaluation of K–12 science education is identifying valid scales that are an appropriate fit for both a student’s age and the educational outcomes of interest. Though many new scales have been validated in recent years, there is much to learn about the appropriate educational contexts and audiences for these measures. This study investigated two such scales, the DEVISE Self-Efficacy for Science scale and the Career Interest Questionnaire (CIQ), within the context of two related health sciences projects. Consistent patterns were found in the reliability of each scale across three age groups (middle school, high school, early college) and within the context of each project. As expected, self-efficacy and career interest, as measured through these scales, were found to be correlated. The pattern of results for CIQ scores was also similar to that reported in other literature. This study provides examples of how practitioners can validate established measures for new and specific contexts and provides some evidence to support the use of the scales studied in health science education contexts.
Science Communication | 2017
Karen Peterman; Jane Robertson Evia; Emily Cloyd; John C. Besley
This study presents initial work to validate a scale designed to measure scientists’ outcome expectations in relation to public engagement. A 20-item survey was administered to a sample of 341 scientists. Graded response models were used to assess the quality of the items. Results suggest that six items provided the strongest measure of outcome expectations, with classically adequate reliability across a wide range of scientists and scores. The findings are presented in relation to the short-term outcomes of public engagement for scientists and the need for validated scales that allow for the continued study of science communication efforts.
Journal of Science Communication | 2016
Christine Bevc; Denise Young; Karen Peterman
This study applies social network analysis to explore the role that one science festival has played in building the state’s STEM learning ecosystem. It examines the breadth and extent of collaboration among STEM educators and their partners, reviewing past and present partnerships across 449 events during the 2015 festival. Three case studies provide in-depth illustrations of partnerships. These findings represent an important step towards (a) mapping a STEM learning ecosystem, and (b) trying to understand how a festival affects the ecosystem itself. Together, study results demonstrate how the festival has served to stimulate and foster STEM partnerships.
Journal of Science Education and Technology | 2016
Karen Peterman; Ruth Kermish-Allen; Gerald Knezek; Rhonda Christensen; Tandra Tyler-Wood
Archive | 2018
Karen Peterman; Elana Kimbrell; Emily Cloyd; Jane Robertson Evia; John C. Besley
Journal of STEM Education: Innovations and Research | 2018
Julia Myers Ross; Karen Peterman; Jenny Daugherty; Rodney L. Custer