
Publication


Featured research published by Daisy Wise Rutstein.


Archive | 2017

Evidence-Centered Assessment Design

Robert J. Mislevy; Geneva Haertel; Michelle Riconscente; Daisy Wise Rutstein; Cindy Ziker

Design patterns are tools to support task authoring under an evidence-centered approach to assessment design (ECD). This chapter reviews the basic concepts of ECD, focusing on evidentiary arguments. It defines the attributes of design patterns, and shows the roles they play in creating tasks around valid assessment arguments.


Educational Research and Evaluation | 2013

A “conditional” sense of fairness in assessment

Robert J. Mislevy; Geneva Haertel; Britte H. Cheng; Liliana Ructtinger; Angela Haydel DeBarger; Elizabeth Murray; David Rose; Jenna W. Gravel; Alexis M. Colker; Daisy Wise Rutstein; Terry Vendlinski

Standardizing aspects of assessments has long been recognized as a tactic to help make evaluations of examinees fair. It reduces variation in irrelevant aspects of testing procedures that could advantage some examinees and disadvantage others. However, recent attention to making assessment accessible to a more diverse population of students highlights situations in which making tests identical for all examinees can make a testing procedure less fair: Equivalent surface conditions may not provide equivalent evidence about examinees. Although testing accommodations are by now standard practice in most large-scale testing programmes, for the most part these practices lie outside formal educational measurement theory. This article builds on recent research in universal design for learning (UDL), assessment design, and psychometrics to lay out the rationale for inference that is conditional on matching examinees with principled variations of an assessment so as to reduce construct-irrelevant demands. The present focus is assessment for special populations, but it is argued that the principles apply more broadly.


Educational and Psychological Measurement | 2009

The Potential for Differential Findings among Invariance Testing Strategies for Multisample Measured Variable Path Models

Daisy Wise Rutstein; Gregory R. Hancock

Multisample measured variable path analysis is used to test whether causal/structural relations among measured variables differ across populations. Several invariance testing approaches are available for assessing cross-group equality of such relations, but the associated test statistics may vary considerably across methods. This study is a population analysis, examining five different strategies for invariance testing using an illustrative measured variable path model. The results demonstrate how inferences about parameters across populations can depend greatly upon the invariance testing approach used, thereby potentially leading to improper inference regarding the true invariance status of relevant parameters.
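One common invariance testing strategy of the kind the abstract compares is the chi-square difference (likelihood ratio) test between an unconstrained model and one that constrains paths to be equal across groups. The sketch below illustrates that test with invented fit statistics (the values are not from the study), using only the Python standard library:

```python
import math

def chi2_sf(x, df):
    """Survival function of the chi-square distribution, P[X > x],
    computed from the regularized lower incomplete gamma series."""
    s, half_x = df / 2.0, x / 2.0
    term = 1.0 / s
    total = term
    n = 1
    while True:
        term *= half_x / (s + n)
        total += term
        if term < 1e-12 * total:
            break
        n += 1
    p_lower = total * math.exp(-half_x + s * math.log(half_x) - math.lgamma(s))
    return 1.0 - p_lower

# Hypothetical fit statistics for a two-group path model: the "free" model
# lets path coefficients differ across groups; the "constrained" model forces
# them to be equal (the invariance hypothesis).
chisq_free, df_free = 12.4, 8
chisq_constrained, df_constrained = 21.9, 11

delta_chisq = chisq_constrained - chisq_free   # 9.5
delta_df = df_constrained - df_free            # 3
p_value = chi2_sf(delta_chisq, delta_df)

print(f"delta chi2({delta_df}) = {delta_chisq:.1f}, p = {p_value:.3f}")
```

A significant difference would lead the analyst to reject equality of the constrained paths; the study's point is that different choices of which constraints to test, and in what order, can lead to different conclusions from the same data.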


Archive | 2012

A Bayesian Network Approach To Modeling Learning Progressions

Patti West; Daisy Wise Rutstein; Robert J. Mislevy; Junhui Liu; Roy Levy; Kristen E. DiCerbo; A.V. Crawford; Younyoung Choi; Kristina Chapple; John T. Behrens

A central challenge in using learning progressions (LPs) in practice is modeling the relationships that link student performance on assessment tasks to students’ levels on the LP. On the one hand, there is a progression of theoretically defined levels, each defined by a configuration of knowledge, skills, and/or abilities (KSAs). On the other hand, there are observed performances on assessment tasks, associated with levels but only imperfectly and subject to inconsistencies. What is needed is a methodology that can be used to map assessment performance onto the levels, to combine information across multiple tasks measuring similar and related KSAs, to support inferences about students, and to study how well actual data exhibit the relationships posited by the LP.
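The mapping the abstract calls for can be sketched as a toy Bayesian network: a latent learning-progression level with a prior distribution, task performances conditionally independent given the level, and Bayes' rule combining evidence across tasks. All numbers below are invented for illustration, not drawn from the chapter:

```python
# Toy Bayes net: one latent LP level -> two conditionally independent tasks.
# Prior over learning-progression levels (illustrative values).
prior = {"low": 0.3, "mid": 0.5, "high": 0.2}

# P(correct on task | LP level): tasks measure the level imperfectly.
p_correct = {
    "task1": {"low": 0.2, "mid": 0.6, "high": 0.9},
    "task2": {"low": 0.1, "mid": 0.5, "high": 0.8},
}

def posterior(observed):
    """Posterior over LP levels given {task: correct?} observations."""
    scores = {}
    for level, p in prior.items():
        likelihood = 1.0
        for task, correct in observed.items():
            pc = p_correct[task][level]
            likelihood *= pc if correct else 1.0 - pc
        scores[level] = p * likelihood
    z = sum(scores.values())
    return {level: s / z for level, s in scores.items()}

# Two correct responses shift belief away from the "low" level.
print(posterior({"task1": True, "task2": True}))
```

Because the tasks are modeled as conditionally independent given the level, evidence from any number of tasks combines by simple multiplication, which is what lets the approach pool information across tasks measuring related KSAs and check how well observed response patterns match the progression's posited structure.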


Archive | 2017

Design Patterns for Model-Based Reasoning

Robert J. Mislevy; Geneva Haertel; Michelle Riconscente; Daisy Wise Rutstein; Cindy Ziker

The aspects of model-based reasoning serve as the Focal knowledge, skills and abilities (KSAs) of the design patterns. They highlight distinct aspects of model-based reasoning in a way that supports either focused tasks (building on one or a few design patterns) or more extensive investigations (building jointly on several design patterns). This chapter overviews the design-pattern perspective on assessing model-based reasoning, as a prelude to the next chapters that look more closely at each aspect. A table charts the correspondence between the aspects addressed in the design patterns and practices in the Next Generation Science Standards.


Archive | 2017

Model-Based Reasoning

Robert J. Mislevy; Geneva Haertel; Michelle Riconscente; Daisy Wise Rutstein; Cindy Ziker

Model-based reasoning consists of cycles of proposing, instantiating, checking, revising to find an apt model for a given purpose in a given situation, and reasoning about the situation through the model. Results from cognitive research can help us understand and assess both the experiential and reflective aspects of model-based reasoning. This chapter reviews research on model-based reasoning and the inquiry cycle to define aspects of model-based reasoning that can be used to guide assessment design.


Archive | 2017

Model-Based Inquiry

Robert J. Mislevy; Geneva Haertel; Michelle Riconscente; Daisy Wise Rutstein; Cindy Ziker

Model-based inquiry highlights the metacognitive aspects of managing and moving effectively through cycles of inquiry. The Focal KSAs in this design pattern are students’ capabilities to manage their reasoning across inquiry cycles. A key Variable Task Feature to consider is the degree of scaffolding to provide students as they move from one aspect of an inquiry to another. All the considerations, design choices, work products, and observations addressed in the preceding design patterns can be involved in a model-based inquiry task.


The Journal of Technology, Learning and Assessment | 2010

On the Roles of External Knowledge Representations in Assessment Design

Robert J. Mislevy; John T. Behrens; Randy E. Bennett; Sarah F. Demark; Dennis C. Frezzo; Roy Levy; Daniel H. Robinson; Daisy Wise Rutstein; Valerie J. Shute; Ken Stanley; Fielding I. Winters


Learning Progressions in Science Conference | 2009

A Bayes Net Approach to Modeling Learning Progressions and Task Performances

Patti West; Daisy Wise Rutstein; Robert J. Mislevy; Junhui Liu; Roy Levy; Kristen E. DiCerbo; A.V. Crawford; Younyoung Choi; John T. Behrens


Technical Symposium on Computer Science Education | 2016

What Is A Computer: What do Secondary School Students Think?

Shuchi Grover; Daisy Wise Rutstein; Eric Snow

Collaboration


Dive into Daisy Wise Rutstein's collaborations.

Top Co-Authors

Roy Levy

Arizona State University


A.V. Crawford

Arizona State University
