Louis V. DiBello
University of Illinois at Chicago
Publications
Featured research published by Louis V. DiBello.
Handbook of Statistics | 2006
Louis V. DiBello; Louis Roussos; William Stout
This paper is divided into two main sections. The first half focuses on the intent and practice of diagnostic assessment, providing a general organizing scheme for a diagnostic assessment implementation process, from design to scoring. The discussion includes concrete examples throughout, as well as summaries of data studies where appropriate. The second half focuses on one critical component of the implementation process: the specification of an appropriate psychometric model. It presents a general form for the models as an interaction of knowledge structure with item structure, reviews a variety of selected models, gives separate detailed summaries of knowledge structure modeling and item structure modeling, and closes with summary remarks. To keep the scope manageable, the paper is restricted to models for dichotomously scored items. Throughout, practical advice is given on how to apply and implement the ideas and principles discussed.
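For orientation, one widely studied dichotomous model of this kind (a minimal illustration, not necessarily the general form developed in the chapter) is the DINA model, in which the knowledge structure enters through the examinee's skill profile and the item structure through the Q-matrix and slip/guess parameters:

P(X_{ij} = 1 \mid \alpha_j) = (1 - s_i)^{\eta_{ij}} \, g_i^{1 - \eta_{ij}}, \qquad \eta_{ij} = \prod_{k} \alpha_{jk}^{q_{ik}},

where \alpha_{jk} indicates examinee j's mastery of skill k, q_{ik} indicates whether item i requires skill k, and s_i and g_i are the item's slip and guess probabilities.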
Archive | 2007
Louis Roussos; Louis V. DiBello; William Stout; Sarah M. Hartz; Robert A. Henson; Jonathan H. Templin
There is a long history of calls for combining cognitive science and psychometrics (Cronbach, 1975; Snow & Lohman, 1989). The U.S. standards movement, begun more than 20 years ago (McKnight et al., 1987; National Council of Teachers of Mathematics, 1989), sought to articulate public standards for learning that would define and promote successful performance by all students; establish a common base for curriculum development and instructional practice; and provide a foundation for measuring progress for students, teachers, and programs. The standards movement provided the first widespread call for assessment systems that directly support learning. For success, such systems must satisfy a number of conditions having to do with cognitive-science-based design, psychometrics, and implementation. This chapter focuses on the psychometric aspects of one particular system that builds on a carefully designed test and a user-selected set of relevant skills measured by that test to assess student mastery of each of the chosen skills. This type of test-based, skills-level assessment is called skills diagnosis. The system the chapter describes in detail is called the Fusion Model system. The chapter concentrates on the statistical and psychometric aspects of the Fusion Model system, with skills diagnosis researchers and practitioners in mind who may be interested in working with the system. We view these statistical and psychometric aspects as situated within a comprehensive framework for diagnostic assessment test design and implementation.
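As a pointer to the core psychometric machinery, the Fusion Model is built on the reparameterized unified model (RUM), whose item response function is typically written in the literature as follows (a sketch for orientation, not a substitute for the chapter's full development):

P(X_{ij} = 1 \mid \alpha_j, \theta_j) = \pi_i^{*} \prod_{k=1}^{K} (r_{ik}^{*})^{q_{ik}(1 - \alpha_{jk})} \, P_{c_i}(\theta_j),

where \pi_i^{*} is the probability that an examinee who has mastered every skill the Q-matrix assigns to item i applies them correctly, r_{ik}^{*} < 1 is the penalty for lacking skill k when q_{ik} = 1, and P_{c_i}(\theta_j) is a Rasch-type term in a residual ability \theta_j with item parameter c_i that absorbs skills not coded in the Q-matrix.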
Educational Psychologist | 2016
James W. Pellegrino; Louis V. DiBello; Susan R. Goldman
Assessments that function close to classroom teaching and learning can play a powerful role in fostering academic achievement. Unfortunately, relatively little attention has been given to the design and validation of such assessments. This article presents a framework for conceptualizing and organizing the multiple components of validity applicable to assessments intended for use in the classroom to support ongoing processes of teaching and learning. The conceptual framework builds on existing validity concepts and focuses attention on three components: cognitive validity, instructional validity, and inferential validity. The goal in presenting the framework is to clarify the concept of validity, including key components of the interpretive argument, while considering the types and forms of evidence needed to construct a validity argument for classroom assessments. The framework's utility is illustrated through an application to the analysis of the validity of assessments embedded within an elementary mathematics curriculum.
Archive | 2003
Louis V. DiBello; William Stout
This paper is adapted from an invited address delivered by Professor Stout and a symposium presentation delivered by Dr. DiBello at the 2001 International Meeting of the Psychometric Society, Osaka, Japan. This first IMPS meeting to be held in Japan was an auspicious occasion for bringing together statisticians and psychometricians from Japan with their North American and European colleagues. It provided an important forum for discussing new opportunities for assessment in the twenty-first century that result from a fortuitous conjunction of heightened public attention to school effectiveness and new psychometric methods that allow the practical operationalization of more complex cognitive models. In this paper we recall the term formative assessment as it is used in education and define a class of scoring procedures called student profile scoring. We describe the formative aspects of the mostly summative US No Child Left Behind legislation. We outline the simple cognitive modeling that is reflected in the reparameterized unified model. We close with a call to psychometricians for a paradigm shift that moves the testing industry beyond an almost exclusive focus on low-dimensional, data-reductionist methods to include student profile scoring based on richer, substantively grounded models.
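As a concrete illustration of student profile scoring, the short Python sketch below computes one examinee's posterior distribution over binary skill-mastery profiles from dichotomous item scores. It uses a conjunctive (DINA-type) response model purely for illustration, not the reparameterized unified model itself, and all function names and parameter values are illustrative assumptions.

import itertools
import numpy as np

def profile_posterior(responses, Q, slip, guess, prior=None):
    """Posterior over all 2^K binary skill-mastery profiles for one examinee.

    responses : (I,) array of 0/1 item scores
    Q         : (I, K) Q-matrix of item-skill requirements (0/1)
    slip      : (I,) item slip probabilities
    guess     : (I,) item guess probabilities
    prior     : (2**K,) prior over profiles; uniform if None
    """
    K = Q.shape[1]
    profiles = np.array(list(itertools.product([0, 1], repeat=K)))   # (2^K, K)
    if prior is None:
        prior = np.full(len(profiles), 1.0 / len(profiles))
    # eta[m, i] = 1 if profile m includes every skill that item i requires
    eta = np.all(profiles[:, None, :] >= Q[None, :, :], axis=2).astype(float)
    p_correct = (1.0 - slip) * eta + guess * (1.0 - eta)              # (2^K, I)
    likelihood = np.prod(np.where(responses == 1, p_correct, 1.0 - p_correct), axis=1)
    posterior = prior * likelihood
    return profiles, posterior / posterior.sum()

# Tiny made-up run: 3 items, 2 skills.
Q = np.array([[1, 0], [0, 1], [1, 1]])
responses = np.array([1, 0, 1])
slip = np.full(3, 0.1)
guess = np.full(3, 0.2)
profiles, post = profile_posterior(responses, Q, slip, guess)
print(dict(zip(map(tuple, profiles), np.round(post, 3))))

A per-skill profile score can then be read off as the marginal posterior probability of mastery, obtained by summing the posterior over all profiles in which that skill is mastered.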
Journal of Educational Measurement | 2007
Russell G. Almond; Louis V. DiBello; Brad Moulder; Juan-Diego Zapata-Rivera
Journal of Educational Measurement | 2007
Louis V. DiBello; William Stout
Archive | 2010
Louis Roussos; Louis V. DiBello; Robert A. Henson; Eunice Jang; Jonathan Templin
2010 Annual Conference & Exposition | 2010
Aidsa Santiago Roman; Ruth A. Streveler; Paul S. Steif; Louis V. DiBello
Journal of Engineering Education | 2015
Natalie Jorion; Brian Douglas Gane; Katie James; Lianne Schroeder; Louis V. DiBello; James W. Pellegrino
Archive | 2014
James W. Pellegrino; Louis V. DiBello; Sean P. Brophy