
Publication


Featured research published by Linda S. Steinberg.


Measurement: Interdisciplinary Research & Perspective | 2003

Focus Article: On the Structure of Educational Assessments

Robert J. Mislevy; Linda S. Steinberg; Russell G. Almond

In educational assessment, we observe what students say, do, or make in a few particular circumstances and attempt to infer what they know, can do, or have accomplished more generally. A web of inference connects the two. Some connections depend on theories and experience concerning the targeted knowledge in the domain, how it is acquired, and the circumstances under which people bring their knowledge to bear. Other connections may depend on statistical models and probability-based reasoning. Still others concern the elements and processes involved in test construction, administration, scoring, and reporting. This article describes a framework for assessment that makes explicit the interrelations among substantive arguments, assessment designs, and operational processes. The work was motivated by the need to develop assessments that incorporate purposes, technologies, and psychological perspectives that are not well served by familiar forms of assessments. However, the framework is equally applicable to analyzing existing assessments or designing new assessments within familiar forms.


Language Testing | 2002

Design and Analysis in Task-Based Language Assessment.

Robert J. Mislevy; Linda S. Steinberg; Russell G. Almond

In task-based language assessment (TBLA) language use is observed in settings that are more realistic and complex than in discrete skills assessments, and which typically require the integration of topical, social and/or pragmatic knowledge along with knowledge of the formal elements of language. But designing an assessment is not accomplished simply by determining the settings in which performance will be observed. TBLA raises questions of just how to design complex tasks, evaluate students’ performances and draw valid conclusions therefrom. This article examines these challenges from the perspective of ‘evidence-centred assessment design’. The main building blocks are student, evidence and task models, with tasks to be administered in accordance with an assembly model. We describe these models, show how they are linked and assembled to frame an assessment argument and illustrate points with examples from task-based language assessment.
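The building blocks named above can be pictured concretely. What follows is a minimal sketch, not drawn from the article itself, of how the student, evidence, task, and assembly models might be represented; all class names, fields, and example values are hypothetical.

```python
# Illustrative sketch (not from the article) of the ECD building blocks:
# student, evidence, task, and assembly models. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class StudentModel:
    """Variables characterizing the proficiencies we want to draw inferences about."""
    variables: dict[str, float] = field(default_factory=dict)

@dataclass
class TaskModel:
    """A family of tasks described by the features that matter for evidence."""
    name: str
    features: dict[str, str]  # e.g. {"setting": "service encounter", "register": "informal"}

@dataclass
class EvidenceModel:
    """Rules for scoring a work product and updating student-model variables."""
    observables: list[str]     # e.g. ["appropriate_register"]
    updates: dict[str, str]    # observable -> student-model variable it bears on

@dataclass
class AssemblyModel:
    """Constraints governing which tasks are administered, and how many."""
    required_features: dict[str, str]
    max_tasks: int

# Usage: an assessment argument is framed by linking the pieces together.
student = StudentModel({"pragmatic_knowledge": 0.0, "formal_knowledge": 0.0})
task = TaskModel("order-a-meal", {"setting": "restaurant", "register": "informal"})
evidence = EvidenceModel(["appropriate_register"], {"appropriate_register": "pragmatic_knowledge"})
assembly = AssemblyModel({"register": "informal"}, max_tasks=10)
```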


Applied Measurement in Education | 2002

Making Sense of Data From Complex Assessments

Robert J. Mislevy; Linda S. Steinberg; F. Jay Breyer; Russell G. Almond; Lynn Johnson

Advances in cognitive psychology both deepen our understanding of how students gain and use knowledge and broaden the range of performances and situations we want to see to acquire evidence about their developing knowledge. At the same time, advances in technology make it possible to capture more complex performances in assessment settings by including, as examples, simulation, interactivity, and extended responses. The challenge is making sense of the complex data that result. This article concerns an evidence-centered approach to the design and analysis of complex assessments. We present a design framework that incorporates integrated structures for modeling knowledge and skills, designing tasks, and extracting and synthesizing evidence. The ideas are illustrated in the context of a project with the Dental Interactive Simulation Corporation (DISC), assessing problem solving in dental hygiene with computer-based simulations. After reviewing the substantive grounding of this effort, we describe the design rationale, statistical and scoring models, and operational structures for the DISC assessment prototype.
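The kind of probability-based synthesis of evidence this line of work relies on can be illustrated with a toy calculation. This is not the DISC statistical model itself, just a single Bayesian update of belief in a proficiency state from one scored observable; the numbers are invented.

```python
# Toy illustration (not the DISC models) of probability-based evidence synthesis:
# updating belief in a "high proficiency" state from one scored observable.
def update(prior: float, p_obs_given_high: float, p_obs_given_low: float) -> float:
    """Posterior probability that proficiency is high after seeing the observable."""
    numerator = prior * p_obs_given_high
    return numerator / (numerator + (1 - prior) * p_obs_given_low)

# If a strong problem solver produces this observable 80% of the time and a weaker
# one only 30% of the time, one good observation moves a 0.5 prior to about 0.73:
print(update(prior=0.5, p_obs_given_high=0.8, p_obs_given_low=0.3))
```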


Computers in Human Behavior | 1999

A cognitive task analysis with implications for designing simulation-based performance assessment

Robert J. Mislevy; Linda S. Steinberg; F.J. Breyer; Russell G. Almond; L. Johnson

To function effectively as a learning environment, a simulation system must present learners with situations in which they use relevant knowledge, skills, and abilities. To function effectively as an assessment, such a system must additionally be able to evoke and interpret observable evidence about targeted knowledge in a manner that is principled, defensible, and fitting to the purpose at hand (e.g. licensure, achievement testing, coached practice). This article concerns an evidence-centered approach to designing a computer-based performance assessment of problem solving. The application is a prototype licensure test, with supplementary feedback, for prospective use in the field of dental hygiene. We describe a cognitive task analysis designed to: (1) tap the knowledge hygienists use when they assess patients, plan treatments, and monitor progress; and (2) elicit behaviors that manifest this knowledge. After summarizing the results of the analysis, we discuss implications for designing student models, evidentiary structures, task frameworks, and simulation capabilities required for the proposed assessment.


International Journal of Testing | 2004

Design Rationale for a Complex Performance Assessment

David M. Williamson; Malcolm Bauer; Linda S. Steinberg; Robert J. Mislevy; John T. Behrens; Sarah F. Demark

In computer-based interactive environments meant to support learning, students must bring a wide range of relevant knowledge, skills, and abilities to bear jointly as they solve meaningful problems in a learning domain. To function effectively as an assessment, a computer system must additionally be able to evoke and interpret observable evidence about targeted knowledge in a manner that is principled, defensible, and suited to the purpose at hand (e.g., licensure, achievement testing, coached practice). This article describes the foundations for the design of an interactive computer-based assessment of design, implementation, and troubleshooting in the domain of computer networking. The application is a prototype for assessing these skills as part of an instructional program, as interim practice tests and as chapter or end-of-course assessments. An Evidence Centered Design (ECD) framework was used to guide the work. An important part of this work is a cognitive task analysis designed (a) to tap the knowledge computer network specialists and students use when they design and troubleshoot networks and (b) to elicit behaviors that manifest this knowledge. After summarizing its results, we discuss implications of this analysis, as well as information gathered through other methods of domain analysis, for designing psychometric models, automated scoring algorithms, and task frameworks and for the capabilities required for the delivery of this example of a complex computer-based interactive assessment.


Applied Measurement in Education | 2010

The Promises and Challenges of Implementing Evidence-Centered Design in Large-Scale Assessment

Kristen Huff; Linda S. Steinberg; Thomas Matts

The cornerstone of evidence-centered assessment design (ECD) is an evidentiary argument that requires that each target of measurement (e.g., learning goal) for an assessment be expressed as a claim to be made about an examinee that is relevant to the specific purpose and audience(s) for the assessment. The observable evidence required to warrant each claim is also articulated. In turn, the claims and evidence shape the design of assessment opportunities for students to demonstrate what they have learned, whether that opportunity is a classroom activity or a multiple-choice item on a high-stakes assessment. Once identified, the characteristics of these assessment opportunities are referred to as task models, each capable of generating multiple assessment tasks. Taken together, the claims, evidence, and task models constitute the evidentiary argument. The benefits and challenges of implementing ECD in the Advanced Placement Program are addressed.
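The claim, evidence, and task-model chain described above can be sketched in a few lines. This is an illustrative mock-up, not the Advanced Placement implementation; the claim text, evidence statements, and task features are invented.

```python
# Hypothetical sketch of the claim -> evidence -> task-model chain; all content invented.
from dataclasses import dataclass

@dataclass
class Claim:
    text: str            # what we want to say about an examinee
    evidence: list[str]  # observable evidence that warrants the claim

@dataclass
class TaskModel:
    claim: Claim
    item_features: dict[str, str]  # characteristics shared by tasks generated from this model

    def generate_task(self, stimulus: str) -> dict:
        """Each task model can generate many concrete assessment tasks."""
        return {"stimulus": stimulus, **self.item_features, "targets": self.claim.evidence}

claim = Claim(
    text="The student can interpret a rate of change in context.",
    evidence=["identifies the relevant quantities", "states correct units"],
)
model = TaskModel(claim, {"format": "multiple-choice", "stimulus_type": "graph"})
print(model.generate_task("population growth graph"))
```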


Intelligent User Interfaces | 1993

Cognitive task analysis, interface design, and technical troubleshooting

Linda S. Steinberg; Drew H. Gitomer

We propose a model of interface design that makes use of two interdependent levels of cognitive analysis: 1) the study of the criterion task through an analysis of expert/novice differences; and 2) the application of a GOMS analysis to a working interface design. We review this dual analysis in the context of HYDRIVE, a video-disc based intelligent tutoring system designed to facilitate the development of troubleshooting skills for aircraft hydraulics systems. The initial cognitive task analysis enabled the identification of critical troubleshooting skills and troubleshooting procedures. We found, though, that even with an in-depth initial cognitive task analysis, the GOMS interface analysis resulted in significant and beneficial design changes.


Knowledge-Based Systems | 1993

Cognitive task analysis and interface design in a technical troubleshooting domain

Linda S. Steinberg; Drew H. Gitomer

A model of the interface design process is proposed that uses two interdependent levels of cognitive analysis: (a) study of the criterion task through the analysis of expert/novice differences, and (b) evaluation of the working user interface design through the application of practical interface analysis methodology (the GOMS model). This dual analysis is reviewed in the context of HYDRIVE, an intelligent tutoring system that has been designed to facilitate the development of aircraft hydraulics systems troubleshooting skills. Initial cognitive task analyses identified critical troubleshooting skills and procedures. However, even with an initial cognitive task analysis, the GOMS analysis resulted in significant and beneficial design changes.
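As a rough illustration of the kind of GOMS description such an analysis produces (this example is invented, not taken from the HYDRIVE work): a goal is decomposed into alternative methods built from primitive operators, with a selection rule choosing between them.

```python
# Toy GOMS-style description (invented example): a goal, alternative methods built
# from primitive operators, and a selection rule that picks a method from context.
GOMS_MODEL = {
    "goal": "isolate faulty hydraulic component",
    "methods": {
        "video-lookup": ["open video menu", "select component view", "inspect indication"],
        "schematic-trace": ["open schematic", "trace pressure line", "inspect indication"],
    },
    # Selection rule: prefer the schematic when the fault lies upstream of a gauge.
    "selection_rule": lambda context: (
        "schematic-trace" if context.get("fault_upstream_of_gauge") else "video-lookup"
    ),
}

def operators_for(context: dict) -> list[str]:
    """Return the operator sequence the model predicts for a given context."""
    method = GOMS_MODEL["selection_rule"](context)
    return GOMS_MODEL["methods"][method]

print(operators_for({"fault_upstream_of_gauge": True}))
```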


Conference on Artificial Intelligence for Applications | 1993

A generalizable architecture for building intelligent tutoring systems

Randy M. Kaplan; Harriet Trenholm; Drew H. Gitomer; Linda S. Steinberg

Summary form only given. HYDRIVE (HYDRaulics Interactive Video Experience) is an intelligent tutoring system incorporating assessment as part of the tutoring process. HYDRIVE departs from typical intelligent tutoring systems in that its underlying architecture is domain independent. The domain presently supported by HYDRIVE is troubleshooting for the hydraulics-supported systems of an Air Force F-15 fighter aircraft. HYDRIVE addresses both system-specific and system-independent skills. Several novel approaches to intelligent tutoring underlie the rationale for HYDRIVE. The reasoning component of the system makes extensive use of a hierarchical knowledge representation. Reasoning within the system is accomplished using a logic-based approach and is linked to a highly interactive interface using multimedia.


Archive | 2003

A Framework for Reusing Assessment Components

Russell G. Almond; Linda S. Steinberg; Robert J. Mislevy

The purpose of an assessment determines a myriad of details about the delivery, presentation, and scoring of that assessment and consequently the authoring of tasks for that assessment. This paper explores the relationship between design requirements and authoring through the use of the Four Process Framework for Assessment Delivery. This ideal assessment delivery architecture describes an assessment delivery environment in terms of four processes: Activity Selection, which picks the next task or item; Presentation, which presents the item and captures the work product produced by the participant; Response Processing, which examines the participant's response to a particular task and sets the values of observable outcome variables based on that response; and Summary Scoring, which accumulates evidence across many tasks in an assessment. This framework has proved useful in our own work in Evidence Centered Design, and is being adopted as part of the IMS Global Consortium's specification for Question and Test Interoperability.
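The four processes can be pictured as a simple delivery loop. The following is an illustrative mock-up, not the IMS Question and Test Interoperability specification; all function names, the task pool, and the toy scoring rule are hypothetical.

```python
# Illustrative sketch of the Four Process delivery cycle; all names and rules are hypothetical.
from __future__ import annotations

def activity_selection(administered: list[str], pool: list[str]) -> str | None:
    """Pick the next task: here, simply the first task not yet administered."""
    remaining = [t for t in pool if t not in administered]
    return remaining[0] if remaining else None

def presentation(task: str, responses: dict[str, str]) -> str:
    """Present the task and capture the work product (stubbed with canned responses)."""
    return responses[task]

def response_processing(work_product: str) -> dict[str, int]:
    """Examine the work product and set observable outcome variables (toy rule)."""
    return {"correct": int(work_product.strip().lower() == "yes")}

def summary_scoring(score: float, observables: dict[str, int]) -> float:
    """Accumulate evidence across tasks: here, a simple running total."""
    return score + observables["correct"]

def deliver(pool: list[str], responses: dict[str, str]) -> float:
    score, administered = 0.0, []
    while (task := activity_selection(administered, pool)) is not None:
        observables = response_processing(presentation(task, responses))
        score = summary_scoring(score, observables)
        administered.append(task)
    return score

print(deliver(["item1", "item2"], {"item1": "yes", "item2": "no"}))  # 1.0
```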

Collaboration


Dive into Linda S. Steinberg's collaborations.

Top Co-Authors

Howard Wainer

National Board of Medical Examiners
