
Publication


Featured research published by James C. Impara.


Archive | 1996

Teacher Assessment Literacy: What Do Teachers Know about Assessment?

Barbara S. Plake; James C. Impara

It is estimated that teachers spend up to 50% of their instructional time on assessment-related activities. This chapter discusses teacher assessment literacy: what teachers actually know about assessment. Teachers typically receive little or no formal assessment training in their preparation programs and are often ill-prepared to undertake assessment-related activities. With the introduction of "authentic" assessment strategies, it is even more important for teachers to be skilled in assessment, because they are often directly involved in administering and scoring these assessments. Several studies have attempted to quantify teachers' level of preparation in educational assessment. The chapter reports on a national survey of teacher assessment literacy and presents a detailed analysis of teacher performance on the survey instrument. The results indicate low levels of assessment competency among teachers and suggest that it is time for the education community to recognize that teachers are ill-equipped to undertake one of the most prevalent activities of their instructional programs: student assessment. This is especially salient given the current trend toward performance, portfolio, and other types of "authentic" assessment.


Educational Assessment | 2001

Ability of Panelists to Estimate Item Performance for a Target Group of Candidates: An Issue in Judgmental Standard Setting

Barbara S. Plake; James C. Impara

Recent researchers (Impara & Plake, 1998; National Research Council, 1999; Shepard, 1995) have called into question the ability of judges to make accurate item performance estimates for target subgroups of candidates, such as minimally competent candidates. The purpose of this study was to examine both the reliability and accuracy of item performance estimates from an Angoff (1971) standard setting application. Results provide evidence that item performance estimates were both reasonable and reliable. Factors that might have influenced these results are discussed.
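The "accuracy of item performance estimates" examined above can be illustrated with a small sketch (the function name and data are hypothetical; this is not the study's actual analysis): compare panelists' estimated item p-values for the target group against the empirical p-values observed for that group.

```python
def estimate_accuracy(estimated, empirical):
    """Mean absolute difference between panelists' estimated item p-values
    and the empirical p-values observed for the target group of candidates."""
    assert len(estimated) == len(empirical)
    return sum(abs(e - p) for e, p in zip(estimated, empirical)) / len(estimated)

# Hypothetical data for a 4-item test.
estimated = [0.70, 0.55, 0.80, 0.40]  # panel's mean estimates per item
empirical = [0.65, 0.60, 0.75, 0.50]  # observed p-values for the target group
print(round(estimate_accuracy(estimated, empirical), 3))  # 0.062
```

A smaller mean absolute difference indicates more accurate panelist estimates.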


Educational Assessment | 2000

Making the Cut in School Districts: Alternative Methods for Setting Cutscores

Gerald Giraud; James C. Impara; Chad W. Buckendahl

School districts are under increasing pressure to demonstrate that students are competent in various skills, such as reading and mathematics. Often, demonstrating competence involves comparing performance on assessments to a standard of performance, as embodied in a test score. These scores, called cutscores, separate competent and noncompetent examinees. Because school districts have varied sources of data to inform cutscore decisions, various methods are available for suggesting cutscores. In 2 studies, we examine a selection of methods for arriving at rational and defensible cutscores in school districts. Methods examined are the Angoff (1971) method; the borderline and contrasting groups methods; and 2 new methods, 1 based on course enrollment and 1 based on expert expectations. In Study 1, the Angoff, borderline group, and course enrollment results were consistent, whereas in Study 2, the Angoff and professional judgment methods yielded suggested cutscores that were lower than the borderline group method. Suggestions for further study include the reaction of teachers to the cutscore-setting methods, the effect of different teacher attributes on the results of cutscore-setting methods, and the efficiency of and most effective order for employing the various methods.
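For context, the Angoff (1971) method referenced in these studies asks each judge to estimate, for every item, the probability that a minimally competent examinee would answer it correctly; each judge's probabilities are summed, and the recommended cutscore is the mean of those sums across judges. A minimal sketch, with a hypothetical function name and ratings:

```python
def angoff_cutscore(ratings):
    """ratings[j][i]: judge j's estimated probability that a minimally
    competent examinee answers item i correctly."""
    # Each judge's implied cutscore is the sum of their item probabilities.
    judge_cuts = [sum(items) for items in ratings]
    # The panel's recommended cutscore is the mean across judges.
    return sum(judge_cuts) / len(judge_cuts)

ratings = [
    [0.9, 0.6, 0.7],  # judge 1
    [0.8, 0.5, 0.8],  # judge 2
]
print(round(angoff_cutscore(ratings), 2))  # 2.15 on a 3-item test
```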


Archive | 2017

Closing the Loop: Providing Test Developers with Performance Level Descriptors So Standard Setters Can Do Their Job

Amanda A. Wolkowitz; James C. Impara; Chad W. Buckendahl

Standard setting panels are tasked with recommending one or more performance level standards for assessments used to classify students into ability categories. These assessments are sometimes developed with the performance level descriptors known in advance and other times without them. Based on an analysis of 11 state educational alternative assessments, this chapter investigates how developing a test with versus without these descriptors affects the standard setting process. The results suggest that standard setting panelists were more consistent with one another, and more aligned with empirical data, when the items were developed with the descriptors in mind.




Handbook of Classroom Assessment: Learning, Achievement, and Adjustment | 1996

Chapter 3 – Teacher Assessment Literacy: What Do Teachers Know about Assessment?

Barbara S. Plake; James C. Impara



Journal of Educational Measurement | 1998

Teachers' Ability To Estimate Item Difficulty: A Test of the Assumptions in the Angoff Standard Setting Method.

James C. Impara; Barbara S. Plake


Educational Measurement: Issues and Practice | 2005

Aligning Tests with States' Content Standards: Methods and Issues

Dennison S. Bhola; James C. Impara; Chad W. Buckendahl


Journal of Educational Measurement | 1997

Standard Setting: An Alternative Approach.

James C. Impara; Barbara S. Plake


Journal of Educational Measurement | 2001

The Impact of Omitted Responses on the Accuracy of Ability Estimation in Item Response Theory

R. J. Ayala; Barbara S. Plake; James C. Impara

Collaboration

Top co-authors of James C. Impara:

Barbara S. Plake, University of Nebraska–Lincoln
Chad W. Buckendahl, University of Nebraska–Lincoln
Patrick M. Irwin, University of Nebraska–Lincoln
Barry T. Rosson, University of Nebraska–Lincoln
Jennifer J. Fager, South Dakota State University
Maria T. Potenza, University of Nebraska–Lincoln
R. J. Ayala, University of Nebraska–Lincoln
Ronald K. Hambleton, University of Massachusetts Amherst
Vicki L. Wise, University of Nebraska–Lincoln