Publication


Featured research published by Usama S. Ali.


British Journal of Mathematical and Statistical Psychology | 2017

A comparison of item response models for accuracy and speed of item responses with applications to adaptive testing

Peter W. van Rijn; Usama S. Ali

We compare three modelling frameworks for accuracy and speed of item responses in the context of adaptive testing. The first framework is based on modelling scores that result from a scoring rule that incorporates both accuracy and speed. The second framework is the hierarchical modelling approach developed by van der Linden (2007, Psychometrika, 72, 287) in which a regular item response model is specified for accuracy and a log-normal model for speed. The third framework is the diffusion framework in which the response is assumed to be the result of a Wiener process. Although the three frameworks differ in the relation between accuracy and speed, one commonality is that the marginal model for accuracy can be simplified to the two-parameter logistic model. We discuss both conditional and marginal estimation of model parameters. Models from all three frameworks were fitted to data from a mathematics and spelling test. Furthermore, we applied both a linear and an adaptive testing mode to the data off-line in order to determine differences between modelling frameworks. It was found that a model from the scoring rule framework outperformed a hierarchical model in terms of model-based reliability, but the results were mixed with respect to correlations with external measures.
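For reference, and not part of the abstract above: the second (hierarchical) framework of van der Linden (2007) pairs a regular item response model for accuracy with a log-normal model for response time. A minimal sketch, assuming the two-parameter logistic form mentioned in the abstract and standard notation (the symbols are an assumption, not taken from the paper itself):

\[
P(U_{ij} = 1 \mid \theta_i) = \frac{\exp\{a_j(\theta_i - b_j)\}}{1 + \exp\{a_j(\theta_i - b_j)\}},
\qquad
\ln T_{ij} \sim N\!\big(\beta_j - \tau_i,\ \alpha_j^{-2}\big),
\]

where \(\theta_i\) and \(\tau_i\) are the person's ability and speed, \(a_j\) and \(b_j\) are the item discrimination and difficulty, and \(\beta_j\) and \(\alpha_j\) are the item's time intensity and time discrimination. A second, hierarchical level then relates the person parameters \((\theta_i, \tau_i)\) and the item parameters across the two measurement models.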


Applied Psychological Measurement | 2016

An Evaluation of Different Statistical Targets for Assembling Parallel Forms in Item Response Theory

Usama S. Ali; Peter W. van Rijn

Assembly of parallel forms is an important step in the test development process. Therefore, choosing a suitable theoretical framework to generate well-defined test specifications is critical. The performance of different statistical targets of test specifications using the test characteristic curve (TCC) and the test information function (TIF) was investigated. Test length, the number of test forms, and content specifications were considered as well. The TCC target results in forms that are parallel in difficulty, but not necessarily in terms of precision. Conversely, test forms created using a TIF target are parallel in terms of precision, but not necessarily in terms of difficulty. Because the focus is sometimes on either the TIF or the TCC, differences in either difficulty or precision can arise. Differences in difficulty can be mitigated by equating, but differences in precision cannot. In a series of simulations using a real item bank, the two-parameter logistic model, and mixed integer linear programming for automated test assembly, these differences were found to be quite substantial. When both TIF and TCC are combined into one target with their relative importance manipulated, these differences can be made to disappear.
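For context, and not drawn from the abstract itself, the two statistical targets can be written out for the two-parameter logistic model. A minimal sketch in standard notation (an assumption, not the paper's own):

\[
\mathrm{TCC}(\theta) = \sum_{j=1}^{n} P_j(\theta),
\qquad
\mathrm{TIF}(\theta) = \sum_{j=1}^{n} a_j^{2}\, P_j(\theta)\,\big\{1 - P_j(\theta)\big\},
\qquad
P_j(\theta) = \frac{\exp\{a_j(\theta - b_j)\}}{1 + \exp\{a_j(\theta - b_j)\}}.
\]

Because each \(P_j(\theta)\) is a known constant once a set of evaluation points \(\theta\) is fixed, constraining a form's TCC and/or TIF to lie near the target at those points is linear in the binary item-selection variables, which is one way such assembly problems can be posed as the mixed integer linear programs mentioned in the abstract.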


ETS Research Report Series | 2018

SARM: A Computer Program for Estimating Speed-Accuracy Response Models for Dichotomous Items

Peter W. van Rijn; Usama S. Ali


Journal of Educational Measurement | 2017

Evaluating Statistical Targets for Assembling Parallel Mixed-Format Test Forms

Dries Debeer; Usama S. Ali; Peter W. van Rijn


ETS Research Report Series | 2015

Location Indices for Ordinal Polytomous Items Based on Item Response Theory (ETS RR-15-20)

Usama S. Ali; Hua Hua Chang; Carolyn J. Anderson


ETS Research Report Series | 2014

Enhancing the Equating of Item Difficulty Metrics: Estimation of Reference Distribution (ETS RR-14-07)

Usama S. Ali; Michael E. Walker


ETS Research Report Series | 2014

An Item-Driven Adaptive Design for Calibrating Pretest Items (ETS RR-14-38)

Usama S. Ali; Hua Hua Chang

