Springer Proceedings in Mathematics & Statistics | 2021
Multiple Answer Multiple Choice Items: A Problematic Item Type?
Abstract
Multiple Answer Multiple Choice (MAMC) items have been used primarily in high-stakes assessments but are increasingly appearing in broader contexts. Unfortunately, they are a complex format for examinees, item writers, and scorers alike. This article features two novel approaches to scoring these items while comparing a low-ability and a high-ability examinee group. The first approach uses Latent Class Analysis (LCA) to better understand how examinees respond to this item format under a "credit earned" approach to scoring MAMC items. The second proposes Jaccard's distance as an alternative partial-credit method for assigning credit to students on these items. Both approaches have benefits and drawbacks, which are discussed in the article; however, both exploratory approaches reveal more about examinee knowledge than simple dichotomous scoring does.
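The abstract does not spell out the exact credit-assignment rule; a minimal sketch, assuming partial credit is taken as the Jaccard similarity (one minus Jaccard's distance) between the set of options an examinee selects and the answer key, might look like this:

```python
def jaccard_credit(selected, key):
    """Illustrative partial-credit score for a MAMC item.

    Computes the Jaccard similarity |A ∩ B| / |A ∪ B| between the
    examinee's selected options and the keyed options; Jaccard's
    distance is one minus this value. This is a sketch of the idea,
    not necessarily the article's exact scoring rule.
    """
    selected, key = set(selected), set(key)
    if not selected and not key:
        return 1.0  # vacuously perfect: nothing keyed, nothing chosen
    return len(selected & key) / len(selected | key)


# An examinee who marks A and C when the key is {A, B, C} earns 2/3
# credit, rather than zero under dichotomous scoring.
print(jaccard_credit({"A", "C"}, {"A", "B", "C"}))  # → 0.666...
```

Under this rule, a fully correct response earns 1.0, a fully wrong response earns 0.0, and any overlap with the key earns proportional credit penalized for extraneous selections.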