ERIC Number: EJ1195164
Record Type: Journal
Publication Date: 2018
Pages: 17
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-1536-6367
EISSN: N/A
Attribute-Level Item Selection Method for DCM-CAT
Bao, Yu; Bradshaw, Laine
Measurement: Interdisciplinary Research and Perspectives, v16 n4 p209-225 2018
Diagnostic classification models (DCMs) can provide multidimensional diagnostic feedback about students' mastery of knowledge components, or attributes. One advantage of DCMs is that they can accurately and reliably classify students into mastery levels with relatively few items per attribute. Combining DCMs with computerized adaptive testing can further shorten a test by strategically administering different items to different examinees. Existing item selection methods choose the next item to increase classification accuracy for the overall attribute profile and have been explored with item pools that carry equal information for every attribute on the assessment. In practice, item pools for diagnostic assessments are usually not balanced across attributes. We propose a new attribute-level item selection method based on the Cognitive Diagnostic Index at the Attribute Level (CDI_A; Henson et al., 2008) that balances classification accuracies among attributes when the item pool is unbalanced. We conducted simulation studies comparing the CDI_A method with other leading item selection methods; two studies were theoretically based, and the third was empirically based. Results showed that the new method increases classification accuracy and reliability for attributes with weaker items in the pool by administering more items that measure those attributes, while retaining reasonable accuracy, with fewer items, for attributes with stronger items in the pool. Thus, the CDI_A method provides a trade-off that maintains an acceptable level of estimation accuracy for all attributes.
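The selection idea described in the abstract (favor the attribute that is currently measured least precisely, then pick the item that discriminates best on it) can be sketched as follows. This is an illustrative simplification, not the authors' actual algorithm: the per-item, per-attribute discrimination values stand in for CDI_A values, and the "accumulated information" bookkeeping is a hypothetical proxy for attribute-level classification precision.

```python
def select_next_item(pool, administered, attr_info):
    """Pick the next item for a DCM-CAT session (illustrative sketch only).

    pool: dict item_id -> {attribute: discrimination value}
          (hypothetical stand-in for per-item CDI_A values)
    administered: set of item_ids already given to this examinee
    attr_info: dict attribute -> information accumulated so far

    Strategy: target the attribute with the least accumulated
    information, then choose the remaining item with the highest
    discrimination on that attribute.
    """
    # Attribute currently measured least precisely.
    target = min(attr_info, key=attr_info.get)
    candidates = {i: d for i, d in pool.items() if i not in administered}
    if not candidates:
        return None  # pool exhausted
    # Item that discriminates best on the target attribute.
    return max(candidates, key=lambda i: candidates[i].get(target, 0.0))

# Toy pool in which attribute "a2" has systematically weaker items.
pool = {
    1: {"a1": 0.9, "a2": 0.1},
    2: {"a1": 0.8, "a2": 0.2},
    3: {"a1": 0.2, "a2": 0.4},
}
attr_info = {"a1": 1.5, "a2": 0.3}   # "a2" is under-measured so far
print(select_next_item(pool, set(), attr_info))  # selects item 3, best for "a2"
```

Because the least-informed attribute is re-evaluated at every step, weaker attributes naturally receive more items, mirroring the trade-off the study reports.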
Descriptors: Test Items, Selection, Adaptive Testing, Computer Assisted Testing, Classification, Test Length, Accuracy, Item Banks, Methods, Reliability
Routledge. Available from: Taylor & Francis, Ltd. 530 Walnut Street Suite 850, Philadelphia, PA 19106. Tel: 800-354-1420; Tel: 215-625-8900; Fax: 215-207-0050; Web site: http://www.tandf.co.uk/journals
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A