Peer reviewed
ERIC Number: EJ1101444
Record Type: Journal
Publication Date: 2016
Pages: 15
Abstractor: As Provided
ISBN: N/A
ISSN: 0895-7347
EISSN: N/A
Available Date: N/A
Evaluating the Psychometric Characteristics of Generated Multiple-Choice Test Items
Gierl, Mark J.; Lai, Hollis; Pugh, Debra; Touchie, Claire; Boulais, André-Philippe; De Champlain, André
Applied Measurement in Education, v29 n3 p196-210 2016
Item development is a time- and resource-intensive process. Automatic item generation integrates cognitive modeling with computer technology to systematically generate test items. To date, however, items generated using cognitive modeling procedures have received limited use in operational testing situations. As a result, the psychometric characteristics of generated multiple-choice test items are largely unknown and undocumented. We present item analysis results from one of the first empirical studies designed to evaluate the psychometric properties of generated multiple-choice items, using data from a high-stakes national medical licensure examination. The item analysis results for the correct option revealed that the generated items measured examinees' performance across a broad range of ability levels while providing a consistently strong level of discrimination for each item. Results for the incorrect options revealed that the generated items consistently differentiated the low-performing from the high-performing examinees.
Routledge. Available from: Taylor & Francis, Ltd. 325 Chestnut Street Suite 800, Philadelphia, PA 19106. Tel: 800-354-1420; Fax: 215-625-2940; Web site: http://www.tandf.co.uk/journals
Publication Type: Journal Articles; Reports - Research
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Canada
Grant or Contract Numbers: N/A
Author Affiliations: N/A