ERIC Number: ED334210
Record Type: Non-Journal
Publication Date: 1991-Apr
Pages: 27
Abstractor: N/A
ISBN: N/A
ISSN: N/A
EISSN: N/A
An Empirical Comparison of an Expert Systems Approach and an IRT Approach to Computer-Based Adaptive Mastery Testing.
Luk, HingKwan
This study examined whether an expert system approach involving intelligent selection of items (EXSPRT-I) is as efficient as item response theory (IRT)-based three-parameter adaptive mastery testing (AMT) when there are enough subjects to estimate the three IRT item parameters for all items in the test and when subjects in the item parameter estimation (IPE) sample differ from those in the validation sample. EXSPRT is based on expert systems reasoning, Bayesian reasoning, and A. Wald's sequential probability ratio test. The 100-item multiple-choice College Entrance Examination Board Spanish Achievement Test (CEEBSAT) was administered to 1,672 undergraduate students at Indiana University (Bloomington): 1,000 examinees were assigned to an IPE group, and 672 examinees were assigned to a validation group. The CEEBSAT is a paper-and-pencil test consisting of a 40-item listening subtest and a 60-item reading subtest. IPE and test reenactments were conducted separately for the two subtests. The two subject groups were essentially the same with respect to the two subtests. Although EXSPRT-I decision accuracies were not as high as expected, they were reasonably high for many practical purposes. EXSPRT-I required the fewest test items to reach a mastery or non-mastery decision for non-masters, followed by AMT, and then EXSPRT with random selection of items. EXSPRT-I is more accurate and more efficient than AMT in making mastery decisions; however, it is less accurate but more efficient than AMT for non-mastery decisions. EXSPRT is less complicated conceptually and mathematically, and it is more practical since it does not require a large parameter estimation sample. Three data tables and a 13-item list of references are included. Study formulas and algorithms are presented. (RLC)
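The sequential probability ratio test underlying EXSPRT can be sketched as follows. This is an illustrative outline of Wald's SPRT applied to mastery classification, not the study's algorithm: the mastery and non-mastery proportion-correct values (`p_master`, `p_nonmaster`) and the error rates (`alpha`, `beta`) are assumed placeholder values, and item selection (random vs. intelligent, as in EXSPRT vs. EXSPRT-I) is outside this sketch.

```python
import math

def sprt_mastery(responses, p_master=0.85, p_nonmaster=0.60,
                 alpha=0.05, beta=0.05):
    """Classify an examinee with Wald's SPRT.

    responses    -- sequence of booleans (True = item answered correctly)
    p_master     -- assumed proportion correct for a master
    p_nonmaster  -- assumed proportion correct for a non-master
    alpha, beta  -- tolerable false-mastery / false-non-mastery rates

    Returns (decision, items_used), where decision is "master",
    "nonmaster", or "undecided" if the item pool is exhausted first.
    """
    upper = math.log((1 - beta) / alpha)   # cross above: declare mastery
    lower = math.log(beta / (1 - alpha))   # cross below: declare non-mastery
    llr = 0.0                              # running log-likelihood ratio
    for i, correct in enumerate(responses, start=1):
        if correct:
            llr += math.log(p_master / p_nonmaster)
        else:
            llr += math.log((1 - p_master) / (1 - p_nonmaster))
        if llr >= upper:
            return "master", i
        if llr <= lower:
            return "nonmaster", i
    return "undecided", len(responses)
```

Because testing stops as soon as either boundary is crossed, the SPRT typically needs far fewer items than a fixed-length test for examinees well above or below the cut, which is the efficiency property the study compares against IRT-based AMT.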
Descriptors: Achievement Tests, Adaptive Testing, College Entrance Examinations, Comparative Analysis, Computer Assisted Testing, Estimation (Mathematics), Expert Systems, Higher Education, Item Response Theory, Language Tests, Mastery Tests, Mathematical Models, Simulation, Spanish, Undergraduate Students
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Assessments and Surveys: College Board Achievement Tests
Grant or Contract Numbers: N/A