ERIC Number: ED210314
Record Type: Non-Journal
Publication Date: 1981-Aug
Pages: 190
Abstractor: N/A
Reference Count: N/A
Methods for Linking Item Parameters. Final Report.
Vale, C. David; And Others
A simulation study was designed to determine appropriate methods for linking item parameters in adaptive testing. Three basic response data sets were created: randomly sampled, systematically sampled, and selected. The evaluative criteria were fidelity of parameter estimation, asymptotic ability estimates, root-mean-square error of the estimates, and the correlation between true and estimated ability. Test length appeared more important to calibration effectiveness than sample size; efficiency analyses suggested that increases in test length were several times as effective in improving calibration efficiency as proportionate increases in calibration sample size. The asymptotic ability analyses suggested that linking procedures based on Bayesian ability estimation were more effective, and the equivalent-tests method was no better than not linking at all. Bayesian scoring procedures were slightly superior to the others tested. Efficiency loss due to linking error was less than that due to item calibration error. Test length and sample size had a definite effect on calibration efficiency, but no strong effects appeared with respect to linking efficiency. For the systematically sampled data set, the anchor-test method produced the most efficient item pools in terms of linking efficiency. Bayesian scoring was preferred over the maximum likelihood scoring procedure. (Author/DWH)
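The record does not describe the study's actual linking procedure, but anchor-test linking of the kind the abstract mentions is commonly carried out with a mean-sigma transformation: anchor items calibrated on both scales determine a linear rescaling of the new parameters. The sketch below is a generic illustration under that assumption, using hypothetical simulated difficulties, with root-mean-square error (one of the evaluative criteria named above) to check that linking helps.

```python
import math
import random
import statistics

def mean_sigma_link(anchor_old, anchor_new):
    """Estimate A, B in b_old ~ A * b_new + B from anchor-item
    difficulties calibrated on both scales (mean-sigma method)."""
    A = statistics.stdev(anchor_old) / statistics.stdev(anchor_new)
    B = statistics.mean(anchor_old) - A * statistics.mean(anchor_new)
    return A, B

def rmse(true_vals, est_vals):
    """Root-mean-square error between two parameter vectors."""
    return math.sqrt(
        sum((t - e) ** 2 for t, e in zip(true_vals, est_vals)) / len(true_vals)
    )

# Hypothetical anchor items: the "new" calibration sits on a scale that is
# shifted and stretched relative to the "old" one (b_old = 0.8*b_new + 0.5),
# plus a little calibration noise.
random.seed(1)
b_old = [random.gauss(0.0, 1.0) for _ in range(20)]
b_new = [(b - 0.5) / 0.8 + random.gauss(0.0, 0.05) for b in b_old]

A, B = mean_sigma_link(b_old, b_new)
linked = [A * b + B for b in b_new]

# The recovered transformation is close to the simulated one (A near 0.8,
# B near 0.5), and linking shrinks the RMSE against the old-scale values.
print(round(A, 2), round(B, 2))
print(rmse(b_old, linked) < rmse(b_old, b_new))
```

Under an item response theory model, discriminations would be divided by the same constant A so that item characteristic curves are preserved; only the difficulty side is shown here for brevity.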
Publication Type: Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: Air Force Human Resources Lab., Brooks AFB, TX.
Authoring Institution: Assessment Systems Corp., St. Paul, Minn.