ERIC Number: ED346131
Record Type: Non-Journal
Publication Date: 1992
Pages: 18
Abstractor: N/A
A Factor Analytic Item Response Theory Approach for Relating Item Content to Test Scores.
Abdel-fattah, Abdel-fattah A.
A scaling procedure based on item response theory (IRT) is proposed that can fit non-hierarchical as well as hierarchical test structures. The binary scores from a test of English were used to calculate the probability of answering each item correctly. The probability matrix was factor analyzed, and the difficulty intervals or estimates corresponding to the factors were related to the total scores by scaling both the intervals (and the factor loadings) and the scores to a common metric of the test scores. The new procedure is a type of transformation of the binary scores into more informative probabilities that can be analyzed more accurately by many statistical techniques, such as factor analysis. The resulting factor loadings are used to rank order the items by the probability of their being answered correctly. This ranking procedure was compared, in terms of meaningful item grouping, with ranking by item difficulty. It is recommended that the Linear Structural Relations (LISREL) computer programs be used in future studies to relate the structures of several test forms within a given administration and across administrations when supplemented with demographic variables for students and teachers. BILOG's Bayesian procedures for estimating item and ability parameters are also recommended because of the importance of estimation accuracy to model fit. The proposed ranking procedure should always be compared with ranking by item difficulty to check for meaningful item grouping. One figure and four tables illustrate the analyses. There is a 15-item list of references. (Author/SLD)
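The pipeline the abstract describes (binary scores → IRT probabilities → factor analysis of the probability matrix → ranking items by factor loadings versus by difficulty) can be sketched as follows. This is a minimal illustration, not the author's implementation: it assumes a simple Rasch (one-parameter logistic) model with crude log-odds estimates of ability and difficulty (the paper uses BILOG), and it substitutes a principal-axis extraction via eigendecomposition for a full factor analysis. All data and parameter values are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
n_persons, n_items = 200, 10

# Simulated abilities and item difficulties (hypothetical values)
theta = rng.normal(0.0, 1.0, n_persons)
b = np.linspace(-2.0, 2.0, n_items)

# Rasch model: P(correct) = 1 / (1 + exp(-(theta - b)))
P_true = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

# Binary item scores drawn from those probabilities
X = (rng.random((n_persons, n_items)) < P_true).astype(int)

# Transform binary scores into probabilities: plug crude smoothed log-odds
# estimates of ability and difficulty back into the Rasch model
p_person = (X.mean(axis=1) + 0.5 / n_items) / (1 - X.mean(axis=1) + 0.5 / n_items)
p_item = (X.mean(axis=0) + 0.5 / n_persons) / (1 - X.mean(axis=0) + 0.5 / n_persons)
theta_hat = np.log(p_person)
b_hat = -np.log(p_item)
P_hat = 1.0 / (1.0 + np.exp(-(theta_hat[:, None] - b_hat[None, :])))

# Factor-analyze the probability matrix: eigendecomposition of the item
# correlation matrix, first-factor loadings only (a stand-in for full FA)
R = np.corrcoef(P_hat, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)          # eigenvalues in ascending order
loadings = eigvecs[:, -1] * np.sqrt(eigvals[-1])

# Rank items two ways: by first-factor loading vs. by estimated difficulty
rank_by_loading = np.argsort(-np.abs(loadings))
rank_by_difficulty = np.argsort(b_hat)
```

Comparing `rank_by_loading` with `rank_by_difficulty`, as the abstract recommends, shows whether the loading-based grouping adds information beyond a plain difficulty ordering.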
Publication Type: Reports - Evaluative; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A