ERIC Number: ED328621
Record Type: RIE
Publication Date: 1990-Nov
Pages: 44
Abstractor: N/A
Reference Count: N/A
Applying Empirical Analyses to the Evaluation of Test Content.
Sireci, Stephen G.; And Others
Although some researchers have argued against the use of the term "content validity," the ability of a test's items to adequately represent the domain of knowledge tested continues to be an issue of paramount importance in test construction. The present paper reviews previous analyses of test content and proposes a new empirical method for evaluating the content representativeness of a test. The proposed method evaluates the content of a test by determining whether similarity ratings of expert judges reflect the content structure specified in the test blueprint. Three expert judges rated the similarity of items on a 30-item multiple-choice test of study skills. The test was designed to assess the knowledge acquired by students at the end of a five-session study skills course. The test blueprint specified six content areas: study habits, time management, classroom learning, textbook learning, preparing for taking examinations, and taking examinations. The similarity data were used in a multidimensional scaling procedure to determine the dimensionality of the data. A subsequent cluster analysis was performed to determine whether the item clusters corresponded to the arrangement of items in the test blueprint. The results indicate a strong correspondence between the similarity data and the arrangement of items in the original test blueprint. Advantages of using item similarity data as an alternative to item response data are discussed. Five data tables and six figures are included. A 29-item list of references is provided. (Author/TJH)
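The analysis pipeline the abstract describes (judge similarity ratings → multidimensional scaling on the derived dissimilarities → cluster analysis → comparison against the blueprint) can be sketched in Python. This is not the authors' actual analysis or data: the ratings below are simulated, the 9-point rating scale and the averaging across judges are assumptions, and the specific algorithms (metric MDS, average-linkage hierarchical clustering) stand in for whichever procedures the paper used.

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)

# Hypothetical setup mirroring the paper: 30 items, 6 blueprint
# content areas (assumed here to hold 5 items each).
n_items, n_areas = 30, 6
blueprint = np.repeat(np.arange(n_areas), n_items // n_areas)

# Simulated judge similarity ratings averaged across judges
# (assumed 1 = very dissimilar, 9 = nearly identical): items that
# share a blueprint area are rated as more similar, plus noise.
sim = np.where(blueprint[:, None] == blueprint[None, :], 8.0, 2.0)
sim += rng.normal(0.0, 0.3, sim.shape)
sim = (sim + sim.T) / 2.0          # enforce symmetry
np.fill_diagonal(sim, 9.0)

# Convert similarities to dissimilarities for scaling.
dissim = sim.max() - sim
np.fill_diagonal(dissim, 0.0)

# Multidimensional scaling: recover a low-dimensional configuration
# of the items from the precomputed dissimilarity matrix.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)

# Cluster analysis on the same dissimilarities; cut the dendrogram
# at six clusters to match the number of blueprint content areas.
condensed = squareform(dissim, checks=False)
labels = fcluster(linkage(condensed, method="average"),
                  t=n_areas, criterion="maxclust")

# Inspect correspondence: each blueprint area should map onto
# a single recovered cluster if the judges' ratings reflect
# the intended content structure.
for area in range(n_areas):
    print(f"area {area}: clusters {set(labels[blueprint == area])}")
```

With well-separated simulated ratings the recovered clusters align exactly with the blueprint areas; with real judge data the interesting question is how far the recovered structure departs from that ideal.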
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers: Dimensional Analysis; Empirical Analysis; Experts; Similarity Ratings
Note: Paper presented at the Annual Meeting of the Northeastern Educational Research Association (Ellenville, NY, November 1, 1990).