ERIC Number: ED363646
Record Type: Non-Journal
Publication Date: 1993-Apr
Pages: 66
Abstractor: N/A
Reference Count: N/A
Using Subject Matter Experts To Assess Content Representation: A MDS Analysis.
Sireci, Stephen G.; Geisinger, Kurt
Various methods used to assess the content of a test are reviewed, and a new procedure designed to improve on these methods is presented. The two tests considered are a professional licensure examination, the auditing section of the Uniform Certified Public Accountant Examination, and an educational achievement test, a nationally standardized social studies achievement test. Previous methods have generally been either empirical, using factor analysis or multidimensional scaling (MDS) to analyze the inter-item correlation matrix derived from examinee responses, or subjective, using data provided by subject matter experts (SMEs) to determine whether items represent the content areas the test purports to measure. A method has previously been proposed that applies MDS to SMEs' ratings of the similarity of the items comprising a test, recovering the dimensions underlying those ratings. This study extended that method by using 2 groups of SMEs (15 for each test) to evaluate the content of the 2 tests studied. Correlation and cluster analysis results suggest that the content structure of a test can be evaluated adequately by analyzing item similarity data provided by SMEs. Results further suggest that the MDS procedure should supplement analyses of item relevance data rather than replace them. Six figures and 18 tables present analysis findings. (Contains 23 references.) (SLD)
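The MDS step described in the abstract can be sketched as follows: SMEs rate the similarity of every pair of test items, the ratings are averaged and converted to dissimilarities, and MDS recovers a low-dimensional representation of the test's content structure. This is a minimal illustrative sketch using classical (Torgerson) MDS; the function name, toy data, and scaling choices are assumptions for illustration, not details taken from the study.

```python
import numpy as np

def classical_mds(dissim: np.ndarray, n_dims: int = 2) -> np.ndarray:
    """Classical (Torgerson) MDS: double-center the squared dissimilarities,
    then scale the top eigenvectors by the square roots of their eigenvalues."""
    n = dissim.shape[0]
    d2 = dissim ** 2
    j = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    b = -0.5 * j @ d2 @ j                      # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(b)
    order = np.argsort(eigvals)[::-1][:n_dims] # largest eigenvalues first
    pos = np.clip(eigvals[order], 0, None)     # guard against tiny negatives
    return eigvecs[:, order] * np.sqrt(pos)

# Hypothetical data: 2 SMEs each rate 4 items on a 1-9 similarity scale;
# items 0-1 and items 2-3 are intended to share a content area.
ratings = np.array([
    [[9, 8, 2, 2],
     [8, 9, 1, 2],
     [2, 1, 9, 8],
     [2, 2, 8, 9]],
    [[9, 7, 3, 2],
     [7, 9, 2, 3],
     [3, 2, 9, 7],
     [2, 3, 7, 9]],
], dtype=float)

mean_sim = ratings.mean(axis=0)            # average across SMEs
dissim = mean_sim.max() - mean_sim         # similarity -> dissimilarity
coords = classical_mds(dissim, n_dims=2)

# Items from the same intended content area should land close together.
within = np.linalg.norm(coords[0] - coords[1])
between = np.linalg.norm(coords[0] - coords[2])
print(within < between)  # prints True
```

In practice the coordinates would then be inspected (or clustered) to check whether items group by their intended content areas, paralleling the correlation and cluster analyses the abstract reports.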
Publication Type: Reports - Evaluative; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: American Council on Education, Washington, DC. GED Testing Service.