Peer reviewed
ERIC Number: EJ983290
Record Type: Journal
Publication Date: 2012-Apr
Pages: 26
Abstractor: As Provided
Reference Count: 18
Evaluating the Content Validity of Multistage-Adaptive Tests
Crotts, Katrina; Sireci, Stephen G.; Zenisky, April
Journal of Applied Testing Technology, v13 n1 Apr 2012
Validity evidence based on test content is important for educational tests to demonstrate the degree to which they fulfill their purposes. Most content validity studies involve subject matter experts (SMEs) who rate the items that comprise a test form. In computerized-adaptive testing, examinees take different sets of items and fixed test "forms" do not exist, which makes it difficult to evaluate the content validity of the different tests taken by different examinees. In this study, we evaluated the content validity of a multistage-adaptive test (MST) using SMEs' content validity ratings of all items in the MST bank. These ratings were then analyzed across the most common "paths" taken by examinees. The results indicated that the content validity ratings across the different tests taken by examinees were roughly equivalent. The method illustrates how content validity can be evaluated in an MST context. (Contains 5 tables, 3 footnotes, and 1 figure.)
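The aggregation idea described in the abstract (rate every item in the bank, then average ratings over the items that make up each common examinee path and compare paths) can be sketched as follows. All item identifiers, ratings, paths, and the rating scale below are invented for illustration; they are not the study's actual data or method details.

```python
# Hypothetical sketch: averaging SME content-validity ratings along MST paths.
# All data are invented; a 1-5 rating scale is assumed.
from statistics import mean

# Item-level ratings: item id -> mean SME content-validity rating (hypothetical)
item_ratings = {
    "i1": 4.6, "i2": 4.2, "i3": 3.9, "i4": 4.8,
    "i5": 4.1, "i6": 4.4, "i7": 3.7, "i8": 4.5,
}

# Most common paths through the multistage test: path name -> items an
# examinee on that path would see (hypothetical routing)
paths = {
    "easy-easy": ["i1", "i2", "i3"],
    "easy-hard": ["i1", "i2", "i4", "i8"],
    "hard-hard": ["i5", "i6", "i7", "i8"],
}

# Mean content-validity rating along each path
path_means = {name: mean(item_ratings[i] for i in items)
              for name, items in paths.items()}

# A small spread across paths suggests the different tests examinees take
# are roughly equivalent in rated content validity
spread = max(path_means.values()) - min(path_means.values())

for name, m in sorted(path_means.items()):
    print(f"{name}: {m:.2f}")
print(f"spread across paths: {spread:.2f}")
```

Comparing path-level means (rather than form-level ratings, which do not exist in adaptive testing) is the core of the approach the abstract describes.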
Association of Test Publishers. 601 Pennsylvania Avenue NW, South Building Suite 900, Washington DC 20004. Tel: 866-240-7909; Fax: 717-755-8962.
Publication Type: Journal Articles; Reports - Research
Education Level: Adult Basic Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Massachusetts