ERIC Number: ED414291
Record Type: RIE
Publication Date: 1997-Mar-26
Reference Count: N/A
Comparing Dual-Language Versions of an International Computerized-Adaptive Certification Exam.
Sireci, Stephen G.; Foster, David F.; Robin, Frederic; Olsen, James
Evaluating the comparability of a test administered in different languages is a difficult, if not impossible, task. Comparisons are problematic because observed differences in test performance between groups who take different language versions of a test could be due to a difference in difficulty between the tests, to cultural differences in test-taking behavior, or to a difference in proficiency between the language groups. The international certification testing programs conducted by Novell, Inc. are exceptional examples of the complex psychometric demands inherent in multiple-language assessment programs. Novell's international certification program includes tests administered in 12 languages. Many of these tests are computerized-adaptive tests (CATs), which complicates comparisons across tests and languages. This paper reports the results of a study comparing English- and German-language versions of a high-stakes Novell CAT certification exam. The two versions of the test were compared through analyses including separate and concurrent item response theory calibrations. Results with 1,668 English-language candidates and 922 German-language candidates indicate that the English and German CATs are highly similar and that the tests appear to be unidimensional in both the English and German versions. It is also concluded that the German candidate sample was more proficient than the English sample, and that 2 of 15 items functioned differentially across the languages. The source of the differential item functioning was identified post hoc using bilingual subject-matter experts. The comparability of the passing scores and other critical validity issues are discussed. (Contains 4 tables, 11 figures, and 20 references.) (Author/SLD)
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Authoring Institution: N/A
Note: Paper presented at the Annual Meeting of the National Council on Measurement in Education (Chicago, IL, March 24-27, 1997).