Showing all 5 results
Rios, Joseph A.; Sireci, Stephen G. – International Journal of Testing, 2014
The International Test Commission's "Guidelines for Translating and Adapting Tests" (2010) provide important guidance on developing and evaluating tests for use across languages. These guidelines are widely applauded, but the degree to which they are followed in practice is unknown. The objective of this study was to perform a…
Descriptors: Guidelines, Translation, Adaptive Testing, Second Languages
Chulu, Bob Wajizigha; Sireci, Stephen G. – International Journal of Testing, 2011
Many examination agencies, policy makers, media houses, and the public at large make high-stakes decisions based on test scores. Unfortunately, in some cases educational tests are not statistically equated to account for test differences over time, which leads to inappropriate interpretations of students' performance. In this study we illustrate…
Descriptors: Classification, Foreign Countries, Item Response Theory, High Stakes Tests
Hauger, Jeffrey B.; Sireci, Stephen G. – International Journal of Testing, 2008
In this study, we examined the presence of differential item functioning (DIF) among groups of students who were tested in their native language or in a different language when participating in the 1999 Trends in International Mathematics and Science Study. Data from 18,837 examinees from three countries (Singapore, United States, and Iran) were…
Descriptors: Test Bias, Language Dominance, Second Languages, Language Proficiency
Robin, Frederic; Sireci, Stephen G.; Hambleton, Ronald K. – International Journal of Testing, 2003 (peer reviewed)
Illustrates how multidimensional scaling (MDS) and differential item functioning (DIF) procedures can be used to evaluate the equivalence of different language versions of an examination. Presents examples of structural differences and DIF across language versions. (SLD)
Descriptors: Item Bias, Licensing Examinations (Professions), Multidimensional Scaling, Multilingual Materials
Sireci, Stephen G.; Harter, James; Yang, Yongwei; Bhola, Dennison – International Journal of Testing, 2003 (peer reviewed)
Evaluated the structural equivalence and differential item functioning of an employee attitude survey from a large international corporation across three languages, eight cultures, and two mediums of administration. Results for 40,595 employees show that the structure of the survey data was consistent and that items functioned similarly across all groups. (SLD)
Descriptors: Attitude Measures, Computer Assisted Testing, Cross Cultural Studies, Employees

