50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues its long tradition of ongoing innovation and enhancement.

Learn more about the history of ERIC in the accompanying PDF.

Showing 1 to 15 of 38 results
Peer reviewed
Oliveri, Maria Elena; von Davier, Matthias – International Journal of Testing, 2014
In this article, we investigate the creation of comparable score scales across countries in international assessments. We examine potential improvements to current score scale calibration procedures used in international large-scale assessments. Our approach seeks to improve fairness in scoring international large-scale assessments, which often…
Descriptors: Test Bias, Scores, International Programs, Educational Assessment
Peer reviewed
Oliveri, María Elena; Ercikan, Kadriye; Zumbo, Bruno D.; Lawless, René – International Journal of Testing, 2014
In this study, we contrast results from two differential item functioning (DIF) approaches (manifest and latent class) by the number of items and sources of items identified as DIF using data from an international reading assessment. The latter approach yielded three latent classes, presenting evidence of heterogeneity in examinee response…
Descriptors: Test Bias, Comparative Analysis, Reading Tests, Effect Size
Peer reviewed
Fischer, Sebastian; Freund, Philipp Alexander – International Journal of Testing, 2014
The Adaption-Innovation Inventory (AII), originally developed by Kirton (1976), is a widely used self-report instrument for measuring problem-solving styles at work. The present study investigates how scores on the AII are affected by different response styles. Data are collected from a combined sample (N = 738) of students, employees, and…
Descriptors: Measures (Individuals), Scores, Item Response Theory, Response Style (Tests)
Peer reviewed
Kan, Adnan; Bulut, Okan – International Journal of Testing, 2014
This study investigated whether the linguistic complexity of items leads to gender differential item functioning (DIF) on mathematics assessments. Two forms of a mathematics test were developed. The first form consisted of algebra items based on mathematical expressions, terms, and equations. In the second form, the same items were written as word…
Descriptors: Gender Differences, Test Bias, Difficulty Level, Test Items
Peer reviewed
Lee, HyeSun; Geisinger, Kurt F. – International Journal of Testing, 2014
Differential item functioning (DIF) analysis is important in terms of test fairness. While DIF analyses have mainly been conducted with manifest grouping variables, such as gender or race/ethnicity, it has been recently claimed that not only the grouping variables but also contextual variables pertaining to examinees should be considered in DIF…
Descriptors: Test Bias, Gender Differences, Regression (Statistics), Statistical Analysis
Peer reviewed
Rios, Joseph A.; Sireci, Stephen G. – International Journal of Testing, 2014
The International Test Commission's "Guidelines for Translating and Adapting Tests" (2010) provide important guidance on developing and evaluating tests for use across languages. These guidelines are widely applauded, but the degree to which they are followed in practice is unknown. The objective of this study was to perform a…
Descriptors: Guidelines, Translation, Adaptive Testing, Second Languages
Peer reviewed
Tay, Louis; Vermunt, Jeroen K.; Wang, Chun – International Journal of Testing, 2013
We evaluate the item response theory with covariates (IRT-C) procedure for assessing differential item functioning (DIF) without preknowledge of anchor items (Tay, Newman, & Vermunt, 2011). This procedure begins with a fully constrained baseline model, and candidate items are tested for uniform and/or nonuniform DIF using the Wald statistic.…
Descriptors: Item Response Theory, Test Bias, Models, Statistical Analysis
Peer reviewed
Oliveri, Maria Elena; Ercikan, Kadriye; Zumbo, Bruno – International Journal of Testing, 2013
In this study, we investigated differential item functioning (DIF) and its sources using a latent class (LC) modeling approach. Potential sources of LC DIF related to instruction and teacher-related variables were investigated using substantive and three statistical approaches: descriptive discriminant function, multinomial logistic regression,…
Descriptors: Test Bias, Test Items, Multivariate Analysis, Discriminant Analysis
Peer reviewed
Sandilands, Debra; Oliveri, Maria Elena; Zumbo, Bruno D.; Ercikan, Kadriye – International Journal of Testing, 2013
International large-scale assessments of achievement often have a large degree of differential item functioning (DIF) between countries, which can threaten score equivalence and reduce the validity of inferences based on comparisons of group performances. It is important to understand potential sources of DIF to improve the validity of future…
Descriptors: Validity, Measures (Individuals), International Studies, Foreign Countries
Peer reviewed
Mucherah, Winnie; Finch, W. Holmes; Keaikitse, Setlhomo – International Journal of Testing, 2012
Understanding adolescent self-concept is of great concern for educators, mental health professionals, and parents, as research consistently demonstrates that low self-concept is related to a number of problem behaviors and poor outcomes. Thus, accurate measurements of self-concept are key, and the validity of such measurements, including the…
Descriptors: Test Bias, Mental Health Workers, Validity, Self Concept Measures
Peer reviewed
Oliveri, Maria Elena; Olson, Brent F.; Ercikan, Kadriye; Zumbo, Bruno D. – International Journal of Testing, 2012
In this study, the Canadian English and French versions of the Problem-Solving Measure of the Programme for International Student Assessment 2003 were examined to investigate their degree of measurement comparability at the item- and test-levels. Three methods of differential item functioning (DIF) were compared: parametric and nonparametric item…
Descriptors: Foreign Students, Test Bias, Speech Communication, Effect Size
Peer reviewed
Gattamorta, Karina A.; Penfield, Randall D.; Myers, Nicholas D. – International Journal of Testing, 2012
Measurement invariance is a common consideration in the evaluation of the validity and fairness of test scores when the tested population contains distinct groups of examinees, such as examinees receiving different forms of a translated test. Measurement invariance in polytomous items has traditionally been evaluated at the item-level,…
Descriptors: Foreign Countries, Psychometrics, Test Bias, Test Items
Peer reviewed
Puhan, Gautam – International Journal of Testing, 2011
This study examined the effect of including or excluding repeaters on the equating process and results. New forms of two tests were equated to their respective old forms using either all examinees or only the first timer examinees in the new form sample. Results showed that for both tests used in this study, including or excluding repeaters in the…
Descriptors: Equated Scores, Educational Testing, Student Evaluation, Sample Size
Peer reviewed
Ong, Yoke Mooi; Williams, Julian Scott; Lamprianou, Iasonas – International Journal of Testing, 2011
The aims of this study are (a) to examine the sources of differential functioning by gender via differential bundle functioning (DBF) in mathematics assessment and (b) to use DBF to explore whether the differential functioning displayed is construct-relevant or construct-irrelevant. Three qualitatively different areas, namely curriculum domains,…
Descriptors: Test Bias, Gender Differences, Gender Bias, Mathematics Tests
Peer reviewed
Magis, David; Raiche, Gilles; Beland, Sebastien; Gerard, Paul – International Journal of Testing, 2011
We present an extension of the logistic regression procedure to identify dichotomous differential item functioning (DIF) in the presence of more than two groups of respondents. Starting from the usual framework of a single focal group, we propose a general approach to estimate the item response functions in each group and to test for the presence…
Descriptors: Language Skills, Identification, Foreign Countries, Evaluation Methods