Publication Date
In 2024: 0
Since 2023: 0
Since 2020 (last 5 years): 0
Since 2015 (last 10 years): 1
Since 2005 (last 20 years): 3
Descriptor
College Entrance Examinations: 3
Item Response Theory: 3
Statistical Analysis: 3
Difficulty Level: 2
Test Bias: 2
Test Items: 2
Accuracy: 1
African American Students: 1
Correlation: 1
Effect Size: 1
Guessing (Tests): 1
Source
Educational and Psychological Measurement: 3
Author
Huang, Qiming: 1
Santelices, Maria Veronica: 1
Skorupski, William P.: 1
Tay, Louis: 1
Vermunt, Jeroen K.: 1
Wilson, Mark: 1
Wolkowitz, Amanda A.: 1
Publication Type
Journal Articles: 3
Reports - Research: 3
Education Level
Higher Education: 1
Postsecondary Education: 1
Location
California: 1
Assessments and Surveys
SAT (College Admission Test): 3
Tay, Louis; Huang, Qiming; Vermunt, Jeroen K. – Educational and Psychological Measurement, 2016
In large-scale testing, multigroup approaches are of limited use for assessing differential item functioning (DIF) across multiple variables, because DIF must be examined for each variable separately. In contrast, the item response theory with covariates (IRT-C) procedure can examine DIF across multiple variables (covariates) simultaneously. To…
Descriptors: Item Response Theory, Test Bias, Simulation, College Entrance Examinations
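The abstract above concerns detecting DIF in IRT data. As a rough, hedged illustration (not the IRT-C procedure itself, which is not reproduced in the abstract), the sketch below simulates 2PL item responses with uniform DIF on one item for a focal group, then flags it with a Mantel-Haenszel common odds ratio stratified by rest score. All parameter values and names here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 2000, 10
group = rng.integers(0, 2, n)            # covariate: 0 = reference, 1 = focal
theta = rng.normal(0, 1, n)              # latent ability
a = rng.uniform(0.8, 1.6, k)             # 2PL discriminations
b = rng.normal(0, 1, k)                  # 2PL difficulties
dif = np.zeros(k)
dif[0] = 0.6                             # uniform DIF: item 0 harder for the focal group

# 2PL response probabilities with a group-specific difficulty shift on item 0
p = 1 / (1 + np.exp(-a * (theta[:, None] - (b + np.outer(group, dif)))))
x = (rng.random((n, k)) < p).astype(int)

def mh_odds(item, x, group, n_strata=5):
    """Mantel-Haenszel common odds ratio for one item, stratified by rest score.
    Values well above 1 indicate the item favors the reference group."""
    rest = x.sum(1) - x[:, item]
    cuts = np.quantile(rest, np.linspace(0, 1, n_strata + 1)[1:-1])
    strata = np.digitize(rest, cuts)
    num = den = 0.0
    for s in np.unique(strata):
        m = strata == s
        ref, foc = m & (group == 0), m & (group == 1)
        A, B = x[ref, item].sum(), ref.sum() - x[ref, item].sum()
        C, D = x[foc, item].sum(), foc.sum() - x[foc, item].sum()
        N = m.sum()
        num += A * D / N
        den += B * C / N
    return num / den
```

With the simulated shift above, the DIF item's odds ratio should clearly exceed both 1 and that of a clean item, though the exact value varies with the draw.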
Wolkowitz, Amanda A.; Skorupski, William P. – Educational and Psychological Measurement, 2013
When missing values are present in item response data, there are a number of ways one might impute a correct or incorrect response to a multiple-choice item. There are significantly fewer methods for imputing the actual response option an examinee may have provided if he or she had not omitted the item either purposely or accidentally. This…
Descriptors: Multiple Choice Tests, Statistical Analysis, Models, Accuracy
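The paper's own imputation methods are not reproduced in the truncated abstract. As one hedged illustration of option-level (rather than correct/incorrect) imputation, the sketch below fills omitted multiple-choice responses (coded -1, an assumption for this example) by hot-deck sampling: each omit receives an option drawn from examinees in the same number-correct score band. The coding and binning choices are assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(1)

def impute_options(resp, key, n_bins=4, rng=rng):
    """Hot-deck imputation of omitted multiple-choice responses (coded -1):
    draw a donor option from examinees in the same number-correct band."""
    out = resp.copy()
    score = (resp == key).sum(axis=1)        # number correct; omits never match the key
    edges = np.quantile(score, np.linspace(0, 1, n_bins + 1)[1:-1])
    band = np.digitize(score, edges)
    for j in range(resp.shape[1]):
        omitted = resp[:, j] == -1
        for i in np.where(omitted)[0]:
            donors = resp[(band == band[i]) & ~omitted, j]
            if donors.size:                  # leave the omit in place if no donor exists
                out[i, j] = rng.choice(donors)
    return out

# Toy usage: 60 examinees, 5 four-option items, ~10% omits
resp = rng.integers(0, 4, (60, 5))
resp[rng.random((60, 5)) < 0.1] = -1
key = rng.integers(0, 4, 5)
filled = impute_options(resp, key)
```

Answered items are left untouched; only omits are replaced, which keeps the observed response pattern intact.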
Santelices, Maria Veronica; Wilson, Mark – Educational and Psychological Measurement, 2012
On the SAT, more difficult items have tended to exhibit differential item functioning (DIF) in favor of the focal group (usually minority groups). These results were reported by Kulick and Hu and by Freedle, and have been discussed enthusiastically in the more recent literature. Examining the…
Descriptors: Test Bias, Test Items, Difficulty Level, Statistical Analysis