50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues its long tradition of innovation and enhancement.

Learn more about the history of ERIC here (PDF).

Showing 1 to 15 of 2,930 results
Ho, Ya-Ting – Online Submission, 2014
There is a continuing increase in the African American and Hispanic student populations in public schools, yet the students invited to gifted programs are overwhelmingly White. This is the situation in schools in the United States and also in Taiwan. Misunderstanding or unawareness of cultural differences among educators might contribute to…
Descriptors: Foreign Countries, Cross Cultural Studies, Interviews, Academically Gifted
Peer reviewed
Direct link
Oliveri, Maria Elena; von Davier, Matthias – International Journal of Testing, 2014
In this article, we investigate the creation of comparable score scales across countries in international assessments. We examine potential improvements to current score scale calibration procedures used in international large-scale assessments. Our approach seeks to improve fairness in scoring international large-scale assessments, which often…
Descriptors: Test Bias, Scores, International Programs, Educational Assessment
Peer reviewed
Direct link
He, Wei; Reckase, Mark D. – Educational and Psychological Measurement, 2014
For computerized adaptive tests (CATs) to work well, they must have an item pool with sufficient numbers of good quality items. Many researchers have pointed out that, in developing item pools for CATs, not only is the item pool size important but also the distribution of item parameters and practical considerations such as content distribution…
Descriptors: Item Banks, Test Length, Computer Assisted Testing, Adaptive Testing
Peer reviewed
Direct link
Chen, Ying-Fang; Jiao, Hong – Educational Assessment, 2014
Differential item functioning (DIF) may be caused by an interaction of multiple manifest grouping variables or unexplored manifest variables, which cannot be detected by conventional DIF detection methods that are based on a single manifest grouping variable. Such DIF may be detected by a latent approach using the mixture item response theory…
Descriptors: Test Bias, Item Response Theory, Reading Tests, Student Surveys
Peer reviewed
Direct link
Baylor, Carolyn; McAuliffe, Megan J.; Hughes, Louise E.; Yorkston, Kathryn; Anderson, Tim; Jiseon, Kim; Amtmann, Dagmar – Journal of Speech, Language, and Hearing Research, 2014
Purpose: To examine the cross-cultural applicability of the Communicative Participation Item Bank (CPIB) through a comparison of respondents with Parkinson's disease (PD) from the United States and New Zealand. Method: A total of 428 respondents--218 from the United States and 210 from New Zealand--completed the self-report CPIB and a series…
Descriptors: Foreign Countries, Test Bias, Item Banks, Neurological Impairments
Peer reviewed
Direct link
Schatschneider, Christopher; Lane, Kathleen Lynne; Oakes, Wendy Peia; Kalberg, Jemma Robertson – Educational Assessment, 2014
Screening of students at risk for antisocial behaviors in school is an essential step in the implementation of evidence-based supports for academic, behavioral, and social domains at the first sign of concern. This study examined the measurement properties of a free-access systematic behavior screening tool: the Student Risk Screening Scale…
Descriptors: Test Bias, Screening Tests, Antisocial Behavior, At Risk Students
Peer reviewed
Direct link
Taylor, Cora M.; Vehorn, Alison; Noble, Hylan; Weitlauf, Amy S.; Warren, Zachary E. – Journal of Autism and Developmental Disorders, 2014
The goal of the current study was to develop and pilot the utility of two simple internal response bias metrics, over-reporting and under-reporting, in terms of additive clinical value within common screening practices for early detection of autism spectrum disorder risk. Participants were caregivers and children under 36 months of age (n = 145)…
Descriptors: Pilot Projects, Autism, Pervasive Developmental Disorders, Caregivers
Peer reviewed
Direct link
Bennink, Margot; Croon, Marcel A.; Keuning, Jos; Vermunt, Jeroen K. – Journal of Educational and Behavioral Statistics, 2014
In educational measurement, students' responses to items are used not only to measure student ability, but also to evaluate and compare the performance of schools. Analysis should ideally account for the multilevel structure of the data, and for school-level processes not related to ability, such as working climate and administration…
Descriptors: Academic Ability, Educational Assessment, Educational Testing, Test Bias
Peer reviewed
Direct link
Benítez, Isabel; Padilla, José-Luis – Journal of Mixed Methods Research, 2014
Differential item functioning (DIF) can undermine the validity of cross-lingual comparisons. While many efficient statistics for detecting DIF are available, few general findings explain DIF results. The objective of the article was to study DIF sources by using a mixed methods design. The design involves a quantitative phase…
Descriptors: Foreign Countries, Mixed Methods Research, Test Bias, Cross Cultural Studies
Peer reviewed
Direct link
Longford, Nicholas T. – Journal of Educational and Behavioral Statistics, 2014
A method for medical screening is adapted to differential item functioning (DIF). Its essential elements are explicit declarations of the level of DIF that is acceptable and of the loss function that quantifies the consequences of the two kinds of inappropriate classification of an item. Instead of a single level and a single function, sets of…
Descriptors: Test Items, Test Bias, Simulation, Hypothesis Testing
Peer reviewed
Direct link
Koo, Jin; Becker, Betsy Jane; Kim, Young-Suk – Language Testing, 2014
In this study, differential item functioning (DIF) trends were examined for English language learners (ELLs) versus non-ELL students in third and tenth grades on a large-scale reading assessment. To facilitate the analyses, a meta-analytic DIF technique was employed. The results revealed that items requiring knowledge of words and phrases in…
Descriptors: Test Bias, Reading Tests, English Language Learners, Native Speakers
Peer reviewed
Direct link
Huhta, Ari; Alanen, Riikka; Tarnanen, Mirja; Martin, Maisa; Hirvelä, Tuija – Language Testing, 2014
There is still relatively little research on how well the CEFR and similar holistic scales work when they are used to rate L2 texts. Using both multifaceted Rasch analyses and qualitative data from rater comments and interviews, the ratings obtained by using a CEFR-based writing scale and the Finnish National Core Curriculum scale for L2 writing…
Descriptors: Foreign Countries, Writing Skills, Second Language Learning, Finno Ugric Languages
Peer reviewed
Direct link
Hou, Likun; de la Torre, Jimmy; Nandakumar, Ratna – Journal of Educational Measurement, 2014
Analyzing examinees' responses using cognitive diagnostic models (CDMs) has the advantage of providing diagnostic information. To ensure the validity of the results from these models, differential item functioning (DIF) in CDMs needs to be investigated. In this article, the Wald test is proposed to examine DIF in the context of CDMs. This…
Descriptors: Test Bias, Models, Simulation, Error Patterns
Peer reviewed
Direct link
Wells, Craig S.; Hambleton, Ronald K.; Kirkpatrick, Robert; Meng, Yu – Applied Measurement in Education, 2014
The purpose of the present study was to develop and evaluate two procedures for flagging consequential item parameter drift (IPD) in an operational testing program. The first procedure flagged items exhibiting a meaningful magnitude of IPD using a critical value defined to represent barely tolerable IPD. The second procedure…
Descriptors: Test Items, Test Bias, Equated Scores, Item Response Theory
Peer reviewed
Direct link
Beinicke, Andrea; Pässler, Katja; Hell, Benedikt – International Journal for Educational and Vocational Guidance, 2014
The study investigates the consequences of eliminating items showing gender-specific differential item functioning (DIF) for the psychometric structure of a standard RIASEC interest inventory. Holland's hexagonal model was tested for structural invariance using a confirmatory methodological approach (confirmatory factor analysis and randomization…
Descriptors: Test Bias, Gender Differences, Vocational Interests, Interest Inventories