Showing all 7 results
Peer reviewed | Full text available as PDF on ERIC
Moses, Tim; Liu, Jinghua; Tan, Adele; Deng, Weiling; Dorans, Neil J. – ETS Research Report Series, 2013
In this study, differential item functioning (DIF) methods utilizing 14 different matching variables were applied to assess DIF in the constructed-response (CR) items from 6 forms of 3 mixed-format tests. Results suggested that the methods might produce distinct patterns of DIF results for different tests and testing programs, in that the DIF…
Descriptors: Test Construction, Multiple Choice Tests, Test Items, Item Analysis
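The snippet above does not identify which DIF methods or matching variables were used, so as a rough illustration of the general idea only, here is a minimal sketch of a standardization-style (standardized mean difference) DIF index for a constructed-response item, conditioning on a matching variable. The function and variable names are assumptions for illustration, not code or notation from the report.

```python
import numpy as np

def smd_dif(item_f, item_r, match_f, match_r):
    """Standardized mean difference DIF index for a constructed-response
    item, conditioning on a matching variable.

    item_f, item_r   : item scores for the focal and reference groups
    match_f, match_r : matching-variable values (e.g. an anchor or total
                       score) for the same examinees
    """
    item_f, item_r = np.asarray(item_f, float), np.asarray(item_r, float)
    match_f, match_r = np.asarray(match_f), np.asarray(match_r)
    smd, n_focal = 0.0, len(item_f)
    for level in np.unique(match_f):
        f_at, r_at = match_f == level, match_r == level
        if not r_at.any():
            continue  # skip matching levels with no reference-group examinees
        weight = f_at.sum() / n_focal  # weight by the focal-group distribution
        smd += weight * (item_f[f_at].mean() - item_r[r_at].mean())
    return smd
```

A positive value would mean the focal group outscores matched reference-group examinees on the item; the report itself compares many such analyses across different matching variables.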
Peer reviewed | Full text available as PDF on ERIC
Guo, Hongwen; Liu, Jinghua; Dorans, Neil; Feigenbaum, Miriam – ETS Research Report Series, 2011
Maintaining score stability is crucial for an ongoing testing program that administers several tests per year over many years. One way to stall the drift of the score scale is to use an equating design with multiple links. In this study, we use the operational and experimental SAT® data collected from 44 administrations to investigate the effect…
Descriptors: Equated Scores, College Entrance Examinations, Reliability, Testing Programs
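The abstract above refers to an equating design with multiple links but does not spell it out; as a generic illustration of chaining linking functions back to a base scale, here is a minimal sketch using linear links. The form, parameter values, and two-link chain are assumptions, not the study's operational design.

```python
def linear_link(mu_x, sigma_x, mu_y, sigma_y):
    """Linear linking function mapping form-X scores to the form-Y scale:
    e(x) = sigma_y / sigma_x * (x - mu_x) + mu_y."""
    return lambda x: sigma_y / sigma_x * (x - mu_x) + mu_y

def chain(*links):
    """Compose a sequence of links (applied left to right), as in a
    multi-link equating chain back to the base scale."""
    def composed(x):
        for link in links:
            x = link(x)
        return x
    return composed

# Hypothetical two-link chain: new form -> old form -> base scale
new_to_old = linear_link(mu_x=30.0, sigma_x=8.0, mu_y=31.5, sigma_y=8.5)
old_to_base = linear_link(mu_x=31.5, sigma_x=8.5, mu_y=500.0, sigma_y=100.0)
new_to_base = chain(new_to_old, old_to_base)
print(new_to_base(38.0))  # a raw score of 38 carried through both links
```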
Peer reviewed | Full text available as PDF on ERIC
Liu, Jinghua; Zhu, Xiaowen – ETS Research Report Series, 2008
The purpose of this paper is to explore methods to approximate population invariance without conducting multiple linkings for subpopulations. Under the single group or equivalent groups design, no linking needs to be performed for the parallel-linear system linking functions. The unequated raw score information can be used as an approximation. For…
Descriptors: Raw Scores, Test Format, Comparative Analysis, Test Construction
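Population invariance in this line of work is commonly summarized by a root mean square difference (RMSD) between subpopulation linking functions and the total-population linking function. The sketch below shows that general form as context for the abstract above; it is an assumed illustration, not the approximation method the paper proposes.

```python
import numpy as np

def rmsd_invariance(link_total, links_by_group, weights, x_points, sd_y):
    """RMSD between subgroup linking functions and the total-population
    linking function at each score point, in SD units of the target scale.

    link_total     : callable, total-population linking function e_P(x)
    links_by_group : list of callables e_g(x), one per subgroup
    weights        : subgroup proportions in the total population (sum to 1)
    x_points       : raw-score points at which to evaluate
    sd_y           : standard deviation of scores on the target scale
    """
    x = np.asarray(x_points, dtype=float)
    total = link_total(x)
    sq_diff = sum(w * (link(x) - total) ** 2
                  for w, link in zip(weights, links_by_group))
    return np.sqrt(sq_diff) / sd_y
```

Values near zero at every score point would indicate that the linking is essentially invariant across the subpopulations examined.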
Peer reviewed | Full text available as PDF on ERIC
Haberman, Shelby J.; Guo, Hongwen; Liu, Jinghua; Dorans, Neil J. – ETS Research Report Series, 2008
This study uses historical data to explore the consistency of SAT® I: Reasoning Test score conversions and to examine trends in scaled score means. During the period from April 1995 to December 2003, both Verbal (V) and Math (M) means display substantial seasonality, and a slight increasing trend for both is observed. SAT Math means increase more…
Descriptors: College Entrance Examinations, Thinking Skills, Logical Thinking, Scaling
Liu, Jinghua; Feigenbaum, Miriam; Dorans, Neil J. – College Board, 2005
Score equity assessment was used to evaluate linkings of the new SAT® to the current SAT Reasoning Test™. Population invariance across gender groups was studied on the linkage of a new SAT critical reading prototype to a current SAT verbal section, and on the linkage of a new SAT math prototype to a current SAT math section. The results indicated that…
Descriptors: Gender Differences, Research Reports, Cognitive Tests, College Entrance Examinations
Liu, Jinghua; Allspach, Jill R.; Feigenbaum, Miriam; Oh, Hyeon-Joo; Burton, Nancy – College Entrance Examination Board, 2004
This study evaluated whether the addition of a writing section to the SAT Reasoning Test™ (referred to as the SAT® in this study) would impact test-taker performance because of fatigue caused by increased test length. The study also investigated test-takers' subjective feelings of fatigue. Ninety-seven test-takers were randomly assigned to three…
Descriptors: College Entrance Examinations, Writing Skills, Fatigue (Biology), Influences
Liu, Jinghua; Feigenbaum, Miriam; Cook, Linda – College Entrance Examination Board, 2004
This study explored possible configurations of the new SAT® critical reading section without analogy items. The item pool contained items from SAT verbal (SAT-V) sections of 14 previously administered SAT tests, calibrated using the three-parameter logistic IRT model. Multiple versions of several prototypes that do not contain analogy items were…
Descriptors: College Entrance Examinations, Critical Reading, Logical Thinking, Difficulty Level
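The last entry notes that the item pool was calibrated with the three-parameter logistic (3PL) IRT model. For readers unfamiliar with it, here is a minimal sketch of the 3PL item response function; the parameter values shown are illustrative only and are not taken from the study's calibration.

```python
import numpy as np

def three_pl(theta, a, b, c, D=1.7):
    """Three-parameter logistic item response function:
    P(correct | theta) = c + (1 - c) / (1 + exp(-D * a * (theta - b)))
    where a is discrimination, b is difficulty, c is the lower asymptote
    (pseudo-guessing), and D = 1.7 is the usual scaling constant.
    """
    return c + (1.0 - c) / (1.0 + np.exp(-D * a * (theta - b)))

# Illustrative (hypothetical) item parameters
print(three_pl(theta=0.0, a=1.2, b=-0.5, c=0.2))
```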