50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues a long tradition of ongoing innovation and enhancement.

Learn more about the history of ERIC here (PDF).

Showing all 7 results
Peer reviewed
Engelhard, George, Jr.; Fincher, Melissa; Domaleski, Christopher S. – Applied Measurement in Education, 2011
This study examines the effects of two test administration accommodations on the mathematics performance of students within the context of a large-scale statewide assessment. The two test administration accommodations were resource guides and calculators. A stratified random sample of schools was selected to represent the demographic…
Descriptors: Testing Accommodations, Disabilities, High Stakes Tests, Program Effectiveness
Peer reviewed
Randall, Jennifer; Engelhard, George, Jr. – Applied Measurement in Education, 2010
The psychometric properties and multigroup measurement invariance of scores across subgroups, items, and persons on the "Reading for Meaning" items from the Georgia Criterion Referenced Competency Test (CRCT) were assessed in a sample of 778 seventh-grade students. Specifically, we sought to determine the extent to which score-based inferences on…
Descriptors: Testing Accommodations, Test Items, Learning Disabilities, Factor Analysis
Peer reviewed
Engelhard, George, Jr.; Davis, Melodee; Hansche, Linda – Applied Measurement in Education, 1999
Examined whether reviewers on item-review committees can accurately identify test items that exhibit a variety of flaws. Results from 39 reviewers of a 75-item test show fairly high accuracy rates overall, with statistically significant differences in judgmental accuracy among reviewers. (SLD)
Descriptors: Decision Making, Judges, Review (Reexamination), Test Construction
Peer reviewed
Garner, Mary; Engelhard, George, Jr. – Applied Measurement in Education, 1999
Gender differences in performance on multiple-choice and constructed-response items in mathematics were studied with 3592 11th graders taking a high school graduation examination. Results suggest that gender differences in mathematics may be linked to content and item format, thus supporting the usefulness of a many-faceted Rasch model for…
Descriptors: Constructed Response, Grade 11, Graduation Requirements, High School Students
Peer reviewed
Engelhard, George, Jr.; Anderson, David W. – Applied Measurement in Education, 1998
A new approach for examining the quality of judgments from standard-setting judges using a Binomial Trials Model (BTM) is presented and illustrated with 26 judges from the Georgia High School Graduation Test. Results suggest that the BTM provides information not available from other methods. (SLD)
Descriptors: Graduation Requirements, High Schools, Judges, Standard Setting (Scoring)
Peer reviewed
Engelhard, George, Jr. – Applied Measurement in Education, 1992
A Many-Faceted Rasch Model (FACETS) for measurement of writing ability is described, and its use in solving measurement problems in large-scale assessment is illustrated with a random sample of 1,000 students from Georgia's Eighth Grade Writing Test. It is a promising approach to assessment through written compositions. (SLD)
Descriptors: Educational Assessment, Essays, Evaluation Problems, Grade 8
Peer reviewed
Engelhard, George, Jr.; And Others – Applied Measurement in Education, 1990
Whether judges on bias review committees can identify test items that function differently for blacks and whites was studied using items from teacher certification tests, with 42 judges from 3 committees. Results indicate that agreement between judges and empirical indices is no greater than would be expected by chance. (SLD)
Descriptors: Blacks, Identification, Item Bias, Judges