Showing all 12 results
Peer reviewed
Bridgeman, Brent – Journal of Educational Measurement, 2012
In an article in the Winter 2011 issue of the "Journal of Educational Measurement", van der Linden, Jeon, and Ferrara suggested that "test takers should trust their initial instincts and retain their initial responses when they have the opportunity to review test items." They presented a complex IRT model that appeared to show that students would…
Descriptors: Item Response Theory, Test Wiseness, Multiple Choice Tests, Scores
Peer reviewed
Bridgeman, Brent; And Others – Journal of Educational Measurement, 1974
Significant differences in performance due to task definition were found only on verbal tasks, with test instructions yielding superior performance. (Author/RC)
Descriptors: Elementary Education, Games, Intelligence Tests, Task Performance
Peer reviewed
Bridgeman, Brent – Journal of Educational Measurement, 1980
Third grade students were identified as either fast/accurate or slow/accurate in response style on the Matching Familiar Figures Test and Thurstone's Spatial Relations Test. The fast and slow students performed similarly on untimed tasks, as predicted. Contrary to expectations, they also scored similarly on timed tests. (Author/RD)
Descriptors: Ability Grouping, Cognitive Style, Cognitive Tests, Conceptual Tempo
Peer reviewed
Bridgeman, Brent; Morgan, Rick; Wang, Ming-mei – Journal of Educational Measurement, 1997
Test results of 915 high school students taking a history examination with a choice of topics show that students were generally able to pick the topic on which they could get the highest score. Implications for fair scoring when topic choice is allowed are discussed. (SLD)
Descriptors: Essay Tests, High School Students, History, Performance Factors
Peer reviewed
Bridgeman, Brent; Lewis, Charles – Journal of Educational Measurement, 1996
A reanalysis of the data considered by H. Wainer and L. Steinberg (1992) shows that a more appropriate composite indicator made up of Scholastic Aptitude Test mathematics score and high school grade point average demonstrates minuscule gender differences for both calculus and precalculus courses. (SLD)
Descriptors: College Entrance Examinations, College Freshmen, Females, Grade Point Average
Peer reviewed
Gallagher, Ann; Bridgeman, Brent; Cahalan, Cara – Journal of Educational Measurement, 2002
Examined data from several national testing programs to determine whether the change from paper-based administration to computer-based tests influences group differences in performance. Results from four college and graduate entrance examinations and a professional licensing test show that African Americans and, to a lesser degree, Hispanics,…
Descriptors: Blacks, College Entrance Examinations, Computer Assisted Testing, Ethnic Groups
Peer reviewed
Bridgeman, Brent – Journal of Educational Measurement, 1992
Examinees in a regular administration of the quantitative portion of the Graduate Record Examination responded to particular items in a machine-scannable multiple-choice format. Volunteers (n=364) used a computer to answer open-ended counterparts of these items. Scores for both formats demonstrated similar correlational patterns. (SLD)
Descriptors: Answer Sheets, College Entrance Examinations, College Students, Comparative Testing
Peer reviewed
Bridgeman, Brent; Lewis, Charles – Journal of Educational Measurement, 1994
Examination of the correlation between multiple-choice and essay portions of the College Board Advanced Placement (AP) examinations with grades of first-year students from 32 colleges (largest sample=6,243) shows the best correlation for multiple choice tests for 2 examinations, with multiple choice and essay performing nearly equally for the…
Descriptors: Biology, College Freshmen, Correlation, English
Peer reviewed
Bridgeman, Brent; Rock, Donald A. – Journal of Educational Measurement, 1993
Exploratory and confirmatory factor analyses were used to explore relationships among existing item types and three new computer-administered item types for the analytical scale of the Graduate Record Examination General Test. Results with 349 students indicate which constructs the item types are measuring. (SLD)
Descriptors: College Entrance Examinations, College Students, Comparative Testing, Computer Assisted Testing
Peer reviewed
Bridgeman, Brent; And Others – Journal of Educational Measurement, 1995
Half of a sample of 11,457 college-bound juniors used a calculator on Scholastic Aptitude Test mathematics questions, while half did not. Both genders and three ethnic groups benefited about equally from calculator use. Students who routinely used calculators were relatively advantaged, but effects on individual test items varied. (SLD)
Descriptors: Asian Americans, Blacks, Calculators, College Bound Students
Peer reviewed
Bridgeman, Brent; Cline, Frederick – Journal of Educational Measurement, 2004
Time limits on some computer-adaptive tests (CATs) are such that many examinees have difficulty finishing, and some examinees may be administered tests with more time-consuming items than others. Results from over 100,000 examinees suggested that about half of the examinees must guess on the final six questions of the analytical section of the…
Descriptors: Guessing (Tests), Timed Tests, Adaptive Testing, Computer Assisted Testing
Peer reviewed
Bridgeman, Brent; Trapani, Catherine; Curley, Edward – Journal of Educational Measurement, 2004
The impact of allowing more time for each question on SAT I: Reasoning Test scores was estimated by embedding sections with a reduced number of questions into the standard 30-minute equating section of two national test administrations. Thus, for example, questions were deleted from a verbal section that contained 35 questions to produce forms…
Descriptors: College Entrance Examinations, Test Length, Scores