Showing all 11 results
Reshetar, Rosemary – College Board, 2012
On 9/13/12, the Workshop on Developing Assessments to Meet the Goals of the 2012 Framework for K-12 Science Education was held at the National Academies of Science. The workshop was organized and led by the NRC Committee on Developing Assessments of Science Proficiency in K-12 (co-chaired by James Pellegrino and Mark Wilson) and targeted to state…
Descriptors: Advanced Placement, Biology, Science Tests, Science Education
College Board, 2012
Looking beyond the right or wrong answer is imperative to the development of effective educational environments conducive to Pre-AP work in math. This presentation explores a system of evaluation in math that provides a personalized, student-reflective model correlated to consortia-based assessment. Using examples of students' work that includes…
Descriptors: Student Evaluation, Mathematics Instruction, Correlation, Educational Assessment
Kaliski, Pamela; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna; Plake, Barbara; Reshetar, Rosemary – College Board, 2012
The Many-Facet Rasch (MFR) Model is traditionally used to evaluate the quality of ratings on constructed-response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR Model by examining the quality of ratings obtained from a… (A standard formulation of the MFR Model is sketched after this entry.)
Descriptors: Advanced Placement Programs, Achievement Tests, Item Response Theory, Models
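A standard three-facet formulation of the MFR Model, for readers unfamiliar with it (this follows Linacre's many-facet rating scale model; the choice of facets and the parameterization are generic assumptions and may differ from the study's specification):

\[
\log\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = \theta_n - \delta_i - \lambda_j - \tau_k
\]

Here \(P_{nijk}\) is the probability that panelist \(j\) assigns rating category \(k\) rather than \(k-1\) to object \(n\) on item or task \(i\); \(\theta_n\) is the measure of the object being rated, \(\delta_i\) the difficulty of the item or task, \(\lambda_j\) the severity of the panelist, and \(\tau_k\) the threshold between adjacent rating categories. Facet-level estimates and fit statistics from this model are what allow the quality of ratings or judgments to be evaluated.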
Kaliski, Pamela; France, Megan; Huff, Kristen; Thurber, Allison – College Board, 2011
Developing a cognitive model of task performance is an important and often overlooked phase in assessment design; failing to establish such a model can threaten the validity of the inferences made from the scores produced by an assessment (e.g., Leighton, 2004). Conducting think aloud interviews (TAIs), where students think aloud while completing…
Descriptors: World History, Advanced Placement Programs, Achievement Tests, Protocol Analysis
Kaliski, Pamela; Huff, Kristen; Barry, Carol – College Board, 2011
For educational achievement tests that employ multiple-choice (MC) items and aim to reliably classify students into performance categories, it is critical to design MC items that are capable of discriminating student performance according to the stated achievement levels. This is accomplished, in part, by clearly understanding how item design…
Descriptors: Alignment (Education), Academic Achievement, Expertise, Evaluative Thinking
Kobrin, Jennifer L.; Kim, Rachel; Sackett, Paul – College Board, 2011
There is much debate on the merits and pitfalls of standardized tests for college admission, with questions regarding the format (multiple-choice versus constructed response), cognitive complexity, and content of these assessments (achievement versus aptitude) at the forefront of the discussion. This study addressed these questions by…
Descriptors: College Entrance Examinations, Mathematics Tests, Test Items, Predictive Validity
Antal, Judit; Melican, Gerald; Proctor, Thomas; Wiley, Andrew – College Board, 2010
Presented at the Annual Meeting of the National Council on Measurement in Education (NCME) in 2010. The research investigates the effect of applying the Sinharay & Holland (2007) midi-test idea for building anchor tests to an ongoing testing program with a series of test versions, comparing these results to the more… (A rough illustration of the midi-test idea is sketched after this entry.)
Descriptors: Test Items, Equated Scores, Test Construction, Simulation
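As a rough illustration of the midi-test idea (a sketch under assumptions, not the authors' procedure; the item pool, difficulty values, and selection rules below are hypothetical): Sinharay & Holland proposed that an anchor test whose items cluster around medium difficulty (a "miditest") can stand in for the conventional "minitest" anchor that mirrors the full form's spread of item difficulty.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical item pool: proportion-correct difficulties for a 60-item form.
pool_difficulty = rng.uniform(0.2, 0.9, size=60)

def select_minitest(difficulties, n_anchor):
    # "Mini" anchor: mirror the full form's difficulty spread by taking
    # evenly spaced items across the sorted difficulty range.
    order = np.argsort(difficulties)
    idx = np.linspace(0, len(order) - 1, n_anchor).round().astype(int)
    return order[idx]

def select_miditest(difficulties, n_anchor):
    # "Midi" anchor: the items closest to medium difficulty, i.e. a
    # deliberately narrowed difficulty spread.
    center = np.median(difficulties)
    return np.argsort(np.abs(difficulties - center))[:n_anchor]

mini = select_minitest(pool_difficulty, 12)
midi = select_miditest(pool_difficulty, 12)
print("minitest difficulty spread (SD):", round(pool_difficulty[mini].std(), 3))
print("miditest difficulty spread (SD):", round(pool_difficulty[midi].std(), 3))

The contrast shown is only the difference in anchor difficulty spread; judging how each anchor performs in equating, as the study does, requires carrying out the full equating comparison.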
Wiley, Andrew – College Board, 2009
Presented at the national conference of the American Educational Research Association (AERA) in 2009. This presentation discussed the development and implementation of the new SAT writing section.
Descriptors: Aptitude Tests, Writing Tests, Test Construction, Test Format
Plake, Barbara S.; Huff, Kristen; Reshetar, Rosemary – College Board, 2009
[Slides] presented at the Annual Meeting of the National Council on Measurement in Education (NCME) in San Diego, CA, in April 2009. This presentation discusses a methodology for directly connecting evidence-centered assessment design (ECD) to score interpretation and use through the development of achievement level descriptors.
Descriptors: Achievement, Classification, Evidence, Test Construction
Hendrickson, Amy; Huff, Kristen; Luecht, Ric – College Board, 2009
[Slides] presented at the Annual Meeting of the National Council on Measurement in Education (NCME) in San Diego, CA, in April 2009. This presentation describes how the vehicles for gathering student evidence (task models and test specifications) are developed.
Descriptors: Test Items, Test Construction, Evidence, Achievement
Hendrickson, Amy; Patterson, Brian; Melican, Gerald – College Board, 2008
Presented at the Annual Meeting of the National Council on Measurement in Education (NCME) in New York in March 2008. This presentation explores how different item weightings can affect the effective weights, validity coefficients, and test reliability of composite scores among test takers. (A worked sketch of effective weights and composite reliability follows this entry.)
Descriptors: Multiple Choice Tests, Test Format, Test Validity, Test Reliability
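A minimal sketch of the quantities named in the abstract, assuming hypothetical section statistics (the nominal weights, standard deviations, reliabilities, and intercorrelation below are invented for illustration): the effective weight of a section is its share of composite score variance, and the reliability of the weighted composite follows the usual formula under the assumption of uncorrelated section errors.

import numpy as np

# Hypothetical statistics for a two-section test (e.g., a multiple-choice
# section and an essay section).
w = np.array([1.0, 2.0])        # nominal (scoring) weights
s = np.array([8.0, 4.0])        # section score standard deviations
r = np.array([0.90, 0.75])      # section score reliabilities
R = np.array([[1.0, 0.6],       # section score intercorrelations
              [0.6, 1.0]])

cov = np.outer(s, s) * R        # section covariance matrix
composite_var = w @ cov @ w     # variance of the weighted composite

# Effective weight of each section: w_i * Cov(section_i, composite) divided
# by the composite variance; these proportions sum to 1.
effective_w = w * (cov @ w) / composite_var

# Reliability of the weighted composite, assuming uncorrelated errors:
# 1 - sum_i w_i^2 * s_i^2 * (1 - r_i) / Var(composite).
composite_rel = 1 - np.sum(w**2 * s**2 * (1 - r)) / composite_var

print("effective weights:", effective_w.round(3))
print("composite reliability:", round(composite_rel, 3))

Because effective weights depend on the section variances and intercorrelations as well as on the nominal weights, the two sets of weights generally differ, which is one reason changing the nominal item weighting can shift validity and reliability in non-obvious ways.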