Peer reviewed
ERIC Number: EJ916007
Record Type: Journal
Publication Date: 2011
Pages: 14
Abstractor: As Provided
Reference Count: 23
ISSN: ISSN-1530-5058
Evaluating the Bookmark Standard Setting Method: The Impact of Random Item Ordering
Davis-Becker, Susan L.; Buckendahl, Chad W.; Gerrow, Jack
International Journal of Testing, v11 n1 p24-37 2011
Throughout the world, cut scores are an important aspect of a high-stakes testing program because they are a key operational component of the interpretation of test scores. One method for setting standards that is prevalent in educational testing programs--the Bookmark method--is intended to be a less cognitively complex alternative to methods such as the modified Angoff (1971) approach. In this study, we explored that assertion for a licensure examination program where two independent panels applied the Bookmark method to recommend a cut score on its Written Exam. One panel initially made their ratings using an ordered item booklet (OIB) in which items were randomly ordered with respect to empirically estimated difficulty followed by judgments on a correctly ordered OIB. A second panel applied the Bookmark process with only the correctly ordered OIB. Results revealed striking similarities among judgments, calling into question panelists' ability to appropriately engage in the Bookmark method. In addition, under the random-ordering condition, approximately one-third of the panelists placed their bookmarks in a manner inconsistent with the new item difficulties. Implications of these results for the Bookmark standard setting method are also discussed. (Contains 2 tables and 1 figure.)
Routledge. Available from: Taylor & Francis, Ltd., 325 Chestnut Street, Suite 800, Philadelphia, PA 19106. Tel: 800-354-1420; Fax: 215-625-2940.
Publication Type: Journal Articles; Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A