ERIC Number: ED190597
Record Type: RIE
Publication Date: 1979-Apr
Pages: 17
Abstractor: N/A
Reference Count: 0
Bayesian and Empirical Bayes Approaches to Setting Passing Scores on Mastery Tests. Publication Series in Mastery Testing.
Huynh, Huynh; Saunders, Joseph C., III
The Bayesian approach to setting passing scores, as proposed by Swaminathan, Hambleton, and Algina, is compared with the empirical Bayes approach to the same problem derived from Huynh's decision-theoretic framework. Comparisons are based on simulated data that follow an approximate beta-binomial distribution and on real test results from the Comprehensive Tests of Basic Skills administered in the South Carolina Statewide Testing Program. Both procedures lead to identical or nearly identical passing scores as long as the test score distribution is reasonably symmetric or the minimum mastery (criterion) level is high. Larger discrepancies tend to occur when this level is low, especially when the distribution of test scores is concentrated at a few extreme scores or when the frequencies are irregular. In terms of mastery/nonmastery decisions, however, the two procedures result in the same classifications in practically all situations. The empirical Bayes procedure may be used for tests of any length, while the Bayesian procedure is recommended only for tests of eight or more items. Further, the empirical Bayes procedure can be generalized and applied to more complex testing situations with less difficulty than the Bayesian procedure. (Author/CP)
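The abstract refers to simulated scores following an approximate beta-binomial distribution and to classifying examinees as masters or nonmasters against a passing score. The following is only a minimal illustrative sketch of that simulation setup, not the authors' Bayesian or empirical Bayes procedure; the prior parameters, test length, and criterion level are assumed values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical settings (not taken from the paper)
n_items = 10          # test length
alpha, beta = 6, 2    # beta prior on true proportion-correct
n_examinees = 1000

# Beta-binomial simulation: draw each examinee's true proportion-correct
# from the beta prior, then draw an observed score given that proportion.
p = rng.beta(alpha, beta, size=n_examinees)
scores = rng.binomial(n_items, p)

# Classify examinees against a passing score c tied to a criterion
# (minimum mastery) level pi0 on the true-score scale; this naive rule
# stands in for whatever passing score a given procedure would set.
pi0 = 0.7
c = int(np.ceil(pi0 * n_items))
masters = scores >= c
print(f"passing score c = {c}, proportion classified as masters = {masters.mean():.3f}")
```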
Publication Type: Reports - Research; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: National Inst. of Education (DHEW), Washington, DC.
Authoring Institution: South Carolina Univ., Columbia. School of Education.
Identifiers: Binomial Error Model; Comprehensive Tests of Basic Skills; South Carolina Statewide Testing Program
Note: Paper presented at the joint Annual Meetings of the American Educational Research Association and the National Council on Measurement in Education (San Francisco, CA, April 8-12, 1979).