Peer reviewed
ERIC Number: EJ1081199
Record Type: Journal
Publication Date: 2015-Dec
Pages: 23
Abstractor: As Provided
Reference Count: 11
ISBN: N/A
ISSN: ISSN-0013-1644
A Ratio Test of Interrater Agreement with High Specificity
Cousineau, Denis; Laurencelle, Louis
Educational and Psychological Measurement, v75 n6 p979-1001 Dec 2015
Existing tests of interrater agreement have high statistical power but lack specificity. If the two raters' ratings are not random yet do not show agreement, current tests, some of which are based on Cohen's kappa, will often reject the null hypothesis, leading to the wrong conclusion that agreement is present. A new test of interrater agreement, applicable to nominal or ordinal categories, is presented. The test statistic can be expressed as a ratio (labeled Q[subscript A], ranging from 0 to infinity) or as a proportion (labeled P[subscript A], ranging from 0 to 1). This test weighs information supporting agreement against information supporting disagreement. The new test's effectiveness (power and specificity) is compared with that of five other tests of interrater agreement in a series of Monte Carlo simulations. The new test, although slightly less powerful than the other tests reviewed, is the only one sensitive to agreement alone. We also introduce confidence intervals on the proportion of agreement.
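The record does not give the paper's exact definition of Q[subscript A], so the following is only an illustrative sketch, assuming the simplest possible reading: a ratio of observed agreement counts to disagreement counts, with the companion proportion P = Q/(1 + Q) mapping [0, infinity) onto [0, 1). The function name and the example ratings are hypothetical.

```python
def agreement_ratio(ratings_a, ratings_b):
    """Illustrative sketch only (not the authors' Q_A statistic):
    return (q, p) where q = agreements / disagreements in [0, inf)
    and p = q / (1 + q) = agreements / total in [0, 1]."""
    if len(ratings_a) != len(ratings_b):
        raise ValueError("rating lists must be the same length")
    # Count items on which the two raters assign the same nominal category.
    agree = sum(a == b for a, b in zip(ratings_a, ratings_b))
    disagree = len(ratings_a) - agree
    q = agree / disagree if disagree else float("inf")
    p = agree / len(ratings_a)
    return q, p

# Hypothetical example: two raters classify six items into categories x, y, z.
q, p = agreement_ratio(["x", "y", "x", "z", "y", "x"],
                       ["x", "y", "z", "z", "y", "x"])
# Five agreements, one disagreement: q = 5.0, p = 5/6.
```

Note how the ratio form is unbounded (perfect agreement gives infinity) while the proportion form stays in [0, 1], matching the ranges stated in the abstract.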
SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665; e-mail: journals@sagepub.com; Web site: http://sagepub.com
Publication Type: Journal Articles; Reports - Evaluative
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A