ERIC Number: ED286173
Record Type: Non-Journal
Publication Date: 1987-Aug
Pages: 22
Abstractor: N/A
ISBN: N/A
ISSN: N/A
EISSN: N/A
Alternative Methods for Calculating Intercoder Reliability in Content Analysis: Kappa, Weighted Kappa and Agreement Charts Procedures.
Kang, Namjun
If content analysis is to satisfy the requirement of objectivity, its measures and procedures must be reliable. Reliability is usually measured as the proportion of all categories coded identically by different coders, and for such data to be empirically meaningful, a high degree of intercoder reliability must be demonstrated. Researchers in disciplines such as psychiatry, biometrics, epidemiology, and medical diagnostics have developed different procedures for their own specific problems; however, researchers in communication studies have not drawn on these parallel developments. The most widely used intercoder reliability measures in communication studies--Scott's Pi and the "index of crude agreement"--rest on questionable assumptions of marginal homogeneity. Dropping the homogeneity assumption, J. A. Cohen proposed the kappa statistic, a coefficient similar to Scott's Pi. Kappa and weighted kappa are restricted to cases with two coders, although later researchers extended the concept to more than two. S. I. Bangdiwala and Hope Bryan developed an "agreement chart" method that helps investigators pinpoint the source of poor intercoder agreement and whose graphical display aids in grasping and remembering relationships. An analysis of fictitious data further illustrates Bangdiwala and Bryan's methodology. Many of the reliability measures can be computed with a hand calculator (for items with only three categories) or with "canned" computer programs. (Charts and tables of data are included, and figures and references are attached.) (NKA)
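The abstract names percent agreement, Scott's Pi, and Cohen's kappa without reproducing their formulas. The following Python sketch (not taken from the paper; the function name and example data are hypothetical) illustrates the standard two-coder definitions and how the coefficients differ only in how expected agreement is estimated: Scott's Pi pools the two coders' marginals, while Cohen's kappa uses each coder's own marginals.

```python
from collections import Counter

def agreement_measures(codes_a, codes_b):
    """Percent agreement, Scott's Pi, and Cohen's kappa for two coders."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    categories = set(codes_a) | set(codes_b)

    # Observed proportion of agreement (the "index of crude agreement").
    p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n

    # Marginal category counts for each coder.
    counts_a = Counter(codes_a)
    counts_b = Counter(codes_b)

    # Scott's Pi: expected agreement from pooled marginals (assumes homogeneity).
    p_e_pi = sum(((counts_a[c] + counts_b[c]) / (2 * n)) ** 2 for c in categories)

    # Cohen's kappa: expected agreement from each coder's own marginals.
    p_e_kappa = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)

    pi = (p_o - p_e_pi) / (1 - p_e_pi)
    kappa = (p_o - p_e_kappa) / (1 - p_e_kappa)
    return p_o, pi, kappa

# Hypothetical example: 10 items coded into three categories by two coders.
coder_a = ["pos", "pos", "neg", "neu", "pos", "neg", "neg", "neu", "pos", "neg"]
coder_b = ["pos", "neg", "neg", "neu", "pos", "neg", "pos", "neu", "pos", "neg"]
print(agreement_measures(coder_a, coder_b))
```

Weighted kappa and the Bangdiwala-Bryan agreement chart discussed in the paper are not shown here; they extend the same contingency-table ingredients with disagreement weights and a graphical display, respectively.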
Publication Type: Reports - Evaluative; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A