ERIC Number: ED407421
Record Type: Non-Journal
Publication Date: 1996-Oct
Pages: 3
Abstractor: N/A
Reference Count: N/A
Examining a Coding Scheme for a Peer Tutoring Study: Agreement, Reliability, or Both?
Love, Angela; And Others
The development of a coding scheme to identify the function of each conversational turn within episodes of conflict in a peer tutoring setting is described, and the scheme, based on Cohen's kappa analysis, is presented. Although 15 codes were developed for the initial effort, 7 codes were finally used to classify each utterance as: (1) agreement; (2) disagreement; (3) fact; (4) request for information; (5) directive; (6) assertion of solution; and (7) transact. Cohen's kappa, which is a point-by-point analysis of agreement between coders that corrects for chance agreement, was used with each dyad. A reliability study was then conducted to evaluate each measure's reliability, generalizing across coders. Some codes had high reliability; others did not. Combining the study of agreement and reliability was useful in developing the coding scheme. Using Cohen's kappa helped researchers respond to the internal pressure of understanding the measures. Cronbach's intraclass correlation coefficient (Cronbach's alpha) helped researchers respond to the external pressure of conveying to others the accuracy (reliability) of the measures. (SLD)
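The kappa statistic the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's own analysis: the `cohens_kappa` function and the sample labels are hypothetical, and the seven code names are taken from the list above only for flavor.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Point-by-point agreement between two coders, corrected for the
    agreement expected by chance (illustrative helper, not from the paper)."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of turns both coders labeled identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: sum over codes of the product of each coder's
    # marginal rate of using that code.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a.keys() | freq_b.keys()) / n**2
    return (p_o - p_e) / (1 - p_e)

# Two coders labeling six conversational turns (hypothetical data).
a = ["agreement", "fact", "directive", "fact", "transact", "disagreement"]
b = ["agreement", "fact", "directive", "fact", "fact", "disagreement"]
print(round(cohens_kappa(a, b), 3))  # prints 0.778
```

Here the coders match on 5 of 6 turns (observed agreement 0.833), but because both use "fact" often, chance agreement is 0.25, so kappa lands at 7/9 rather than the raw 0.833, which is exactly the correction the abstract refers to.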
Publication Type: Reports - Evaluative; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A