The kappa coefficient (κ) is a statistic that measures inter-rater agreement for categorical items. It is generally considered a more robust measure than a simple percent-agreement calculation, since κ takes into account the agreement that would be expected to occur by chance. Cohen's kappa measures agreement between two raters only, whereas Fleiss' kappa is used when there are more than two raters. κ can take any value between -1 and +1. A value of +1 implies perfect agreement between the two raters, while a value of -1 implies perfect disagreement. If κ equals 0, the observed agreement is no better than what would be expected by chance, and any agreement or disagreement between the two observers is attributable to chance alone.
Key words: observer, agreement, due to chance
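As a brief sketch of how κ is commonly defined (the symbols $p_o$ for the observed proportion of agreement and $p_e$ for the proportion of agreement expected by chance are introduced here for illustration and do not appear in the abstract above):

\[ \kappa = \frac{p_o - p_e}{1 - p_e} \]

For example, if two raters agree on 80% of items ($p_o = 0.80$) and chance alone would produce 50% agreement ($p_e = 0.50$), then $\kappa = (0.80 - 0.50)/(1 - 0.50) = 0.60$.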