KNOWLEDGEBASE - ARTICLE #1126

How can I quantify agreement between two tests or observers using kappa?

Use this free GraphPad QuickCalc web calculator.

It computes kappa using equations from Fleiss, Statistical Methods for Rates and Proportions, third edition.
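
To see roughly what such a calculation involves, here is a minimal sketch in Python (our own illustration, not GraphPad's actual code). Kappa is defined as (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the proportion of agreement expected by chance alone; the function and variable names below are ours.

    def cohen_kappa(table):
        """Cohen's kappa from a square contingency table.
        table[i][j] = number of subjects placed in category i by the first
        test/observer and category j by the second."""
        n = sum(sum(row) for row in table)    # total number of subjects
        k = len(table)                        # number of categories
        row_totals = [sum(row) for row in table]
        col_totals = [sum(table[i][j] for i in range(k)) for j in range(k)]
        p_o = sum(table[i][i] for i in range(k)) / n                # observed agreement
        p_e = sum(row_totals[i] * col_totals[i] for i in range(k)) / n ** 2  # chance agreement
        return (p_o - p_e) / (1 - p_e)

    # Example: two observers classify 100 subjects as positive or negative.
    table = [[40, 10],
             [5, 45]]
    print(cohen_kappa(table))  # 0.7 -- "good" agreement by the scheme below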

Converting a kappa value to a descriptive adjective is inherently arbitrary, but we use the scheme from Altman:

Kappa          Strength of agreement
< 0            Worse than chance alone
0.00 - 0.20    Poor
0.21 - 0.40    Fair
0.41 - 0.60    Moderate
0.61 - 0.80    Good
0.81 - 0.99    Very good
1.00           Perfect
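
Applying this scheme in code is a simple lookup; the sketch below (the function name altman_label is ours, and the labels are taken directly from the table above) returns the adjective for a given kappa value.

    def altman_label(kappa):
        """Altman's descriptive adjective for a kappa value."""
        if kappa < 0:
            return "Worse than chance alone"
        if kappa == 1.0:
            return "Perfect"
        if kappa <= 0.20:
            return "Poor"
        if kappa <= 0.40:
            return "Fair"
        if kappa <= 0.60:
            return "Moderate"
        if kappa <= 0.80:
            return "Good"
        return "Very good"

    print(altman_label(0.7))  # Good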
