Kappa Calculator

Reliability is an important part of any research study. The Statistics Solutions Kappa Calculator assesses the inter-rater reliability of two raters on a target. In this simple-to-use calculator, you enter the frequency of agreements and disagreements between the raters, and the calculator computes your kappa coefficient. The calculator also gives references to help you qualitatively assess the level of agreement (see the sample write-up below).
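For readers who want to see the arithmetic behind the calculator, here is a minimal sketch of Cohen's kappa for two raters, computed from a 2x2 frequency table of agreements and disagreements. The function name and the example counts are illustrative, not taken from the calculator itself.

```python
def cohens_kappa(table):
    """Cohen's kappa for two raters from a 2x2 frequency table.

    table[i][j] = number of subjects rater 1 placed in category i
    and rater 2 placed in category j, so the diagonal holds the
    agreements and the off-diagonal cells hold the disagreements.
    """
    a, b = table[0]
    c, d = table[1]
    n = a + b + c + d
    p_o = (a + d) / n                      # observed agreement
    # Expected chance agreement, from each rater's marginal totals
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Example: the raters agree on 20 + 15 subjects and disagree on 5 + 10
print(round(cohens_kappa([[20, 5], [10, 15]]), 3))  # → 0.4
```

Kappa corrects the raw percentage of agreement (here 70%) for the agreement expected by chance alone, which is why it is preferred over simple percent agreement for reliability reporting.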

Click here to begin using the Kappa Calculator

The Kappa Calculator will open up in a separate window for you to use.

Sample Write-up

Thirty-four themes were identified. All of the kappa coefficients were evaluated using the guidelines outlined by Landis and Koch (1977), where the strength of the kappa coefficient is interpreted as follows: 0.01-0.20 slight; 0.21-0.40 fair; 0.41-0.60 moderate; 0.61-0.80 substantial; 0.81-1.00 almost perfect. Of the thirty-four themes, 11 had fair agreement, five had moderate agreement, four had substantial agreement, and four themes had almost perfect agreement.
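The Landis and Koch (1977) bands quoted above can be applied mechanically once the coefficients are computed. A small sketch, using the band boundaries from the write-up (the function name and the "poor" label for non-positive values are assumptions, not part of the calculator):

```python
def landis_koch_label(kappa):
    """Qualitative strength of agreement per Landis and Koch (1977)."""
    if kappa <= 0:
        return "poor"  # assumed label for non-positive coefficients
    # Upper bound of each band, in ascending order
    bands = [(0.20, "slight"), (0.40, "fair"), (0.60, "moderate"),
             (0.80, "substantial"), (1.00, "almost perfect")]
    for upper, label in bands:
        if kappa <= upper:
            return label
    return "almost perfect"  # kappa cannot exceed 1 in practice

print(landis_koch_label(0.45))  # → moderate
```

Labelling each of the thirty-four theme-level coefficients this way produces the counts reported in the sample write-up.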


Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33, 159-174.