Sources on Cohen's kappa and inter-rater agreement:

- Symmetry (MDPI): "An Empirical Comparative Assessment of Inter-Rater Agreement of Binary Outcomes and Multiple Raters"
- GitHub, aaronnorby/cohens-kappa-JS: JavaScript module for computing Cohen's kappa and Cohen's weighted kappa
- SpringerLink: "Generalized Cohen's Kappa: A Novel Inter-rater Reliability Metric for Non-mutually Exclusive Categories"
- Towards Data Science (Rosaria Silipo): "Cohen's Kappa: What it is, when to use it, and how to avoid its pitfalls"
- Semantic Scholar: "Cohen's quadratically weighted kappa is higher than linearly weighted kappa for tridiagonal agreement tables"
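The cohens-kappa-JS repository above computes Cohen's kappa for two raters. As a point of reference for what these sources measure, here is a minimal Python sketch of the standard formula kappa = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e the chance agreement from each rater's marginal label frequencies; this is the textbook definition, not that module's API.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters labeling the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement rate and p_e the agreement expected by chance,
    computed from each rater's marginal label frequencies.
    """
    if len(rater1) != len(rater2) or not rater1:
        raise ValueError("raters must label the same non-empty item set")
    n = len(rater1)
    # Observed agreement: fraction of items where the raters match.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: product of marginal frequencies, summed over labels.
    m1, m2 = Counter(rater1), Counter(rater2)
    p_e = sum((m1[c] / n) * (m2[c] / n) for c in set(m1) | set(m2))
    if p_e == 1.0:  # degenerate case: both raters use one identical label
        return 1.0
    return (p_o - p_e) / (1 - p_e)
```

For binary outcomes with ratings `[1, 1, 0, 1]` and `[1, 0, 0, 1]`, observed agreement is 0.75, chance agreement is 0.5, giving kappa = 0.5 (moderate agreement on the usual Landis–Koch scale).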