- [PDF] Explaining the unsuitability of the kappa coefficient in the assessment and comparison of the accuracy of thematic maps obtained by image classification | Semantic Scholar
- Multi-Class Metrics Made Simple, Part III: the Kappa Score (aka Cohen's Kappa Coefficient) | by Boaz Shmueli | Towards Data Science
- Comparison of Cohen's Kappa and Gwet's AC1 with a mass shooting classification index: A study of rater uncertainty
- Accuracy versus Kappa for different Classification Models to Predict Wine Quality | Azure AI Gallery
- Appraisal of kappa-based metrics and disagreement indices of accuracy assessment for parametric and nonparametric techniques used in LULC classification and change detection | SpringerLink
- Metrics to evaluate classification models with R codes: Confusion Matrix, Sensitivity, Specificity, Cohen's Kappa Value, McNemar's Test - Data Science Vidhya
- Cohen's Kappa and Classification Table Metrics 2.0: An ArcView 3.x Extension for Accuracy Assessment of Spatially Explicit Models: USGS Open-File Report 2005-1363 | Jenness, Jeff; Wynne, J. Judson