Guidelines of the minimum sample size requirements for Cohen's Kappa
Inter-Coder Agreement in One-to-Many Classification: Fuzzy Kappa | PLOS ONE
Measuring inter-rater reliability for nominal data – which coefficients and confidence intervals are appropriate? | BMC Medical Research Methodology
Cohen's kappa - Wikipedia
Why Cohen's Kappa should be avoided as performance measure in classification | PLOS ONE
An Empirical Comparative Assessment of Inter-Rater Agreement of Binary Outcomes and Multiple Raters | Symmetry
Measuring Inter-coder Agreement - ATLAS.ti
Fleiss' Kappa | Real Statistics Using Excel
Agree or Disagree? A Demonstration of An Alternative Statistic to Cohen's Kappa for Measuring the Extent and Reliability of Agreement
Interrater agreement statistics with skewed data: evaluation of alternatives to Cohen's kappa. | Semantic Scholar
Cohen's Kappa | Real Statistics Using Excel
Cohen's Kappa: Learn It, Use It, Judge It | KNIME
Multilevel classification, Cohen kappa and Krippendorff alpha - deepsense.ai
Analysis of inter-coder agreement - ATLAS.ti Help in English