
Estimation of symmetric disagreement using a uniform association model for ordinal agreement data

  • Original Paper
  • Published in AStA Advances in Statistical Analysis

Abstract

Cohen's kappa is probably the most widely used measure of agreement. In square contingency tables formed by two raters, the degree of agreement or disagreement is usually of primary interest. Modeling the agreement provides more information on its pattern than summarizing it with a kappa coefficient. In addition, the disagreement models proposed in the literature are designed for nominal scales. In this paper, the disagreement and uniform association models are combined into a new model for ordinal agreement data: a symmetric disagreement plus uniform association model that aims to separate the association from the disagreement. The proposed model is applied to real uterine cancer data.
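As a rough illustration of the kind of model the abstract describes, the sketch below fits a Poisson log-linear model that combines Goodman's uniform association term with a single symmetric disagreement indicator for adjacent categories. The 4x4 table of counts, the integer scores, and the adjacent-only disagreement term are assumptions made for illustration; the authors' exact parameterization cannot be inferred from the abstract alone.

    # A minimal sketch, not the authors' code: uniform association plus a
    # symmetric disagreement term for adjacent categories, fitted as a Poisson
    # log-linear model on a hypothetical 4x4 table of ordinal ratings.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical cell counts (rows: rater A, columns: rater B); illustration only.
    counts = np.array([[22,  5,  2,  1],
                       [ 6, 18,  7,  2],
                       [ 2,  8, 25,  6],
                       [ 1,  3,  9, 30]])
    R = counts.shape[0]

    cells = []
    for i in range(1, R + 1):              # integer scores u_i = i
        for j in range(1, R + 1):          # integer scores v_j = j
            cells.append({
                "count": counts[i - 1, j - 1],
                "row": i,                          # nominal main effect, rater A
                "col": j,                          # nominal main effect, rater B
                "uv": i * j,                       # uniform association term u_i * v_j
                "adj": int(abs(i - j) == 1),       # symmetric disagreement indicator
            })
    df = pd.DataFrame(cells)

    # log mu_ij = lambda + lambda_i^A + lambda_j^B + beta * u_i * v_j + delta * I(|i-j| = 1)
    fit = smf.glm("count ~ C(row) + C(col) + uv + adj",
                  data=df, family=sm.families.Poisson()).fit()
    print(fit.summary())

In this simplified variant, beta measures the ordinal association between the two raters and delta measures the excess (or deficit) of adjacent disagreements beyond what the association alone would predict.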


References

  • Agresti, A.: A model for agreement between ratings on an ordinal scale. Biometrics 44, 539–548 (1988)
  • Agresti, A.: Modeling ordered categorical data: recent advances and future challenges. Stat. Med. 18, 2191–2207 (1999)
  • Agresti, A.: Categorical Data Analysis. Wiley, New York (2002)
  • Cohen, J.: A coefficient of agreement for nominal scales. Educ. Psychol. Meas. 20, 37–46 (1960)
  • Cohen, J.: Weighted kappa: nominal scale agreement with provision for scaled disagreement or partial credit. Psychol. Bull. 70, 213–220 (1968)
  • Goodman, L.A.: Simple models for the analysis of association in cross-classifications having ordered categories. J. Am. Stat. Assoc. 74, 537–553 (1979)
  • Landis, J.R., Koch, G.G.: The measurement of observer agreement for categorical data. Biometrics 33, 159–174 (1977)
  • Osius, G.: Log-linear models for association and agreement in stratified square contingency tables. Comput. Stat. 12, 311–328 (1997)
  • Tanner, M.A., Young, M.A.: Modeling agreement among raters. J. Am. Stat. Assoc. 80, 175–180 (1985a)
  • Tanner, M.A., Young, M.A.: Modeling ordinal scale disagreement. Psychol. Bull. 98(2), 408–415 (1985b)
  • von Eye, A.: An alternative to Cohen's κ. Eur. Psychol. 11(1), 12–24 (2006)
  • von Eye, A., von Eye, M.: Can one use Cohen's kappa to examine disagreement? Methodology 1(4), 129–142 (2005)


Author information

Corresponding author

Correspondence to Serpil Aktaş.


Cite this article

Aktaş, S., Saraçbaşı, T. Estimation of symmetric disagreement using a uniform association model for ordinal agreement data. AStA Adv Stat Anal 93, 335–343 (2009). https://doi.org/10.1007/s10182-008-0083-0

