On the reliability of unitizing textual continua: Further developments

An Erratum to this article was published on 19 December 2015

Abstract

This paper builds on an agreement coefficient proposed by Krippendorff (Content analysis: an introduction to its methodology, 2013) for measuring the reliability of unitizing and coding continuous phenomena, for example, texts, videos, or sound recordings. It serves four purposes: It modifies Krippendorff’s definition, which turned out not to behave as expected when applied to more than two observers, coders, or annotators. It extends this reliability measure to a family of four coefficients able to assess the reliabilities of diverse properties of unitized continua. It adds a way to obtain the confidence intervals of these coefficients as well as the probability of failing to reach targeted reliability levels. And it describes and provides access to free software that calculates all values of this family of reliability coefficients.
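
All coefficients in this family share the general form of Krippendorff’s alpha, α = 1 − D_o/D_e: one minus the ratio of observed to expected disagreement. As a minimal sketch of that general form — computed here for nominal coding of pre-delineated units, deliberately simpler than the unitizing coefficients this paper develops, with an illustrative function name and data layout — the coincidence-matrix calculation looks like this:

```python
from collections import Counter
from itertools import permutations

def nominal_alpha(units):
    """Krippendorff's alpha for nominal data (simplified illustration).

    units: list of lists, each inner list holding the values that the
    coders assigned to one unit (missing values simply omitted).
    """
    # Build the coincidence matrix: each ordered pair of values within a
    # unit contributes 1/(m - 1), where m is the number of values in it.
    coincidences = Counter()
    for values in units:
        m = len(values)
        if m < 2:
            continue  # units coded by fewer than two coders are unpairable
        for c, k in permutations(values, 2):
            coincidences[(c, k)] += 1 / (m - 1)

    # Marginals n_c and total number of pairable values n.
    n_c = Counter()
    for (c, _), weight in coincidences.items():
        n_c[c] += weight
    n = sum(n_c.values())

    # Nominal metric: every mismatched pair counts as one disagreement.
    disagreement = sum(w for (c, k), w in coincidences.items() if c != k)
    expected = sum(n_c[c] * (n - n_c[c]) for c in n_c)
    if expected == 0:
        return 1.0  # alpha is strictly undefined with a single category
    return 1 - (n - 1) * disagreement / expected
```

Weighting each unit’s pairs by 1/(m − 1) makes every pairable value contribute equally to the coincidence matrix, regardless of how many coders judged the unit; the unitizing coefficients the paper develops replace the nominal metric with distance functions over segment boundaries.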


References

  1. Cappella, J.N., Turow, J., Jamieson, K.: Call-in political talk radio: background, content, audiences, portrayal in mainstream media (Report series no. 5). University of Pennsylvania, Annenberg Public Policy Center, Philadelphia (1996)

  2. Cohen, J.: A coefficient of agreement for nominal scales. Educ. Psychol. Meas. 20, 37–46 (1960)

  3. Cohen, J.: Weighted kappa: nominal scale agreement with provision for scaled disagreement or partial credit. Psychol. Bull. 70(4), 213–220 (1968)

  4. von Eye, A., Mun, E.Y.: Analyzing Rater Agreement: Manifest Variable Methods. Lawrence Erlbaum Associates, Mahwah (2006)

  5. Fleiss, J.L.: Measuring nominal scale agreement among many raters. Psychol. Bull. 76(5), 378–382 (1971)

  6. Guetzkow, H.: Unitizing and categorizing problems in coding qualitative data. J. Clin. Psychol. 6, 47–58 (1950)

  7. Krippendorff, K.: Content Analysis: An Introduction to Its Methodology, 3rd edn. Sage, Thousand Oaks (2013). Replacement of Section 12.4 to be introduced into its 2nd printing. http://www.asc.upenn.edu/usr/krippendorff/U-alpha.pdf. Accessed 14 May 2015

  8. Krippendorff, K.: Agreement and information in the reliability of coding. Commun. Methods Meas. 5(2), 93–112 (2011). http://repository.upenn.edu/asc_papers/278. Accessed 14 May 2015

  9. Krippendorff, K.: On the reliability of unitizing continuous data. In: Marsden, P.V. (ed.) Sociological Methodology, vol. 25, pp. 47–76. Blackwell, Cambridge (1995)

  10. Krippendorff, K.: Content Analysis: An Introduction to its Methodology. Sage, Beverly Hills (1980)

  11. Krippendorff, K.: Bivariate agreement coefficients for reliability of data, chapter 8. In: Borgatta, E.R., Bohrnstedt, G.W. (eds.) Sociological Methodology, vol. 2, pp. 139–150. Jossey-Bass, Inc., San Francisco (1970)

  12. Mathet, Y., Widlöcher, A., Fort, K., François, C., Galibert, O., Grouin, C., Kahn, J., Rosset, S., Zweigenbaum, P.: Manual corpus annotation: giving meaning to the evaluation metrics. In: Proceedings of COLING 2012. ACL, Mumbai. https://aclweb.org/anthology/C/C12/C12-2079.pdf (2012). Accessed 10 May 2015

  13. Scott, W.A.: Reliability of content analysis: the case of nominal scale coding. Public Opin. Quart. 19, 321–325 (1955)

  14. Widlöcher, A., Mathet, Y.: The Glozz platform: a corpus annotation and mining tool. In: Concolato, C., Schmitz, P. (eds.) DocEng ’12: ACM Symposium on Document Engineering, Paris, Sept. 4–7, 2012, pp. 171–180. ACM (2012). http://dl.acm.org/citation.cfm?doid=2361354.2361394. Accessed 10 May 2015

Author information

Corresponding author

Correspondence to Klaus Krippendorff.

Additional information

An erratum to this article can be found at http://dx.doi.org/10.1007/s11135-015-0289-7.

About this article

Cite this article

Krippendorff, K., Mathet, Y., Bouvry, S. et al. On the reliability of unitizing textual continua: Further developments. Qual Quant 50, 2347–2364 (2016). https://doi.org/10.1007/s11135-015-0266-1

Keywords

  • Krippendorff’s alpha
  • Agreement
  • Reliability
  • Segmentation
  • Unitizing
  • Coding
  • Annotating