
How We Code

Conference paper published in Advances in Quantitative Ethnography (ICQE 2021)

Abstract

Coding data—defining concepts and identifying where they occur in data—is a critical aspect of qualitative data analysis, and especially so in quantitative ethnography. Coding is a central process for creating meaning from data, and while much has been written about coding methods and theory, relatively little has been written about what constitutes best practices for fair and valid coding, what justifies those practices, and how to implement them. In this paper, our goal is not to address these issues comprehensively, but to provide guidelines for good coding practice and to highlight some of the issues and key questions that quantitative ethnographers and other researchers should consider when coding data.


Notes

  1. Because we conceive of this contribution as an overview of key issues in coding theory and practice for QE researchers, both novice and advanced, and due to the limitations of space, we do not include a comprehensive review of the literature on coding and qualitative discourse analysis. We will address this shortcoming in a future, expanded version of this paper.

  2. This example is drawn from the work of Briggs [1], but see also Shaffer [20].

  3. We do not ask: Are the Codes fair? There is no absolute or objective sense in which Codes can be fair or not. But we can ask what evidence we have to support a claim that they are.

  4. The ρ statistic can be applied to any measure of IRR when raters use binary codes; other statistics are available when raters use non-binary codes. (For a simple worked example of binary-code agreement, see the sketch below.)
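
To make this concrete, here is a minimal sketch (in Python, with hypothetical rating data) of Cohen's kappa [4], a common IRR measure for a single binary code; ρ itself, as implemented in the rhoR package [7], builds on agreement statistics like this one and is not reproduced here.

    # Minimal sketch: Cohen's kappa for two raters applying one binary code.
    # kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    # p_e is the agreement expected by chance from each rater's base rate.

    def cohens_kappa(rater1, rater2):
        n = len(rater1)
        # Proportion of segments the two raters coded identically.
        p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
        # Chance agreement from each rater's marginal probability of coding "1".
        p1, p2 = sum(rater1) / n, sum(rater2) / n
        p_e = p1 * p2 + (1 - p1) * (1 - p2)
        return 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)

    # Hypothetical codes (1 = code present, 0 = absent) for ten data segments.
    rater1 = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]
    rater2 = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
    print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")  # kappa = 0.60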

References

  1. Briggs, J.L.: Emotions have many faces: Inuit lessons. Anthropologica 42(2), 157–164 (2000)

  2. Charmaz, K.: Constructing Grounded Theory: A Practical Guide through Qualitative Analysis. Sage, Thousand Oaks (2006)

  3. Chouldechova, A., Roth, A.: The frontiers of fairness in machine learning. arXiv:1810.08810 (2018)

  4. Cohen, J.: A coefficient of agreement for nominal scales. Educ. Psychol. Meas. 20(1), 37–46 (1960)

  5. Eagan, B., et al.: The binary replicate test: determining the sensitivity of CSCL models to coding error. In: International Conference on Computer Supported Collaborative Learning (2019)

  6. Eagan, B.R., et al.: Can we rely on reliability? Testing the assumptions of inter-rater reliability. In: Smith, B.K., et al. (eds.) Making a Difference: Prioritizing Equity and Access in CSCL: 12th International Conference on Computer-Supported Collaborative Learning, pp. 529–532 (2017)

  7. Eagan, B.R., et al.: rhoR: Rho for inter rater reliability (2016)

  8. Gee, J.P.: An Introduction to Discourse Analysis: Theory and Method. Routledge, London (1999)

  9. Glesne, C.: Becoming Qualitative Researchers: An Introduction. Pearson, Boston (2015)

  10. Goodman, N.: Ways of Worldmaking. Hackett, Indianapolis (1978)

  11. Herrenkohl, L.R., Cornelius, L.: Investigating elementary students’ scientific and historical argumentation. J. Learn. Sci. 22(3), 413–461 (2013)

  12. Hutchby, I., Wooffitt, R.: Conversation Analysis. Polity (2008)

  13. Kaufman, S., et al.: Leakage in data mining: formulation, detection, and avoidance. ACM Trans. Knowl. Discov. Data (TKDD) 6(4), 1–21 (2012)

  14. Kurasaki, K.S.: Intercoder reliability for validating conclusions drawn from open-ended interview data. Field Methods 12(3), 179–194 (2000)

  15. Lombard, M., et al.: Content analysis in mass communication: assessment and reporting of intercoder reliability. Hum. Commun. Res. 28(4), 587–604 (2002)

  16. Lukács, G., Ansorge, U.: Information leakage in the response time-based concealed information test. Appl. Cogn. Psychol. 33(6), 1178–1196 (2019)

  17. Marquart, C.L., et al.: ncodeR: techniques for automated classifiers [R package] (2018)

  18. Mehrabi, N., et al.: A survey on bias and fairness in machine learning. arXiv:1908.09635 (2019)

  19. Saldaña, J.: The Coding Manual for Qualitative Researchers. SAGE Publications, Thousand Oaks (2015)

  20. Shaffer, D.W.: Quantitative Ethnography. Cathcart Press, Madison (2017)

  21. Shaffer, D.W., et al.: The nCoder: A Technique for Improving the Utility of Inter-Rater Reliability Statistics. Epistemic Games Group, Madison (2015)

  22. Siebert-Evenstone, A.: Personal communication (n.d.)

  23. Thornberg, R., Charmaz, K.: Grounded theory and theoretical coding. In: Flick, U. (ed.) The SAGE Handbook of Qualitative Data Analysis, pp. 153–169. SAGE Publications, London (2014)


Acknowledgements

This work was funded in part by the National Science Foundation (DRL-1661036, DRL-1713110), the Wisconsin Alumni Research Foundation, and the Office of the Vice Chancellor for Research and Graduate Education at the University of Wisconsin-Madison. The opinions, findings, and conclusions do not reflect the views of the funding agencies, cooperating institutions, or other individuals.

Author information


Correspondence to David Williamson Shaffer.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Shaffer, D.W., Ruis, A.R. (2021). How We Code. In: Ruis, A.R., Lee, S.B. (eds) Advances in Quantitative Ethnography. ICQE 2021. Communications in Computer and Information Science, vol 1312. Springer, Cham. https://doi.org/10.1007/978-3-030-67788-6_5

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-67788-6_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-67787-9

  • Online ISBN: 978-3-030-67788-6

  • eBook Packages: Computer Science, Computer Science (R0)
