
Addressing Social Bias in Information Retrieval

Jahna Otterbacher
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11018)

Abstract

Journalists and researchers alike have claimed that IR systems are socially biased, returning results to users that perpetuate gender and racial stereotypes. In this position paper, I argue that IR researchers, and in particular evaluation communities such as CLEF, can and should address such concerns. Using as a guide the Principles for Algorithmic Transparency and Accountability recently put forward by the Association for Computing Machinery, I provide examples of techniques for examining social biases in IR systems, and in particular in search engines.
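
As a concrete illustration of the kind of technique the abstract refers to, the sketch below audits the demographic makeup of a ranked result list, in the spirit of earlier audits of occupational image search. It is a minimal, hypothetical example rather than the paper's own method: the query, the crowd-sourced gender annotations, and the baseline figure are placeholders, and the rank-discounted measure simply down-weights lower ranks with a logarithmic discount to reflect that users attend mostly to top-ranked results.

# Minimal sketch (not from the paper) of a representation audit of search results.
# All names, labels, and numbers are illustrative placeholders.

import math
from typing import Sequence


def proportion(labels: Sequence[str], target: str = "woman") -> float:
    """Share of results annotated with the target label."""
    return sum(1 for label in labels if label == target) / len(labels)


def rank_discounted_proportion(labels: Sequence[str], target: str = "woman") -> float:
    """Share of results with the target label, weighting rank r by 1/log2(r + 1)."""
    weights = [1.0 / math.log2(r + 2) for r in range(len(labels))]
    hits = sum(w for w, label in zip(weights, labels) if label == target)
    return hits / sum(weights)


if __name__ == "__main__":
    # Hypothetical gender annotations (e.g. from crowd workers) for the
    # top-10 image results of the query "engineer".
    results = ["man", "man", "man", "woman", "man",
               "man", "woman", "man", "man", "man"]
    baseline = 0.15  # assumed labour-force share for the occupation (placeholder)

    p = proportion(results)
    dp = rank_discounted_proportion(results)
    print(f"share of women in top-{len(results)}: {p:.2f}")
    print(f"rank-discounted share: {dp:.2f}")
    print(f"difference from baseline: {p - baseline:+.2f}")

Comparing the (discounted) proportion against an external baseline, such as a labour-force statistic, is one simple way to make a claim of over- or under-representation in results concrete and auditable.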

Keywords

Social biases · Ranking algorithms · Crowdsourcing

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Open University of Cyprus, Nicosia, Cyprus
  2. Research Centre on Interactive Media, Smart Systems and Emerging Technologies, Nicosia, Cyprus
