
User Effect in Evaluating Personalized Information Retrieval Systems

Conference paper
Innovative Approaches for Learning and Knowledge Sharing (EC-TEL 2006)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 4227)

Included in the following conference series: European Conference on Technology Enhanced Learning (EC-TEL)

Abstract

Evaluation of personalized information retrieval (IR) systems is challenged by the user effect, which manifests as users’ inconsistency in relevance judgments, rankings, and relevance criteria usage. Two empirical studies evaluating a personalized search engine were performed. Two types of relative measures, computed with different mathematical formulae, were compared, and the ranking similarity and the randomness of relevance criteria usage were estimated. The results reveal some undesirable personalization effects. Implications for the future development and research of adaptive IR systems are discussed.
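The abstract mentions estimating ranking similarity and the randomness of relevance criteria usage, but the paper's own formulae are not reproduced on this page. The sketch below is purely illustrative: it assumes Kendall's tau as a stand-in ranking-similarity measure and normalized Shannon entropy as a stand-in randomness measure; the function names (`kendall_tau`, `criteria_usage_randomness`) and the toy data are hypothetical, not taken from the paper.

```python
# Illustrative sketch only: Kendall's tau (ranking similarity) and normalized
# Shannon entropy (randomness of relevance-criteria usage) are assumed here;
# the paper's actual measures may differ.
from collections import Counter
from itertools import combinations
from math import log2


def kendall_tau(rank_a, rank_b):
    """Kendall's tau between two rankings of the same items.

    rank_a, rank_b: lists of item ids in ranked order (best first).
    Returns a value in [-1, 1]; 1 = identical order, -1 = reversed order.
    """
    pos_a = {item: i for i, item in enumerate(rank_a)}
    pos_b = {item: i for i, item in enumerate(rank_b)}
    concordant = discordant = 0
    for x, y in combinations(rank_a, 2):
        # Pair is concordant if both rankings order x and y the same way.
        sign = (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y])
        if sign > 0:
            concordant += 1
        elif sign < 0:
            discordant += 1
    n_pairs = len(rank_a) * (len(rank_a) - 1) / 2
    return (concordant - discordant) / n_pairs


def criteria_usage_randomness(criteria_mentions):
    """Normalized Shannon entropy of relevance-criteria usage.

    criteria_mentions: list of criterion labels, one per judgment.
    Returns a value in [0, 1]; 1 = usage spread uniformly across criteria.
    """
    counts = Counter(criteria_mentions)
    total = sum(counts.values())
    entropy = -sum((c / total) * log2(c / total) for c in counts.values())
    return entropy / log2(len(counts)) if len(counts) > 1 else 0.0


if __name__ == "__main__":
    baseline = ["d1", "d2", "d3", "d4", "d5"]      # non-personalized ranking
    personalized = ["d2", "d1", "d3", "d5", "d4"]  # personalized ranking
    print("ranking similarity:", kendall_tau(baseline, personalized))

    criteria = ["topicality", "topicality", "novelty", "authority", "topicality"]
    print("criteria-usage randomness:", criteria_usage_randomness(criteria))
```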





Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Law, E.L.-C., Klobučar, T., Pipan, M. (2006). User Effect in Evaluating Personalized Information Retrieval Systems. In: Nejdl, W., Tochtermann, K. (eds) Innovative Approaches for Learning and Knowledge Sharing. EC-TEL 2006. Lecture Notes in Computer Science, vol 4227. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11876663_21


  • DOI: https://doi.org/10.1007/11876663_21

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-45777-0

  • Online ISBN: 978-3-540-46234-7

  • eBook Packages: Computer Science, Computer Science (R0)
