Selecting Success Criteria: Experiences with an Academic Library Catalogue

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 8138)

Abstract

Multiple methods exist for evaluating search systems, ranging from user-oriented approaches to those focused on measuring system performance. When preparing an evaluation, key questions include: (i) why the evaluation is being conducted, (ii) what should be evaluated, and (iii) how the evaluation should be conducted. In recent years, attention has increasingly turned to the end users of search systems and to understanding what they view as ‘success’. In this paper we consider what to evaluate; in particular, which criteria users of search systems consider most important and whether this varies with user characteristics. Drawing on our experience of evaluating an academic library catalogue, we gathered input from end users on the perceived importance of different evaluation criteria prior to conducting an evaluation. We analyse the results to show which criteria users value most, together with the inter-relationships between them. Our results highlight the necessity of conducting multiple forms of evaluation to ensure that search systems are deemed successful by their users.




Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Clough, P., Goodale, P. (2013). Selecting Success Criteria: Experiences with an Academic Library Catalogue. In: Forner, P., Müller, H., Paredes, R., Rosso, P., Stein, B. (eds) Information Access Evaluation. Multilinguality, Multimodality, and Visualization. CLEF 2013. Lecture Notes in Computer Science, vol 8138. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40802-1_7


  • Print ISBN: 978-3-642-40801-4

  • Online ISBN: 978-3-642-40802-1
