Overview of the CLEF 2016 Social Book Search Lab

  • Marijn Koolen
  • Toine Bogers
  • Maria Gäde
  • Mark Hall
  • Iris Hendrickx
  • Hugo Huurdeman
  • Jaap Kamps
  • Mette Skov
  • Suzan Verberne
  • David Walsh
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9822)

Abstract

The Social Book Search (SBS) Lab investigates book search in scenarios where users search with more than just a query and look for more than objective metadata. Real-world information needs are generally complex, yet almost all research focuses instead on either relatively simple query-based search or on profile-based recommendation. The goal of the lab is to research and develop techniques that support users in complex book search tasks. The SBS Lab has three tracks. The aim of the Suggestion Track is to develop test collections for evaluating the ranking effectiveness of book retrieval and recommender systems. The aim of the Interactive Track is to develop user interfaces that support users through each stage of complex search tasks and to investigate how users exploit professional metadata and user-generated content. The Mining Track focuses on detecting and linking book titles in online book discussion forums, as well as detecting book search requests in forum posts for automatic book recommendation.

Keywords

User Profile, Book Title, Book Record, Word Embedding, Forum Post


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Marijn Koolen (1, 2)
  • Toine Bogers (3)
  • Maria Gäde (4)
  • Mark Hall (5)
  • Iris Hendrickx (6)
  • Hugo Huurdeman (1)
  • Jaap Kamps (1)
  • Mette Skov (7)
  • Suzan Verberne (6)
  • David Walsh (5)

  1. University of Amsterdam, Amsterdam, The Netherlands
  2. Netherlands Institute for Sound and Vision, Hilversum, The Netherlands
  3. Aalborg University Copenhagen, Copenhagen, Denmark
  4. Humboldt University Berlin, Berlin, Germany
  5. Edge Hill University, Ormskirk, UK
  6. CLS/CLST, Radboud University, Nijmegen, The Netherlands
  7. Aalborg University, Aalborg, Denmark