
CLEF-IP 2009: Retrieval Experiments in the Intellectual Property Domain

  • Conference paper
Multilingual Information Access Evaluation I. Text Retrieval Experiments (CLEF 2009)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 6241)

Included in the following conference series: Cross-Language Evaluation Forum (CLEF)

Abstract

The CLEF-IP track ran for the first time within CLEF 2009. The purpose of the track was twofold: to encourage and facilitate research in the area of patent retrieval by providing a large, clean data set for experimentation, and to create a large test collection of patents in the three main European languages for the evaluation of cross-lingual information access. The track focused on the task of prior art search. The 15 European teams who participated deployed a rich range of information retrieval techniques, adapting them to this new domain and task. A large-scale test collection for evaluation purposes was created by exploiting patent citations.
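Because the relevance judgements behind that collection come from citation data rather than manual assessment, the general idea can be illustrated with a short sketch. The Python snippet below is only a rough illustration under assumed data, not the track's actual tooling: it turns hypothetical citation and patent-family tables into TREC-style qrels lines, on the assumption that patents cited against a topic patent (and, in one plausible variant, their family members) are treated as relevant. All identifiers in it are invented for the example.

    # Illustrative sketch only: deriving relevance judgements for prior-art
    # topics from patent citations. Data and identifiers are hypothetical.
    from collections import defaultdict

    # Hypothetical citation data: topic patent -> patents cited against it.
    citations = {
        "EP-1000001": ["EP-0900001", "EP-0900002"],
        "EP-1000002": ["EP-0900002", "EP-0900003"],
    }

    # Hypothetical patent-family table: family members describe the same
    # invention, so relevance can plausibly be propagated to them.
    family_members = {
        "EP-0900001": ["EP-0900001", "US-0800001"],
        "EP-0900002": ["EP-0900002"],
        "EP-0900003": ["EP-0900003", "WO-0700003"],
    }

    def build_qrels(citations, family_members):
        """Return topic -> set of relevant document ids."""
        qrels = defaultdict(set)
        for topic, cited in citations.items():
            for patent in cited:
                # Mark the cited patent and all of its family members relevant.
                for member in family_members.get(patent, [patent]):
                    qrels[topic].add(member)
        return qrels

    if __name__ == "__main__":
        # Print TREC-style qrels: topic id, iteration, document id, relevance.
        for topic, relevant in sorted(build_qrels(citations, family_members).items()):
            for doc in sorted(relevant):
                print(f"{topic} 0 {doc} 1")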




Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Roda, G., Tait, J., Piroi, F., Zenz, V. (2010). CLEF-IP 2009: Retrieval Experiments in the Intellectual Property Domain. In: Peters, C., et al. Multilingual Information Access Evaluation I. Text Retrieval Experiments. CLEF 2009. Lecture Notes in Computer Science, vol 6241. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-15754-7_47


  • DOI: https://doi.org/10.1007/978-3-642-15754-7_47

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-15753-0

  • Online ISBN: 978-3-642-15754-7

  • eBook Packages: Computer Science, Computer Science (R0)
