LifeCLEF 2019: Biodiversity Identification and Prediction Challenges

  • Alexis Joly
  • Hervé Goëau
  • Christophe Botella
  • Stefan Kahl
  • Marion Poupard
  • Maximillien Servajean
  • Hervé Glotin
  • Pierre Bonnet
  • Willem-Pier Vellinga
  • Robert Planqué
  • Jan Schlüter
  • Fabian-Robert Stöter
  • Henning Müller
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11438)

Abstract

Building accurate knowledge of the identity, the geographic distribution and the evolution of living species is essential for the sustainable development of humanity, as well as for biodiversity conservation. However, the burden of routinely identifying plants and animals in the field strongly hampers the aggregation of new data and knowledge. Identifying and naming living plants or animals is nearly impossible for the general public and often a difficult task even for professionals and naturalists. Bridging this gap is a key challenge towards enabling effective biodiversity information retrieval systems. The LifeCLEF evaluation campaign, presented in this paper, has aimed to boost and evaluate advances in this domain since 2011. In particular, the 2019 edition proposes three data-oriented challenges related to the identification and prediction of biodiversity: (i) an image-based plant identification challenge, (ii) a bird sound identification challenge and (iii) a location-based species prediction challenge based on spatial occurrence data and environmental tensors.
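The "environmental tensors" mentioned for challenge (iii) can be pictured as multi-channel patches of environmental rasters cropped around each species occurrence point. The sketch below illustrates this idea only; the function name, the 0.1-degree global grid, the layer names and the 64-pixel patch size are illustrative assumptions, not the actual GeoLifeCLEF specification.

```python
import numpy as np

def extract_env_tensor(rasters, lat, lon, patch=64):
    """Crop a (n_layers, patch, patch) environmental tensor centred on an
    occurrence point. `rasters` is a dict of 2-D arrays sharing one grid.
    The grid layout and names here are hypothetical."""
    # Toy mapping from (lat, lon) to pixel indices on a global 0.1-degree grid
    # with row 0 at latitude +90 and column 0 at longitude -180.
    row = int((90.0 - lat) * 10)
    col = int((lon + 180.0) * 10)
    half = patch // 2
    layers = []
    for name in sorted(rasters):  # sort for a deterministic channel order
        r = rasters[name]
        layers.append(r[row - half:row + half, col - half:col + half])
    return np.stack(layers)  # one channel per environmental variable

# Synthetic example with two fake environmental layers on an 1800x3600 grid.
grid = {"temperature": np.random.rand(1800, 3600),
        "precipitation": np.random.rand(1800, 3600)}
t = extract_env_tensor(grid, lat=43.6, lon=3.9)  # near Montpellier
print(t.shape)  # (2, 64, 64)
```

A species distribution model can then consume such tensors the way an image classifier consumes RGB images, with one channel per environmental variable instead of one per colour.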

Keywords

Biodiversity informatics · Machine learning · Species identification · Species prediction · Plant identification · Bird identification · Species distribution model


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Alexis Joly (1)
  • Hervé Goëau (2) (email author)
  • Christophe Botella (1, 3)
  • Stefan Kahl (7)
  • Marion Poupard (4)
  • Maximillien Servajean (8)
  • Hervé Glotin (4)
  • Pierre Bonnet (2)
  • Willem-Pier Vellinga (5)
  • Robert Planqué (5)
  • Jan Schlüter (4)
  • Fabian-Robert Stöter (1)
  • Henning Müller (6)

  1. Inria, LIRMM, Montpellier, France
  2. CIRAD, UMR AMAP, Montpellier, France
  3. INRA, UMR AMAP, Montpellier, France
  4. AMU, Univ. Toulon, CNRS, ENSAM, LSIS UMR 7296, IUF, Toulon, France
  5. Xeno-canto Foundation, Groningen, The Netherlands
  6. HES-SO, Sierre, Switzerland
  7. Chemnitz University of Technology, Chemnitz, Germany
  8. LIRMM, Université Paul Valéry, University of Montpellier, CNRS, Montpellier, France