Gaussian Process Density Counting from Weak Supervision

  • Matthias von Borstel
  • Melih Kandemir
  • Philip Schmidt
  • Madhavi K. Rao
  • Kumar Rajamani
  • Fred A. Hamprecht
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9905)


As a novel learning setup, we introduce learning to count objects within an image from region-level count information alone. This level of supervision is weaker than that of earlier approaches, which require segmenting, drawing bounding boxes around, or placing dots on the centroids of all objects in the training images. We devise a weakly supervised kernel learner that achieves higher count accuracies than previous counting models. We achieve this by placing a Gaussian process prior on a latent function whose square is the count density. We impose non-negativity and smooth the GP response as an intermediate step in model inference. We illustrate the effectiveness of our model on two benchmark applications, (i) synthetic cell and (ii) pedestrian counting, and one novel application, (iii) erythrocyte counting in blood samples of malaria patients.
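The core modelling idea in the abstract can be illustrated with a minimal, hypothetical NumPy sketch. This is not the paper's inference scheme: it replaces the GP prior and variational inference with fixed RBF basis functions and plain gradient descent, and uses a toy 1-D "image". It does, however, keep the two essential ingredients: only region-level counts are observed (no dots, boxes, or segments), and the count density is the square of a latent function, so it is non-negative by construction. All names, sizes, and hyperparameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "image": 100 pixel locations; 5 objects placed at fixed spots.
# The dot locations are used ONLY to generate region-level counts; the
# learner never sees them, mirroring the weak supervision setup.
X = np.linspace(0.0, 1.0, 100)[:, None]
true_density = np.zeros(100)
true_density[[20, 21, 60, 61, 62]] = 1.0

# Weak labels: four regions, each annotated with its object count only.
regions = np.array_split(np.arange(100), 4)
counts = np.array([true_density[r].sum() for r in regions])

# GP-flavoured features: an RBF kernel evaluated against a grid of
# inducing points (a crude stand-in for sparse-GP machinery).
def rbf(A, B, lengthscale=0.05):
    return np.exp(-0.5 * (A - B.T) ** 2 / lengthscale ** 2)

Z = np.linspace(0.0, 1.0, 15)[:, None]
K = rbf(X, Z)                      # (100 pixels, 15 basis functions)

# Latent function f = K @ w; density = f**2 is non-negative by design.
w = rng.normal(scale=0.1, size=15)

def region_loss(w):
    dens = (K @ w) ** 2
    return sum((dens[r].sum() - c) ** 2 for r, c in zip(regions, counts))

init_loss = region_loss(w)
for _ in range(3000):
    f = K @ w
    grad = np.zeros_like(w)
    for r, c in zip(regions, counts):
        # d/dw [(sum_i f_i^2 - c)^2] = 4 * residual * sum_i f_i * K_i
        resid = (f[r] ** 2).sum() - c
        grad += 4.0 * resid * (f[r][:, None] * K[r]).sum(axis=0)
    w -= 1e-4 * grad               # small step keeps plain GD stable

final_loss = region_loss(w)
dens = (K @ w) ** 2
print("observed region counts :", counts)
print("predicted region counts:", [round(dens[r].sum(), 2) for r in regions])
```

After fitting, the squared latent function acts as a density whose integral over each annotated region approximates that region's count, even though no per-object annotation was ever provided.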


Random Forest · Gaussian Process · Latent Function · Count Density · Multiple Instance Learning



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Matthias von Borstel (1)
  • Melih Kandemir (1)
  • Philip Schmidt (1)
  • Madhavi K. Rao (2)
  • Kumar Rajamani (2)
  • Fred A. Hamprecht (1)
  1. HCI, Heidelberg University, Heidelberg, Germany
  2. Robert Bosch Engineering, Bangalore, India
