Applied Intelligence, Volume 46, Issue 4, pp 865–875

A maximum partial entropy-based method for multiple-instance concept learning

  • Tao Xu
  • Iker Gondra
  • David K.Y. Chiu
Article

Abstract

Multiple instance (MI) learning aims at identifying an underlying concept from collectively labeled data. A training sample consists of a set, known as a bag, of unlabeled instances. A bag is labeled positive if at least one of its instances is positive, and negative otherwise. Given such training samples, the goal is to learn a description of the instance(s) common to the positive bags, i.e., the underlying concept responsible for the positive labels. In this work, we introduce a learning scheme for MI concept learning based on the notion of partial entropy. Partial entropy accentuates intra-class information by measuring the contribution of the positive class to the total entropy; maximizing it equalizes the likelihoods of the intra-class outcomes of the positive class, which essentially reflects the intended concept. When coupled with a distance-based probabilistic model for MI learning, this is equivalent to seeking a concept estimate that equalizes the intra-class distances to the positive bags while restraining the distance to the negative bags. The optimization produces patterns that are similar to at least one instance from each positive bag and dissimilar from all instances in negative bags; these patterns correspond to prototypical concepts. Maximum partial entropy is conceptually simple, and experimental results on several MI datasets demonstrate its effectiveness in learning an explicit representation of the concept as well as its competitive performance on classification tasks.
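
The idea in the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm, only a toy construction under assumed modeling choices: one likelihood outcome per positive bag (decaying with the squared distance from a candidate concept point to the bag's nearest instance), plus one outcome for the negative instances; partial entropy is then the positive outcomes' contribution to the total entropy. Maximizing it over a 1-D grid drives the positive-bag probabilities toward equality while shrinking the negative outcome, so the estimate lands near the instance the positive bags share. All names (`partial_entropy`, `objective`) and the data are hypothetical.

```python
import math

def partial_entropy(probs, subset):
    # Contribution of the outcomes in `subset` to the total Shannon entropy.
    return -sum(probs[i] * math.log(probs[i]) for i in subset if probs[i] > 0)

def objective(t, pos_bags, neg_instances):
    # Distance-based likelihoods: one outcome per positive bag, taken at the
    # instance nearest to the candidate concept t, plus one negative outcome.
    pos = [math.exp(-min((t - x) ** 2 for x in bag)) for bag in pos_bags]
    neg = math.exp(-min((t - x) ** 2 for x in neg_instances))
    total = sum(pos) + neg
    probs = [p / total for p in pos] + [neg / total]
    # Partial entropy over the positive outcomes only: maximal when the
    # positive-bag probabilities are equal and the negative outcome vanishes.
    return partial_entropy(probs, range(len(pos)))

# Toy 1-D data: every positive bag contains an instance near 1.0.
pos_bags = [[0.9, 5.0], [1.1, 8.0], [1.0, -3.0]]
neg_instances = [4.0, 6.0]

grid = [i / 20.0 for i in range(-100, 201)]  # candidate concepts in [-5, 10]
best_t = max(grid, key=lambda t: objective(t, pos_bags, neg_instances))
print(best_t)  # concept estimate near the shared instance at 1.0
```

With equal probabilities over the three positive bags and a negligible negative outcome, the objective approaches its ceiling of log 3, which is why the search settles near the common instance rather than near instances that appear in only one bag.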

Keywords

Instance-based learning · Multiple instance learning · Machine learning · Weakly supervised learning · Data mining · Concept learning · Partial entropy · Knowledge extraction


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. School of Computer Science, University of Guelph, Guelph, Canada
  2. Mathematics, Statistics, Computer Science Department, St. Francis Xavier University, Antigonish, Canada
