
Multiple-Instance Learning with Evolutionary Instance Selection

  • Yongshan Zhang
  • Jia Wu (corresponding author)
  • Chuan Zhou
  • Peng Zhang
  • Zhihua Cai
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9642)

Abstract

Multiple-Instance Learning (MIL) represents a class of supervised learning tasks in which training examples are bags of instances and labels are available only at the bag level. To resolve the ambiguity of instance labels, instance-selection-based MIL models were proposed to convert bag learning into traditional vector learning. However, existing MIL instance selection approaches are all restricted to the instances that appear inside the bags; potentially informative instances in the original instance space that do not occur in any bag are therefore discarded. In this paper, we propose a novel learning method, MILEIS (Multiple-Instance Learning with Evolutionary Instance Selection), to adaptively determine the informative instances used for feature mapping. Its evolutionary search mechanism, comprising instance initialization, mutation, and crossover, allows MILEIS to adjust itself to the data without an explicit specification of the functional or distributional form of the underlying model. In doing so, MILEIS can also exploit newly created informative instances to support accurate feature mapping. Experiments and comparisons on real-world applications demonstrate the effectiveness of the proposed method.
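The idea summarized above can be illustrated with a short sketch: bags are embedded into fixed-length vectors by measuring their similarity to a small set of prototype instances, and those prototypes are refined by an evolutionary (differential-evolution-style) loop of mutation, crossover, and greedy selection, so the prototypes are free to drift away from the instances actually observed inside the bags. The Python sketch below is only a minimal illustration of this idea under our own assumptions, not the authors' exact algorithm; the names (embed_bags, evolve_prototypes, toy_fitness), the Gaussian similarity, the nearest-centroid fitness, and all parameter values are hypothetical.

    import numpy as np

    def embed_bags(bags, prototypes, gamma=1.0):
        # Map each bag to a vector: for every prototype instance, take the
        # maximum Gaussian similarity over the instances in the bag.
        features = np.zeros((len(bags), len(prototypes)))
        for i, bag in enumerate(bags):
            d2 = ((bag[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
            features[i] = np.exp(-gamma * d2).max(axis=0)
        return features

    def evolve_prototypes(bags, labels, fitness, n_proto=10, n_iter=50,
                          f=0.5, cr=0.9, seed=0):
        # Differential-evolution-style search over prototype instances:
        # initialize from instances sampled out of the training bags, then
        # refine them by mutation and crossover, keeping only changes that
        # improve the fitness of the resulting bag-level embedding.
        rng = np.random.default_rng(seed)
        all_inst = np.vstack(bags)
        protos = all_inst[rng.choice(len(all_inst), n_proto, replace=False)].copy()
        best_fit = fitness(embed_bags(bags, protos), labels)
        for _ in range(n_iter):
            for k in range(n_proto):
                a, b, c = protos[rng.choice(n_proto, 3, replace=False)]
                mutant = a + f * (b - c)                  # mutation
                mask = rng.random(protos.shape[1]) < cr   # crossover
                cand = protos.copy()
                cand[k] = np.where(mask, mutant, protos[k])
                fit = fitness(embed_bags(bags, cand), labels)
                if fit > best_fit:                        # greedy selection
                    protos, best_fit = cand, fit
        return protos

    def toy_fitness(X, y):
        # Toy surrogate fitness: training accuracy of a nearest-centroid rule
        # on the embedded bags (a real system would use a held-out estimate).
        mu0, mu1 = X[y == 0].mean(0), X[y == 1].mean(0)
        pred = (np.linalg.norm(X - mu1, axis=1) <
                np.linalg.norm(X - mu0, axis=1)).astype(int)
        return (pred == y).mean()

    # Usage sketch: bags is a list of (n_i, d) arrays, labels a 0/1 int array.
    # protos = evolve_prototypes(bags, labels, toy_fitness)
    # X = embed_bags(bags, protos)   # feed X to any standard vector classifier

Because mutation combines existing instances with scaled difference vectors, the prototypes produced this way need not coincide with any instance inside the bags, which is the property the abstract emphasizes.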

Keywords

Multiple-instance learning · Instance selection · Feature mapping · Evolutionary machine learning · Classification

Acknowledgments

This work was supported in part by the National Natural Science Foundation of China (No. 61403351), the China Scholarship Council Foundation (No. 201206410056), the key project of the Natural Science Foundation of Hubei Province, China (No. 2013CFA004), the Australian Research Council (ARC) Discovery Projects (No. DP140100545), the Self-Determined and Innovative Research Funds of CUG (No. 1610491T05), and the National College Students' Innovation and Entrepreneurial Training Plan of CUG (Wuhan) (No. 201410491083).

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Yongshan Zhang (1)
  • Jia Wu (2, corresponding author)
  • Chuan Zhou (3)
  • Peng Zhang (2)
  • Zhihua Cai (1)
  1. Department of Computer Science, China University of Geosciences, Wuhan, China
  2. The Centre for Quantum Computation and Intelligent Systems (QCIS), University of Technology Sydney, Sydney, Australia
  3. Institute of Information Engineering, Chinese Academy of Sciences, Beijing, China