
Reducing Dimensionality in Multiple Instance Learning with a Filter Method

  • Amelia Zafra
  • Mykola Pechenizkiy
  • Sebastián Ventura
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6077)

Abstract

In this article, we describe a feature selection algorithm that automatically finds relevant features for multiple instance learning. Multiple instance learning is considered an extension of traditional supervised learning in which each example (bag) is made up of several instances and no labels are available for the individual instances. In this scenario, traditional supervised learning cannot be applied directly, and new techniques must be designed. Our approach is based on the principles of the well-known Relief-F algorithm, which is extended to select features in this new learning paradigm by modifying the distance, the difference function, and the computation of the feature weights. Four variants of this algorithm are proposed and their performance is evaluated in this learning framework. Experimental results using a representative number of different algorithms show that predictive accuracy improves significantly when a multiple instance learning classifier is learnt on the reduced data set.
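
To illustrate the general idea of adapting Relief-F to bags of instances, the Python sketch below uses a minimal-Hausdorff bag distance and a per-feature difference taken between the closest instances of two bags. These particular choices, and the names min_hausdorff, feature_diff and mi_relief_f, are illustrative assumptions; they do not correspond to any specific one of the four variants proposed in the paper.

    # Minimal sketch of a Relief-F-style filter adapted to multiple instance data.
    # Assumes features are pre-scaled to [0, 1] so per-feature differences are comparable.
    import numpy as np

    def min_hausdorff(bag_a, bag_b):
        # Minimal Hausdorff distance: smallest Euclidean distance between any
        # instance of bag_a and any instance of bag_b (bags are arrays of shape [n_i, d]).
        d = np.linalg.norm(bag_a[:, None, :] - bag_b[None, :, :], axis=2)
        return d.min()

    def feature_diff(bag_a, bag_b, f):
        # Per-feature difference: gap on feature f between the closest pair of instances.
        return np.abs(bag_a[:, f][:, None] - bag_b[:, f][None, :]).min()

    def mi_relief_f(bags, labels, k=3):
        # Returns one weight per feature; higher means more relevant.
        labels = np.asarray(labels)
        n_features = bags[0].shape[1]
        weights = np.zeros(n_features)
        for i, bag in enumerate(bags):
            y = labels[i]
            # Bag-level distances to every other bag.
            dists = np.array([min_hausdorff(bag, other) if j != i else np.inf
                              for j, other in enumerate(bags)])
            order = np.argsort(dists)
            hits = [j for j in order if j != i and labels[j] == y][:k]    # nearest same-class bags
            misses = [j for j in order if labels[j] != y][:k]             # nearest other-class bags
            for f in range(n_features):
                weights[f] -= sum(feature_diff(bag, bags[j], f) for j in hits) / max(len(hits), 1)
                weights[f] += sum(feature_diff(bag, bags[j], f) for j in misses) / max(len(misses), 1)
        return weights / len(bags)

    # Toy usage: three bags with 4 features each; keep the two highest-weighted features.
    rng = np.random.default_rng(0)
    bags = [rng.random((5, 4)), rng.random((3, 4)), rng.random((4, 4))]
    labels = [1, 0, 1]
    w = mi_relief_f(bags, labels, k=1)
    selected = np.argsort(w)[::-1][:2]

As in instance-level Relief-F, a large positive weight indicates a feature that separates bags of different classes while remaining stable among bags of the same class; in a filter setting, the top-ranked features are kept and any multiple instance classifier is then trained on the reduced representation.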

Keywords

Feature Selection, Feature Selection Method, Multiple Instance, Feature Selection Algorithm, Multiple Instance Learning


References

  1. Dietterich, T.G., Lathrop, R.H., Lozano-Perez, T.: Solving the multiple instance problem with axis-parallel rectangles. Artificial Intelligence 89(1-2), 31–71 (1997)
  2. Zhang, M.L., Zhou, Z.H.: Improve multi-instance neural networks through feature selection. Neural Processing Letters 19(1), 1–10 (2004)
  3. Chen, Y., Bi, J., Wang, J.: MILES: Multiple-instance learning via embedded instance selection. IEEE Transactions on Pattern Analysis and Machine Intelligence 28(12), 1931–1947 (2006)
  4. Yuan, X., Hua, X.S., Wang, M., Qi, G.J., Wu, X.Q.: A novel multiple instance learning approach for image retrieval based on AdaBoost feature selection. In: ICME 2007: Proceedings of the IEEE International Conference on Multimedia and Expo, Beijing, China, pp. 1491–1494. IEEE, Los Alamitos (2007)
  5. Raykar, V.C., Krishnapuram, B., Bi, J., Dundar, M., Rao, R.B.: Bayesian multiple instance learning: automatic feature selection and inductive transfer. In: ICML 2008: Proceedings of the 25th International Conference on Machine Learning, pp. 808–815. ACM, New York (2008)
  6. Herman, G., Ye, G., Xu, J., Zhang, B.: Region-based image categorization with reduced feature set. In: Proceedings of the 10th IEEE Workshop on Multimedia Signal Processing, Cairns, Qld, pp. 586–591 (2008)
  7. Kononenko, I.: Estimating attributes: analysis and extension of RELIEF. In: Bergadano, F., De Raedt, L. (eds.) ECML 1994. LNCS, vol. 784, pp. 171–182. Springer, Heidelberg (1994)
  8. Chevaleyre, Y., Zucker, J.D.: Solving multiple-instance and multiple-part learning problems with decision trees and decision rules. Application to the mutagenesis problem. In: Stroulia, E., Matwin, S. (eds.) Canadian AI 2001. LNCS (LNAI), vol. 2056, pp. 204–214. Springer, Heidelberg (2001)
  9. Zhang, D., Wang, F., Si, L., Li, T.: M3IC: Maximum margin multiple instance clustering, pp. 1339–1344 (2009)
  10. Zhang, M.L., Zhou, Z.H.: Multi-instance clustering with applications to multi-instance prediction. Applied Intelligence 31, 47–68 (2009)
  11. Edgar, G.: Measure, Topology, and Fractal Geometry, 3rd edn. Springer, Heidelberg (1995)
  12. Cohen, H.: Image restoration via N-nearest neighbour classification. In: ICIP 1996: Proceedings of the International Conference on Image Processing, pp. 1005–1007 (1996)
  13. Yang, C., Lozano-Perez, T.: Image database retrieval with multiple-instance learning techniques. In: ICDE 2000: Proceedings of the 16th International Conference on Data Engineering, Washington, DC, USA, pp. 233–243. IEEE Computer Society, Los Alamitos (2000)
  14. Pao, H.T., Chuang, S.C., Xu, Y.Y., Fu, H.: An EM based multiple instance learning method for image classification. Expert Systems with Applications 35(3), 1468–1472 (2008)
  15. Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools and Techniques, 2nd edn. Morgan Kaufmann, San Francisco (2005)
  16. Demsar, J.: Statistical comparisons of classifiers over multiple data sets. Journal of Machine Learning Research 7, 1–30 (2006)

Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Amelia Zafra (1)
  • Mykola Pechenizkiy (2)
  • Sebastián Ventura (1)

  1. Department of Computer Science and Numerical Analysis, University of Cordoba
  2. Department of Computer Science, Eindhoven University of Technology
