A Simple Feature Extraction for High Dimensional Image Representations

  • Christian Savu-Krohn
  • Peter Auer
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3940)


We investigate a method for finding local clusters in low-dimensional subspaces of high-dimensional data, e.g., high-dimensional image descriptions. Using cluster centers instead of the full set of data speeds up learning algorithms for object recognition, and may also improve performance since overfitting is avoided. On the Graz01 database, our method outperforms a current standard method for feature extraction from high-dimensional image representations.


Keywords: High Dimensional Data · Subspace Cluster · Weak Hypothesis · Dense Interval · Subspace Dimension
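The abstract and keywords suggest the core idea: local clusters are found as dense intervals along individual dimensions (one-dimensional subspaces) of the descriptor space, and their centers replace the full descriptor set as compact features. The following is a minimal, illustrative Python sketch of that idea only; the sliding-window search, the width and count thresholds, and all function names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def densest_interval(values, width):
    """Scan a fixed-width window over the sorted values of one descriptor
    dimension; return the center and size of the densest window found."""
    values = np.sort(values)
    best_center, best_count, start = None, 0, 0
    for end in range(len(values)):
        while values[end] - values[start] > width:
            start += 1
        count = end - start + 1
        if count > best_count:
            best_count = count
            best_center = 0.5 * (values[start] + values[end])
    return best_center, best_count

def subspace_cluster_centers(descriptors, width=0.5, min_count=50):
    """Collect (dimension, center) pairs. Each pair marks a dense interval
    in a one-dimensional subspace of the descriptor space and can serve as
    a compact feature (cluster center) instead of the full set of data."""
    centers = []
    for d in range(descriptors.shape[1]):
        center, count = densest_interval(descriptors[:, d], width)
        if count >= min_count:
            centers.append((d, center))
    return centers

if __name__ == "__main__":
    # Toy stand-in for high-dimensional local image descriptors (e.g. SIFT).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 128))
    print(len(subspace_cluster_centers(X)))
```

In a boosting setting such as the one hinted at by the "weak hypothesis" keyword, each (dimension, interval) pair could act as a simple weak classifier; this sketch stops at extracting the candidate centers.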





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Christian Savu-Krohn (1)
  • Peter Auer (1)
  1. Chair of Information Technology (CIT), University of Leoben, Austria
