Principles of Multi-kernel Data Mining

  • Vadim Mottl
  • Olga Krasotkina
  • Oleg Seredin
  • Ilya Muchnik
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3587)

Abstract

The scientific community has accumulated immense experience in processing data represented in finite-dimensional linear spaces of numerical features of entities, whereas the toolkit for dissimilarity-based processing of data in metric spaces, which only represent distances between entities when sufficiently informative features cannot be found, is much poorer. In this work, the problem of embedding a given set of entities into a linear space with an inner product by choosing an appropriate kernel function is considered the major challenge of the featureless approach to estimating dependences in data sets of arbitrary kind. As a rule, several kernels may be heuristically suggested within the same data analysis problem. We treat several kernels on a set of entities as the Cartesian product of the respective number of linear spaces, each supplied with its specific kernel function as a specific inner product. The main requirement here is to avoid discrete selection when eliminating redundant kernels, so that the fusion algorithm retains acceptable computational complexity.
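Concretely, treating several kernels as the Cartesian product of their induced linear spaces amounts to summing the individual inner products, i.e. forming a non-negatively weighted sum of the kernel (Gram) matrices; continuous weights then play the role of a soft, non-discrete selection among potentially redundant kernels. The sketch below is not taken from the paper; the function name combine_kernels and the two example kernels are illustrative assumptions meant only to show this fusion idea in NumPy.

```python
import numpy as np

def combine_kernels(gram_matrices, weights=None):
    """Fuse several kernel Gram matrices into a single one.

    Summing kernels corresponds to taking the inner product in the
    Cartesian product of the individual kernel-induced linear spaces.
    Non-negative continuous weights act as a soft alternative to
    discrete selection among redundant kernels.
    """
    gram_matrices = [np.asarray(K, dtype=float) for K in gram_matrices]
    n = gram_matrices[0].shape[0]
    if weights is None:
        weights = np.ones(len(gram_matrices))
    weights = np.asarray(weights, dtype=float)
    if np.any(weights < 0):
        raise ValueError("kernel weights must be non-negative")
    combined = np.zeros((n, n))
    for w, K in zip(weights, gram_matrices):
        combined += w * K
    return combined

# Example: two heuristic kernels on the same five entities
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
K_linear = X @ X.T                                    # linear kernel
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_rbf = np.exp(-0.5 * sq_dists)                       # Gaussian (RBF) kernel
K_fused = combine_kernels([K_linear, K_rbf], weights=[0.7, 0.3])
print(K_fused.shape)  # (5, 5)
```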

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Vadim Mottl (1)
  • Olga Krasotkina (1)
  • Oleg Seredin (1)
  • Ilya Muchnik (2)

  1. Computing Center of the Russian Academy of Sciences, Moscow, Russia
  2. DIMACS, Rutgers University, Piscataway, USA
