Optimizing Data Transformations for Classification Tasks

  • José M. Valls
  • Ricardo Aler
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5788)

Abstract

Many classification algorithms use the concept of distance or similarity between patterns. Previous work has shown that it is advantageous to optimize general Euclidean distances (GED). In this paper, data transformations are optimized instead. This is equivalent to searching for GEDs, but can be applied to any learning algorithm, even if it does not use distances explicitly. Two optimization techniques have been used: a simple Local Search (LS) and the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). CMA-ES is an advanced evolutionary method for optimization in difficult continuous domains. Both diagonal and complete matrices have been considered. Results show that, in general, complete matrices found by CMA-ES either outperform or match both Local Search and the classifier working on the original, untransformed data.
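The equivalence stated above can be verified directly: applying a linear transformation L to the data and then measuring plain Euclidean distance is the same as measuring a generalized Euclidean distance with matrix M = LᵀL in the original space. The following NumPy sketch (variable names are illustrative, not taken from the paper) demonstrates this identity:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))   # five 3-dimensional patterns
L = rng.normal(size=(3, 3))   # a "complete" transformation matrix

# Transform every pattern: z = L x
Z = X @ L.T

# Plain Euclidean distance between two transformed patterns
d_transformed = np.linalg.norm(Z[0] - Z[1])

# Generalized Euclidean distance in the original space with M = L^T L
diff = X[0] - X[1]
M = L.T @ L
d_ged = np.sqrt(diff @ M @ diff)

# The two distances coincide, so optimizing L (e.g. with Local Search
# or CMA-ES) is equivalent to optimizing a GED -- but the transformed
# data Z can be fed to any classifier, distance-based or not.
assert np.isclose(d_transformed, d_ged)
```

A diagonal matrix corresponds to per-feature scaling (d free parameters), while a complete matrix additionally rotates and correlates the features (d² parameters), which is the search space CMA-ES explores in the paper.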

Keywords

Data transformations, General Euclidean distances, Evolutionary computation, Evolutionary-based machine learning

References

  1. Moody, J.E., Darken, C.: Fast Learning in Networks of Locally Tuned Processing Units. Neural Computation 1, 281–294 (1989)
  2. Cover, T.M., Hart, P.E.: Nearest Neighbor Pattern Classification. IEEE Trans. Inform. Theory 13(1), 21–27 (1967)
  3. Atkeson, C.G., Moore, A.W., Schaal, S.: Locally Weighted Learning. Artificial Intelligence Review 11, 11–73 (1997)
  4. Tou, J.T., Gonzalez, R.C.: Pattern Recognition Principles. Addison-Wesley, Reading (1974)
  5. Weisberg, S.: Applied Linear Regression. John Wiley and Sons, New York (1985)
  6. Ripley, B.D.: Pattern Recognition and Neural Networks. Cambridge University Press, Cambridge (1996)
  7. Weinberger, K.Q., et al.: Distance Metric Learning for Large Margin Nearest Neighbor Classification. In: Neural Information Processing Systems (2005)
  8. Hansen, N., Ostermeier, A.: Completely Derandomized Self-adaptation in Evolution Strategies. Evolutionary Computation 9(2), 159–195 (2001)
  9. Ostermeier, A., Gawelczyk, A., Hansen, N.: A Derandomized Approach to Self-Adaptation of Evolution Strategies. Evolutionary Computation 2(4), 369–380 (1994)
  10. Valls, J.M., Aler, R., Fernández, O.: Evolving Generalized Euclidean Distances for Training RBNN. Computing and Informatics 26, 33–43 (2007)
  11. Sierra, A., Echeverría, A.: Evolutionary Discriminant Analysis. IEEE Transactions on Evolutionary Computation 10(1), 81–92 (2006)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • José M. Valls¹
  • Ricardo Aler¹
  1. Universidad Carlos III de Madrid, Spain