Optimizing Objective Functions with Non-Linearly Correlated Variables Using Evolution Strategies with Kernel-Based Dimensionality Reduction

  • Piotr Lipinski
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8480)


This paper proposes an improvement of Evolution Strategies for objective functions with non-linearly correlated variables. It detects non-linear local dependencies among the variables of the objective function by analyzing the manifold in the search space that contains the current population, and it transforms individuals into a reduced search space defined by the kernel principal components. Experiments on popular benchmark functions confirm that the method can significantly improve the search process, especially for complex objective functions with many variables, which arise in many practical applications.
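The dimensionality-reduction step described above, projecting the current population onto its leading kernel principal components, can be sketched as follows. This is a minimal illustration only, assuming an RBF kernel and plain NumPy; the function name, the `gamma` parameter, and the toy population are mine, not the paper's, and the sketch omits the reverse mapping (pre-image problem) needed to evaluate transformed individuals back in the original search space.

```python
import numpy as np

def kernel_pca_project(X, n_components=2, gamma=1.0):
    """Project the rows of X onto the leading kernel principal
    components of an RBF kernel (standard kernel-PCA computation)."""
    # RBF kernel matrix from pairwise squared distances
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
    # Center the kernel matrix in feature space
    n = K.shape[0]
    J = np.full((n, n), 1.0 / n)
    Kc = K - J @ K - K @ J + J @ K @ J
    # Eigendecomposition; eigh returns eigenvalues in ascending order,
    # so reorder to keep the leading components
    vals, vecs = np.linalg.eigh(Kc)
    order = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[order], vecs[:, order]
    # Coordinates of the population members along each component
    return vecs * np.sqrt(np.maximum(vals, 1e-12))

# Toy "population" lying near a one-dimensional curve embedded in R^3,
# i.e. variables that are non-linearly correlated
t = np.linspace(0.0, 3.0, 40)
X = np.stack([t, np.sin(t), np.cos(t)], axis=1)
Z = kernel_pca_project(X, n_components=2, gamma=0.5)
print(Z.shape)  # (40, 2): each individual reduced to two coordinates
```

In the paper's setting, the evolutionary operators would then act on the reduced coordinates `Z` rather than on the original correlated variables.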


Keywords: Objective Function · Search Space · Dimensionality Reduction · Benchmark Function · Kernel Principal Component Analysis





Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Piotr Lipinski, Computational Intelligence Research Group, Institute of Computer Science, University of Wroclaw, Wroclaw, Poland
