Natural Gradient Approach for Linearly Constrained Continuous Optimization

  • Youhei Akimoto
  • Shinichi Shirakawa
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8672)


When the feasible set of an optimization problem is a proper subset of multidimensional real space and the optimum lies on or near the boundary of the feasible set, most evolutionary algorithms require constraint-handling machinery to generate better candidate solutions inside the feasible set. However, standard constraint-handling techniques such as resampling alter the distribution of the candidate solutions: the distribution is truncated to the feasible set, so the statistical meaning of the update of the distribution parameters changes. To construct the parameter update rule for the covariance matrix adaptation evolution strategy from the same principle as in the unconstrained case, namely the natural gradient principle, we derive the natural gradient of the log-likelihood of the Gaussian distribution truncated to a linearly constrained feasible set. We analyze the novel parameter update on the minimization of a spherical function with a linear constraint.
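The resampling strategy that the abstract contrasts against can be illustrated with a minimal sketch: candidates are drawn from a Gaussian and rejected until they satisfy a single linear constraint, which effectively truncates the sampling distribution to the feasible half-space. The function name, constraint values, and sphere-function ranking below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sample_truncated_gaussian(mean, cov, a, b, n, rng, max_tries=1000):
    """Resampling strategy: draw from N(mean, cov) and keep only points
    in the feasible half-space {x : a @ x <= b} (one linear constraint)."""
    samples = []
    for _ in range(max_tries):
        x = rng.multivariate_normal(mean, cov)
        if a @ x <= b:
            samples.append(x)
            if len(samples) == n:
                break
    return np.array(samples)

# Example setting from the analysis: minimize the spherical function
# f(x) = ||x||^2 subject to a @ x <= b, with the unconstrained optimum
# (the origin) outside the feasible set.
rng = np.random.default_rng(0)
a = np.array([1.0, 0.0])        # constraint: x_0 <= b
b = -1.0                        # origin is infeasible
mean = np.array([-2.0, 0.0])    # current distribution mean (feasible region)
cov = np.eye(2)

pop = sample_truncated_gaussian(mean, cov, a, b, n=10, rng=rng)
ranked = pop[np.argsort((pop ** 2).sum(axis=1))]  # rank by sphere function
```

All sampled candidates are feasible by construction, but the sample no longer follows the untruncated Gaussian, which is exactly why the paper re-derives the natural gradient with respect to the truncated distribution.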


Keywords (machine-generated, not supplied by the authors): Weight Scheme, Constraint Boundary, Continuous Optimization, Natural Gradient, Constraint Handling




Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Youhei Akimoto, Faculty of Engineering, Shinshu University, Nagano, Japan
  2. Shinichi Shirakawa, College of Science and Engineering, Aoyama Gakuin University, Sagamihara, Japan
