Constraint-Handling Method for Multi-objective Function Optimization: Pareto Descent Repair Operator

  • Ken Harada
  • Jun Sakuma
  • Isao Ono
  • Shigenobu Kobayashi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4403)

Abstract

Among the multi-objective optimization methods proposed to date, Genetic Algorithms (GAs) have proven especially effective in recent decades. However, most such methods were developed primarily for unconstrained problems, whereas many real-world problems are constrained and therefore require appropriate constraint handling. Despite considerable effort devoted to the study of constraint-handling methods, each has been reported to have certain limitations. Hence, further studies toward designing more effective constraint-handling methods are needed.

For this reason, we investigated guidelines for a method that handles constraints effectively. Based on these guidelines, we designed a new constraint-handling method, the Pareto Descent Repair operator (PDR), which incorporates ideas from multi-objective local search and the gradient projection method. An experiment comparing GAs that use PDR against several existing constraint-handling methods confirmed the effectiveness of PDR.
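The abstract notes that PDR draws on the gradient projection method. As a minimal illustrative sketch (not the authors' actual operator, whose details appear in the full paper), the core projection step removes from a candidate descent direction its component along the normal of an active constraint, so the step moves tangentially to the constraint boundary. The function name and two-dimensional example here are hypothetical:

```python
def project_onto_constraint_plane(d, a):
    """Remove from direction d its component along constraint normal a.

    The projected direction is tangent to the active constraint boundary,
    which is the basic idea behind gradient projection: descend without
    (to first order) increasing the constraint violation.
    """
    dot_da = sum(di * ai for di, ai in zip(d, a))
    dot_aa = sum(ai * ai for ai in a)
    if dot_aa == 0.0:
        return list(d)  # degenerate (zero) normal: leave direction unchanged
    scale = dot_da / dot_aa
    return [di - scale * ai for di, ai in zip(d, a)]

# Example: a descent direction pointing diagonally into a constraint whose
# boundary normal is the y-axis is flattened onto the boundary.
print(project_onto_constraint_plane([1.0, 1.0], [0.0, 1.0]))  # [1.0, 0.0]
```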

Keywords

Feasible solution · Death penalty · Constraint function · Constraint violation · Descent direction

References

  1. Deb, K.: Multi-Objective Optimization Using Evolutionary Algorithms. John Wiley & Sons, Chichester (2001)
  2. Coello, C.A.C.: Theoretical and numerical constraint handling techniques used with evolutionary algorithms: A survey of the state of the art. Computer Methods in Applied Mechanics and Engineering 191(11-12), 1245–1287 (2002)
  3. Knowles, J.D., Corne, D.W.: Memetic algorithms for multiobjective optimization: issues, methods and prospects. In: Krasnogor, N., Smith, J.E., Hart, W.E. (eds.) Recent Advances in Memetic Algorithms, pp. 313–352. Springer, Heidelberg (2004)
  4. Coello, C.A.C.: Treating constraints as objectives for single-objective evolutionary optimization. Engineering Optimization 32(3), 275–308 (2000)
  5. Deb, K.: An efficient constraint handling method for genetic algorithms. Computer Methods in Applied Mechanics and Engineering 186, 311–338 (2000)
  6. Oyama, A., Shimoyama, K., Fujii, K.: New constraint-handling method for multi-objective multi-constraint evolutionary optimization and its application to space plane design. In: Schilling, R., Haase, W., Periaux, J., Baier, H., Bugeda, G. (eds.) Evolutionary and Deterministic Methods for Design, Optimization and Control with Applications to Industrial and Societal Problems (EUROGEN 2005), pp. 416–428 (2005)
  7. Michalewicz, Z., Nazhiyath, G.: Genocop III: A co-evolutionary algorithm for numerical optimization problems with nonlinear constraints. In: Proceedings of the 2nd IEEE International Conference on Evolutionary Computation, vol. 2, pp. 647–651. IEEE Computer Society Press, Los Alamitos (1995)
  8. Harada, K., Sakuma, J., Ikeda, K., Ono, I., Kobayashi, S.: Local search for multiobjective function optimization: Pareto descent method (in Japanese). Transactions of the Japanese Society for Artificial Intelligence 21(4), 340–350 (2006)
  9. Harada, K., Sakuma, J., Ikeda, K., Ono, I., Kobayashi, S.: Local search for multiobjective function optimization: Pareto descent method. In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2006), pp. 659–666. ACM Press, New York (2006)
  10. Luenberger, D.G.: Linear and Nonlinear Programming. Addison-Wesley, Reading (1984)
  11. Gellert, W., Gottwald, S., Hellwich, M., Kästner, H., Künstner, H.: VNR Concise Encyclopedia of Mathematics. Van Nostrand Reinhold, New York (1989)
  12. Cormen, T.H., Leiserson, C.E., Rivest, R.L., Stein, C.: Introduction to Algorithms, 2nd edn. MIT Press, Cambridge (2001)
  13. Ono, I., Kobayashi, S.: A real-coded genetic algorithm for function optimization using unimodal normal distribution crossover. In: 7th International Conference on Genetic Algorithms (ICGA7), pp. 246–253 (1997)
  14. Knowles, J.D., Corne, D.W.: On metrics for comparing non-dominated sets. In: Proceedings of the 2002 Congress on Evolutionary Computation Conference (CEC02), pp. 711–716. IEEE Computer Society Press, Los Alamitos (2002)
  15. Harada, K., Sakuma, J., Kobayashi, S., Ikeda, K., Ono, I.: Hybridization of genetic algorithm and local search in multiobjective function optimization: Recommendation of GA then LS. In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2006), pp. 667–674. ACM Press, New York (2006)
  16. Harada, K., Ikeda, K., Sakuma, J., Ono, I., Kobayashi, S.: Hybridization of genetic algorithm with local search in multiobjective function optimization: Recommendation of GA then LS (in Japanese). Transactions of the Japanese Society for Artificial Intelligence 21(6), 482–492 (2006)
  17. Zitzler, E., Laumanns, M., Thiele, L.: SPEA2: Improving the strength Pareto evolutionary algorithm for multiobjective optimization. In: Giannakoglou, K., et al. (eds.) EUROGEN 2001, Evolutionary Methods for Design, Optimization and Control with Applications to Industrial Problems, pp. 12–21 (2001)
  18. Deb, K., Thiele, L., Laumanns, M., Zitzler, E.: Scalable multi-objective optimization test problems. In: Proceedings of the Congress on Evolutionary Computation (CEC-2002), pp. 825–830 (2002)
  19. Fliege, J., Svaiter, B.F.: Steepest descent methods for multicriteria optimization. Mathematical Methods of Operations Research 51(3), 479–494 (2000)

Copyright information

© Springer Berlin Heidelberg 2007

Authors and Affiliations

  • Ken Harada (1)
  • Jun Sakuma (1)
  • Isao Ono (1)
  • Shigenobu Kobayashi (1)

  1. Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, 4259 Nagatsuta-cho, Midori-ku, Yokohama-shi, Kanagawa-ken 226-8502, Japan