Using Hyper-Heuristic to Select Leader and Archiving Methods for Many-Objective Problems

  • Conference paper
  • First Online:
Evolutionary Multi-Criterion Optimization (EMO 2015)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 9018)

Abstract

Multi-objective Particle Swarm Optimization (MOPSO) is a promising meta-heuristic for solving Many-Objective Problems (MaOPs). Previous works have proposed different leader and archiving methods to tackle the challenges caused by the increase in the number of objectives; however, selecting the most appropriate components for a given problem is not a trivial task. Moreover, the algorithm can benefit from using a variety of methods in different phases of the search. To deal with these issues, we adopt hyper-heuristics, which dynamically select components to solve a problem effectively. In this work, we use a simple hyper-heuristic to select leader and archiving methods during the search. Unlike other studies, our hyper-heuristic is guided by the $R_2$ indicator, owing to its good measuring properties and low computational cost. Experimental studies were conducted to validate the new algorithm, comparing its performance to that of its individual components and to the state-of-the-art MOEA/D-DRA algorithm. The results show that the new algorithm is robust, presenting good results in different situations.
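
To make the approach above more concrete, the sketch below (Python, not taken from the paper) shows the core loop of an $R_2$-guided hyper-heuristic. It assumes the standard unary $R_2$ formulation with weighted Tchebycheff utility, $R_2(A; W, z^*) = \frac{1}{|W|}\sum_{w \in W}\min_{a \in A}\max_{j} w_j\,|z^*_j - a_j|$ (lower is better). The lists of leader and archiving methods, the epsilon-greedy selection rule, and the run_iteration callback are illustrative placeholders, not the exact components or selection mechanism used in the paper.

    import random
    import numpy as np

    def r2_indicator(front, weights, ideal):
        """Unary R2 indicator with weighted Tchebycheff utility (lower is better).

        front:   (n_points, n_obj) objective vectors of the current archive
        weights: (n_weights, n_obj) weight vectors
        ideal:   (n_obj,) ideal/reference point
        """
        diffs = np.abs(np.asarray(front, float)[None, :, :] - np.asarray(ideal, float))
        tcheb = np.max(np.asarray(weights, float)[:, None, :] * diffs, axis=2)
        return float(np.mean(np.min(tcheb, axis=1)))

    def choose(scores, epsilon=0.1):
        """Epsilon-greedy pick of a (leader, archiver) pair by mean past reward."""
        if random.random() < epsilon or all(not v for v in scores.values()):
            return random.choice(list(scores))
        return max(scores, key=lambda k: np.mean(scores[k]) if scores[k] else float("-inf"))

    def hh_mopso(run_iteration, n_iterations, weights, ideal,
                 leaders=("sigma", "crowding", "random"),   # hypothetical method names
                 archivers=("ideal", "grid", "crowding")):  # hypothetical method names
        """Skeleton of an R2-guided hyper-heuristic loop.

        run_iteration(leader, archiver) is an assumed callback that advances the
        underlying MOPSO by one iteration with the chosen components and returns
        the external archive as an iterable of objective vectors.
        """
        scores = {(l, a): [] for l in leaders for a in archivers}
        prev = None
        for _ in range(n_iterations):
            pair = choose(scores)
            archive = run_iteration(*pair)
            r2 = r2_indicator(archive, weights, ideal)
            if prev is not None:
                scores[pair].append(prev - r2)  # reward = improvement in R2
            prev = r2
        return scores

A choice-function hyper-heuristic would replace the epsilon-greedy rule with a weighted score of recent performance; the skeleton only illustrates where the $R_2$ feedback enters the component selection.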

Author information

Corresponding author

Correspondence to Olacir R. Castro Jr.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Castro, O.R., Pozo, A. (2015). Using Hyper-Heuristic to Select Leader and Archiving Methods for Many-Objective Problems. In: Gaspar-Cunha, A., Henggeler Antunes, C., Coello, C. (eds) Evolutionary Multi-Criterion Optimization. EMO 2015. Lecture Notes in Computer Science, vol 9018. Springer, Cham. https://doi.org/10.1007/978-3-319-15934-8_8

  • DOI: https://doi.org/10.1007/978-3-319-15934-8_8

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-15933-1

  • Online ISBN: 978-3-319-15934-8

  • eBook Packages: Computer Science, Computer Science (R0)
