A full variate Gaussian model-based RM-MEDA without clustering process

Original Article

Abstract

The regularity model-based multi-objective estimation of distribution algorithm (RM-MEDA) is one of the most effective multi-objective estimation of distribution algorithms proposed in recent years. However, its performance is seriously affected by its clustering process. To avoid this influence, this paper presents a novel full variate Gaussian model-based (FGM-based) RM-MEDA without a clustering process, named FRM-MEDA. In FRM-MEDA, the clustering process is removed from the original algorithm, and the full variate Gaussian model (FGM) is introduced to maintain population diversity and to compensate for the performance loss caused by removing the clustering process. The introduction of the FGM also makes FRM-MEDA faster and more stable on all test instances. In addition, a variable variance for the FGM is presented to enhance the exploration ability of FRM-MEDA. Experiments demonstrate that the proposed algorithm significantly outperforms both the RM-MEDA without a clustering process and the RM-MEDA with K equal to AVE_K.
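To illustrate the core idea, the minimal sketch below samples offspring from a full-covariance (full variate) Gaussian fitted to the current population. The function name, the `var_scale` parameter standing in for the paper's variable-variance mechanism, and the small regularization term are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sample_fgm_offspring(population, n_offspring, var_scale=1.0, rng=None):
    """Sample offspring from a full-covariance Gaussian fitted to the population.

    Illustrative sketch only: `var_scale` inflates or shrinks the fitted
    covariance, loosely mimicking a variable-variance FGM.
    """
    rng = np.random.default_rng() if rng is None else rng
    pop = np.asarray(population, dtype=float)          # shape (N, n_variables)
    mean = pop.mean(axis=0)                            # maximum-likelihood mean
    cov = np.cov(pop, rowvar=False)                    # full covariance matrix
    cov = var_scale * cov + 1e-12 * np.eye(pop.shape[1])  # scale and regularize
    return rng.multivariate_normal(mean, cov, size=n_offspring)
```

In a complete EDA loop, such samples would form the new candidate solutions that are then merged with the parent population and filtered by the usual environmental selection step.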

Keywords

Estimation of distribution algorithm · Multi-objective optimization · Number of clusters · Full variate Gaussian model

Notes

Acknowledgements

This work was funded by the Project of Science and Technology for Graduate Students (No. CDJXS12180003) and the Scientific and Technological Research Program of Chongqing Municipal Education Commission (Grant No. KJ1400409).

Copyright information

© Springer-Verlag Berlin Heidelberg 2017

Authors and Affiliations

  1. College of Computer Science, Chongqing University, Chongqing, China
  2. School of Software Engineering, Chongqing University of Posts and Telecommunications, Chongqing, China
