
An improved large-scale sparse multi-objective evolutionary algorithm using unsupervised neural network

Published in Applied Intelligence

Abstract

Large-scale sparse multi-objective optimization problems (LSSMOPs) are common in the real world, arising in applications such as portfolio optimization and neural network training. In recent years, a number of multi-objective evolutionary algorithms (MOEAs) have been proposed to deal with LSSMOPs. To improve the search efficiency of the operators, one class of sparse MOEAs reduces the search space with unsupervised neural networks. However, existing algorithms of this kind are not efficient enough, because they spend considerable time training the networks in every evolutionary generation. In addition, most sparse MOEAs ignore the relationship between the binary vectors and the real vectors that together determine the decision variables. This paper therefore proposes an evolutionary algorithm for solving LSSMOPs. The proposed algorithm adopts an adaptive dimensionality reduction method to balance convergence and efficiency: it groups the binary vectors and adaptively uses a restricted Boltzmann machine to reduce the search space of the binary vectors. The generation of the real vectors is then guided by the binary vectors, which strengthens the relationship between the two parts of the decision variables. Experimental results on eight benchmark problems and on neural network training problems show that the proposed algorithm outperforms existing state-of-the-art evolutionary algorithms for LSSMOPs.
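The paper's implementation is not reproduced here; purely as an illustration of the restricted Boltzmann machine (RBM) dimensionality-reduction idea described above, the following minimal Python sketch compresses sparse binary mask vectors into a lower-dimensional code and decodes them back. All names, sizes, and training details (the TinyRBM class, CD-1 updates, the 0.5 decoding threshold) are assumptions for the sketch, not the authors' algorithm.

```python
# Minimal sketch: an RBM that maps sparse binary "mask" vectors (as used to
# encode sparse decision variables in LSSMOPs) into a smaller latent space,
# so that search can operate on the low-dimensional codes instead of the
# original binary variables. Illustrative only; not the paper's implementation.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyRBM:
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible (original variables) bias
        self.b_h = np.zeros(n_hidden)    # hidden (reduced code) bias
        self.lr = lr

    def encode(self, v):
        # probability of each hidden (reduced) unit being active
        return sigmoid(v @ self.W + self.b_h)

    def decode(self, h):
        # reconstruct a probability vector over the original binary variables
        return sigmoid(h @ self.W.T + self.b_v)

    def train_step(self, v0):
        # one step of contrastive divergence (CD-1) on a batch of binary vectors
        h0 = self.encode(v0)
        h0_sample = (h0 > rng.random(h0.shape)).astype(float)
        v1 = (self.decode(h0_sample) > rng.random(v0.shape)).astype(float)
        h1 = self.encode(v1)
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / len(v0)
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)

# Toy usage: 100 binary variables, ~5% nonzero entries, reduced to a 10-D code.
n_vars, n_latent = 100, 10
masks = (rng.random((256, n_vars)) < 0.05).astype(float)
rbm = TinyRBM(n_vars, n_latent)
for _ in range(200):
    rbm.train_step(masks)

codes = rbm.encode(masks)                              # reduced search space
reconstruction = (rbm.decode(codes) > 0.5).astype(float)  # back to sparse masks
```

In such a scheme, offspring could be generated in the 10-dimensional code space and decoded (with thresholding) back to candidate sparse binary masks, which is the general sense in which an RBM reduces the binary search space here.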






Acknowledgements

This work was supported by the National Natural Science Foundation of China under Grant 51977100.

Author information


Corresponding author

Correspondence to Junye Shen.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Geng, H., Shen, J., Zhou, Z. et al. An improved large-scale sparse multi-objective evolutionary algorithm using unsupervised neural network. Appl Intell 53, 10290–10309 (2023). https://doi.org/10.1007/s10489-022-04037-7

