Abstract
Large-scale sparse multi-objective optimization problems (LSSMOPs) are widespread in the real world, arising in portfolio optimization, neural network training, and other applications. In recent years, a number of multi-objective evolutionary algorithms (MOEAs) have been proposed to deal with LSSMOPs. One dimensionality-reduction approach in sparse MOEAs improves the search efficiency of the operator by using unsupervised neural networks to shrink the search space. However, existing algorithms of this kind are not efficient enough, because they spend considerable time training the networks in every evolutionary generation. In addition, most sparse MOEAs ignore the relationship between the binary vectors and the real vectors that together determine the decision variables. This paper therefore proposes an evolutionary algorithm for solving LSSMOPs. The proposed algorithm adopts an adaptive dimensionality-reduction method to balance convergence and efficiency: it groups the binary vectors and adaptively applies a restricted Boltzmann machine to reduce their search space. The generation of real vectors is then guided by the binary vectors, which strengthens the relationship between the two parts of the decision variables. Experimental results on eight benchmark problems and on neural network training problems show that the proposed algorithm outperforms existing state-of-the-art evolutionary algorithms for LSSMOPs.
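To make the dimensionality-reduction step concrete, the following is a minimal sketch of how a restricted Boltzmann machine can compress high-dimensional sparse binary (mask) vectors into a low-dimensional hidden code in which the search can proceed, and decode that code back to mask probabilities. This is an illustrative toy RBM trained with one-step contrastive divergence; all class names, sizes, and hyperparameters are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

class TinyRBM:
    """A minimal RBM: binary visible units, binary hidden units, CD-1 training."""

    def __init__(self, n_visible, n_hidden, seed=0):
        self.rng = np.random.default_rng(seed)
        self.W = 0.1 * self.rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def encode(self, v):
        # Hidden activation probabilities: the reduced representation.
        return self._sigmoid(v @ self.W + self.c)

    def decode(self, h):
        # Map a hidden code back to visible (binary-mask) probabilities.
        return self._sigmoid(h @ self.W.T + self.b)

    def train(self, data, epochs=50, lr=0.1):
        # One-step contrastive divergence (CD-1).
        n = len(data)
        for _ in range(epochs):
            h_prob = self.encode(data)
            h_sample = (self.rng.random(h_prob.shape) < h_prob).astype(float)
            v_recon = self.decode(h_sample)
            h_recon = self.encode(v_recon)
            self.W += lr * (data.T @ h_prob - v_recon.T @ h_recon) / n
            self.b += lr * (data - v_recon).mean(axis=0)
            self.c += lr * (h_prob - h_recon).mean(axis=0)

# A sparse binary population: 64 mask vectors of length 30 with ~10% ones.
rng = np.random.default_rng(1)
pop = (rng.random((64, 30)) < 0.1).astype(float)

rbm = TinyRBM(n_visible=30, n_hidden=5)
rbm.train(pop)
codes = rbm.encode(pop)    # 64 x 5: evolutionary search can act in this space
recon = rbm.decode(codes)  # 64 x 30: decoded back to mask probabilities
```

In an algorithm of the kind the abstract describes, variation operators would act on the 5-dimensional codes rather than the 30-dimensional masks, and the decoded probabilities would be thresholded or sampled back into binary masks; the adaptive element lies in deciding when retraining the RBM is worth its cost.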
Acknowledgements
This work was supported by National Natural Science Foundation of China under Grant 51977100.
Cite this article
Geng, H., Shen, J., Zhou, Z. et al. An improved large-scale sparse multi-objective evolutionary algorithm using unsupervised neural network. Appl Intell 53, 10290–10309 (2023). https://doi.org/10.1007/s10489-022-04037-7