A Surrogate-Assisted Evolutionary Algorithm with Random Feature Selection for Large-Scale Expensive Problems

  • Conference paper
  • Conference: Parallel Problem Solving from Nature – PPSN XVI (PPSN 2020)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12269)

Abstract

When optimizing a large-scale problem, an evolutionary algorithm typically requires a substantial number of fitness evaluations to discover a good approximation to the global optimum. This becomes prohibitive when the problem is also computationally expensive. Surrogate-assisted evolutionary algorithms have shown good performance on high-dimensional problems, but generally only up to around 200 dimensions, because it is very difficult to train sufficiently accurate surrogate models for a large-scale optimization problem given the limited training data. In this paper, a random feature selection technique is utilized at each generation to select decision variables from the original large-scale optimization problem and form a number of sub-problems, whose dimensions may differ from one another. The population employed to optimize the original problem is updated by sequentially optimizing each sub-problem, assisted by a surrogate constructed for that sub-problem. A new candidate solution of the original problem is generated by replacing the corresponding decision variables of the best solution found so far with those of the sub-problem that achieved the best approximated fitness among all sub-problems. This new solution is then evaluated on the original expensive problem and used to update the best solution. To evaluate the performance of the proposed method, we conduct experiments on 15 CEC'2013 benchmark problems and compare with several state-of-the-art algorithms. The experimental results show that the proposed method is more effective than the compared algorithms, especially on problems that are partially separable or non-separable.



Acknowledgements

The authors would like to thank Professor Jonathan E. Fieldsend at the University of Exeter for his help in improving the quality of the paper. This work was supported in part by the National Natural Science Foundation of China (Grant No. 61876123), the Natural Science Foundation of Shanxi Province (201801D121131, 201901D111264, 201901D111262), the Shanxi Science and Technology Innovation Project for Excellent Talents (201805D211028), the Doctoral Scientific Research Foundation of Taiyuan University of Science and Technology (20162029), and the China Scholarship Council (CSC).

Author information

Corresponding author: Chaoli Sun.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Fu, G., Sun, C., Tan, Y., Zhang, G., Jin, Y. (2020). A Surrogate-Assisted Evolutionary Algorithm with Random Feature Selection for Large-Scale Expensive Problems. In: Bäck, T., et al. (eds.) Parallel Problem Solving from Nature – PPSN XVI. PPSN 2020. Lecture Notes in Computer Science, vol. 12269. Springer, Cham. https://doi.org/10.1007/978-3-030-58112-1_9

  • DOI: https://doi.org/10.1007/978-3-030-58112-1_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-58111-4

  • Online ISBN: 978-3-030-58112-1

  • eBook Packages: Computer Science (R0)
