
Investigating Innovized Progress Operators with Different Machine Learning Methods

Conference paper, in: Evolutionary Multi-Criterion Optimization (EMO 2023)

Abstract

Recent studies have demonstrated that the performance of reference-vector (RV) based evolutionary multi- and many-objective optimization algorithms can be improved through the intervention of machine learning (ML) methods. These studies have shown how efficient search directions, learnt from the solutions of intermittent generations, can be used to create pro-convergence and pro-diversity offspring, leading to better convergence and diversity, respectively. The entailing steps of data-set preparation, training of ML models, and utilization of these models have been encapsulated as Innovized Progress operators, namely IP2 (for convergence improvement) and IP3 (for diversity improvement). Evidently, the focus in these studies has been on proof of concept, and no exploratory analysis has investigated if, and how drastically, the operators' performance may be impacted if their underlying ML methods (Random Forest for IP2, and kNN for IP3) are varied. This paper seeks to bridge this gap through an exploratory analysis of both IP2 and IP3, based on eight different ML methods, tested against an exhaustive test suite comprising seven multi-objective and 32 many-objective test instances. While the results broadly endorse the robustness of the existing IP2 and IP3 operators, they also reveal interesting trade-offs across the ML methods in terms of the hypervolume (HV) metric and the corresponding run-time. Notably, within the ambit of the considered test suite and ML methods, kNN emerges as the winner for both IP2 and IP3, based on joint consideration of the HV metric and run-time.
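For readers unfamiliar with the Innovized Progress idea, the learning step sketched in the abstract can be viewed as a regression problem: pairs of (earlier-generation solution, improved solution) form the training set, an ML model learns the mapping, and new offspring are "progressed" by applying the learned model. The following minimal sketch uses a hand-rolled kNN regressor (kNN being the method the paper ultimately favours); it is an illustration of the general technique, not the authors' implementation, and the function name and toy data are assumptions.

```python
import math

def knn_progress(train_x, train_y, query, k=3):
    """Predict an 'advanced' solution for `query` by averaging the
    targets of its k nearest training inputs (Euclidean distance)."""
    # rank training inputs by distance to the query point
    order = sorted(range(len(train_x)),
                   key=lambda i: math.dist(train_x[i], query))
    nearest = order[:k]
    # predict the progressed solution as the mean of the neighbours' targets
    dim = len(train_y[0])
    return [sum(train_y[i][d] for i in nearest) / k for d in range(dim)]

# Toy data: inputs from an earlier generation; targets are the same points
# shifted toward the origin (a stand-in for a learned progress direction).
xs = [[float(i), float(i)] for i in range(10)]
ys = [[0.5 * a, 0.5 * b] for a, b in xs]

print(knn_progress(xs, ys, [4.0, 4.0], k=3))  # → [2.0, 2.0]
```

In the actual operators, the training pairs are harvested across generations of the evolutionary run, and the prediction is applied to freshly created offspring before survival selection.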



Notes

  1. For this paper, the 21 seeded runs were executed in parallel to save overall run-time, owing to which the exact run-time of each seed was not traceable. Hence, for the run-time estimate, only the seed corresponding to the median hypervolume was executed again.


Acknowledgement

The authors wish to acknowledge the Government of India for supporting this research through an Indo-US SPARC project (code: P66). The authors also wish to thank Sukrit Mittal for his support throughout this work.


Corresponding author

Correspondence to Dhish Kumar Saxena.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Bhasin, D., Swami, S., Sharma, S., Sah, S., Saxena, D.K., Deb, K. (2023). Investigating Innovized Progress Operators with Different Machine Learning Methods. In: Emmerich, M., et al. Evolutionary Multi-Criterion Optimization. EMO 2023. Lecture Notes in Computer Science, vol 13970. Springer, Cham. https://doi.org/10.1007/978-3-031-27250-9_10


  • DOI: https://doi.org/10.1007/978-3-031-27250-9_10

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-27249-3

  • Online ISBN: 978-3-031-27250-9

  • eBook Packages: Computer Science, Computer Science (R0)
