
Multitask Feature Selection for Objective Reduction

  • Conference paper
Evolutionary Multi-Criterion Optimization (EMO 2021)

Abstract

Objective reduction is regarded as a major tool for solving many-objective optimization problems (MaOPs). This paper proposes a multitask feature selection method for objective reduction. In the proposed method, each objective is formulated as a positive linear combination of a small number of essential objectives, and sparse regularization is employed to identify redundant objectives. Numerical experiments demonstrate the effectiveness and robustness of the proposed method in comparison with several state-of-the-art objective reduction methods.
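The abstract's formulation lends itself to a compact illustration: collect a matrix F whose rows are sampled solutions and whose columns are objective values, express each objective as a nonnegative combination of the others, and let a sparsity-inducing penalty keep only a few objectives active across all of these regression tasks. The sketch below is not the authors' algorithm; it is a minimal proximal-gradient version that assumes an entrywise L1 penalty in place of the paper's multitask (group-sparse) regularizer, and its function and parameter names are hypothetical.

    import numpy as np

    def sparse_nonneg_weights(F, lam=0.1, n_iters=2000):
        """Fit W >= 0 with zero diagonal such that F is approximately F @ W,
        using an entrywise L1 penalty for sparsity (proximal gradient / ISTA)."""
        n_points, n_obj = F.shape
        W = np.zeros((n_obj, n_obj))
        # Step size: 1 / Lipschitz constant of the gradient of 0.5*||F - F W||_F^2.
        step = 1.0 / (np.linalg.norm(F, 2) ** 2 + 1e-12)
        for _ in range(n_iters):
            grad = F.T @ (F @ W - F)             # gradient of the quadratic term
            W = W - step * grad
            W = np.maximum(W - step * lam, 0.0)  # prox of lam*||.||_1 plus nonnegativity
            np.fill_diagonal(W, 0.0)             # an objective may not explain itself
        return W

    def split_objectives(W, tol=1e-3):
        """Objectives whose row of W carries weight help reconstruct the others
        and are kept as essential; near-zero rows are flagged as redundant."""
        row_norm = np.linalg.norm(W, axis=1)
        return np.flatnonzero(row_norm > tol), np.flatnonzero(row_norm <= tol)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        base = rng.random((200, 2))                     # two independent objectives
        F = np.column_stack([base, base @ [0.6, 0.4]])  # third is a positive combination
        W = sparse_nonneg_weights(F, lam=0.05)
        essential, redundant = split_objectives(W)
        print("essential:", essential, "redundant:", redundant)

Objectives whose rows of W end up with near-zero norm are never needed to reconstruct the others and are flagged as redundant; in the paper's multitask setting, a group penalty zeroes entire rows at once, which makes this selection sharper.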

This work was supported by the National Natural Science Foundation of China (Grant No. 61876163) and the ANR/RGC Joint Research Scheme sponsored by the Research Grants Council of the Hong Kong Special Administrative Region, China, and the French National Research Agency (Project No. A-CityU101/16).

Author information

Corresponding author

Correspondence to Genghui Li.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Li, G., Zhang, Q. (2021). Multitask Feature Selection for Objective Reduction. In: Ishibuchi, H., et al. Evolutionary Multi-Criterion Optimization. EMO 2021. Lecture Notes in Computer Science, vol. 12654. Springer, Cham. https://doi.org/10.1007/978-3-030-72062-9_7

  • DOI: https://doi.org/10.1007/978-3-030-72062-9_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-72061-2

  • Online ISBN: 978-3-030-72062-9

  • eBook Packages: Computer Science, Computer Science (R0)
