
A Correction to this article was published on 01 December 2018


Abstract

In this paper, we present an extension of Wolfe’s reduced gradient method to multiobjective (multicriteria) optimization. We deal precisely with the problem of minimizing nonlinear objectives under linear constraints and propose a reduced Jacobian method, namely a reduced gradient-like method that does not scalarize these programs. As long as the current point is not Pareto critical, the principle is to determine a direction that decreases all the objectives simultaneously, so as to reach a nondominated solution. Following the reduction strategy, only a reduced search direction needs to be found. We show that the latter can be obtained by solving a simple differentiable and convex program at each iteration. Moreover, the method is conceived to recover both the discontinuous and continuous schemes of Wolfe for single-objective programs. The resulting algorithm is proved to be (globally) convergent to a Pareto KKT-stationary (Pareto critical) point under classical hypotheses and a multiobjective Armijo line search condition. Finally, experimental results over test problems show a clear performance of the proposed algorithm and its superiority over a classical scalarization approach, both in the quality of the approximated Pareto front and in the computational effort.
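The abstract's central ideas, a common descent direction obtained from a simple convex subproblem, combined with a multiobjective Armijo line search, can be illustrated with a minimal sketch. This is not the authors' reduced Jacobian method (which additionally exploits the reduction strategy under linear constraints); it sketches the closely related unconstrained common-descent subproblem of Fliege and Svaiter [9]. All function names are illustrative, and `scipy.optimize.minimize` is used for the convex subproblem:

```python
import numpy as np
from scipy.optimize import minimize


def common_descent_direction(grads):
    """Solve the strictly convex subproblem
        min_d  max_i  g_i . d  +  0.5 ||d||^2,
    whose solution decreases every objective when d != 0
    and is d = 0 exactly at Pareto critical points.
    Reformulated with an auxiliary variable t:
        min_{d,t}  t + 0.5 ||d||^2   s.t.  g_i . d <= t  for all i.
    """
    grads = np.asarray(grads, dtype=float)
    r, n = grads.shape

    def obj(z):                       # z = (d, t)
        d, t = z[:n], z[n]
        return t + 0.5 * d @ d

    # one inequality constraint per objective: t - g_i . d >= 0
    cons = [{"type": "ineq", "fun": lambda z, g=g: z[n] - g @ z[:n]}
            for g in grads]
    res = minimize(obj, np.zeros(n + 1), constraints=cons)
    return res.x[:n]


def armijo_multi(fs, grads_at, x, d, beta=1e-4, shrink=0.5, max_iter=50):
    """Multiobjective Armijo backtracking: accept the step alpha once
    every objective satisfies
        f_i(x + alpha d) <= f_i(x) + beta * alpha * g_i . d.
    """
    gs = grads_at(x)
    f0 = np.array([f(x) for f in fs])
    alpha = 1.0
    for _ in range(max_iter):
        f1 = np.array([f(x + alpha * d) for f in fs])
        if np.all(f1 <= f0 + beta * alpha * (gs @ d)):
            return alpha
        alpha *= shrink
    return alpha
```

With a single objective the subproblem reduces to steepest descent (d = -g), and with two opposing gradients it returns d = 0, signalling Pareto criticality.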


Change history

  • 01 December 2018

    The contents of the paper [1] were submitted for publication in J Optim Theory Appl (JOTA) on 22 December 2016, reviewed and approved by the Editors-in-Chief.

Notes

  1. For \(y, y' \in {\mathbb {R}}^r\), \(y<y'\) (resp. \(y \le y'\)) means \(y_i<y'_i\) (resp. \(y_i \le y'_i\)) for all \(i \in \{1,\ldots ,r\}\); obviously, \(y \lneq y'\) means \(y \le y'\) and \(y\ne y'\).

  2. Pointed means \(C\cap (-C) = \{0\}\); \(\mathrm{int}\,C\) stands for the topological interior of \(C\); and \(y \le_C y'\) means \(y-y'\in -C\).
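The componentwise order of Note 1 translates directly into code. A small illustrative helper (function names hypothetical), where `lneq` is the relation \(y \lneq y'\) used to express Pareto dominance:

```python
import numpy as np


def leq(y, yp):
    """y <= y': componentwise less-than-or-equal (Note 1)."""
    return bool(np.all(np.asarray(y) <= np.asarray(yp)))


def lneq(y, yp):
    """y ⪇ y': y <= y' componentwise and y != y' (Pareto dominance)."""
    y, yp = np.asarray(y), np.asarray(yp)
    return bool(np.all(y <= yp) and np.any(y != yp))
```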

References

  1. Collette, Y., Siarry, P.: Multiobjective Optimization: Principles and Case Studies. Springer, Berlin (2004)

  2. Figueira, J., Greco, S., Ehrgott, M. (eds.): Multiple Criteria Decision Analysis: State of the Art Surveys. Springer, New York (2005)

  3. Ehrgott, M.: Multicriteria Optimization, 2nd edn. Springer, Berlin (2005)

  4. Deb, K.: Multi-objective Optimization Using Evolutionary Algorithms. Wiley, Chichester (2001)

  5. Miettinen, K.: Nonlinear Multiobjective Optimization. Kluwer, Boston (1999)

  6. Fliege, J., Graña Drummond, L.M., Svaiter, B.F.: Newton's method for multiobjective optimization. SIAM J. Optim. 20, 602–626 (2009)

  7. Fliege, J.: An efficient interior-point method for convex multicriteria optimization problems. Math. Oper. Res. 31, 825–845 (2006)

  8. Eichfelder, G.: An adaptive scalarization method in multiobjective optimization. SIAM J. Optim. 19, 1694–1718 (2009)

  9. Fliege, J., Svaiter, B.F.: Steepest descent methods for multicriteria optimization. Math. Methods Oper. Res. 51, 479–494 (2000)

  10. Graña Drummond, L.M., Iusem, A.N.: A projected gradient method for vector optimization problems. Comput. Optim. Appl. 28, 5–29 (2004)

  11. García-Palomares, U.M., Burguillo-Rial, J.C., González-Castaño, F.J.: Explicit gradient information in multiobjective optimization. Oper. Res. Lett. 36, 722–725 (2008)

  12. Bonnel, H., Iusem, A.N., Svaiter, B.F.: Proximal methods in vector optimization. SIAM J. Optim. 15, 953–970 (2005)

  13. Wolfe, P.: Methods of nonlinear programming. In: Graves, R.L., Wolfe, P. (eds.) Recent Advances in Mathematical Programming, pp. 67–86. McGraw-Hill, New York (1963)

  14. Bazaraa, M.S., Sherali, H.D., Shetty, C.M.: Nonlinear Programming: Theory and Algorithms, 3rd edn. Wiley, New York (2006)

  15. Luenberger, D.G., Ye, Y.: Linear and Nonlinear Programming, 3rd edn. Springer, New York (2008)

  16. El Maghri, M.: A free reduced gradient scheme. Asian J. Math. Comput. Res. 9, 228–239 (2016)

  17. Goh, C.J., Yang, X.Q.: Duality in Optimization and Variational Inequalities. Taylor and Francis, London (2002)

  18. Ponstein, J.: Seven kinds of convexity. SIAM Rev. 9, 115–119 (1967)

  19. Bento, G.C., Cruz Neto, J.X., Oliveira, P.R., Soubeyran, A.: The self regulation problem as an inexact steepest descent method for multicriteria optimization. Eur. J. Oper. Res. 235, 494–502 (2014)

  20. Mangasarian, O.L.: Nonlinear Programming. SIAM, Philadelphia (1994)

  21. Bento, G.C., Cruz Neto, J.X., Santos, P.S.M.: An inexact steepest descent method for multicriteria optimization on Riemannian manifolds. J. Optim. Theory Appl. 159, 108–124 (2013)

  22. Durea, M., Strugariu, R., Tammer, C.: Scalarization in geometric and functional vector optimization revisited. J. Optim. Theory Appl. 159, 635–655 (2013)

  23. Huard, P.: Convergence of the reduced gradient method. In: Mangasarian, O.L., Meyer, R.R., Robinson, S.M. (eds.) Nonlinear Programming, pp. 29–59. Academic Press, New York (1975)

  24. Armijo, L.: Minimization of functions having Lipschitz continuous first partial derivatives. Pac. J. Math. 16, 1–3 (1966)

  25. Zitzler, E., Deb, K., Thiele, L.: Comparison of multiobjective evolutionary algorithms: empirical results. Evol. Comput. 8, 173–195 (2000)

  26. Bolintinéanu, S. (now Bonnel, H.), El Maghri, M.: Pénalisation dans l'optimisation sur l'ensemble faiblement efficient [Penalization in optimization over the weakly efficient set]. RAIRO Oper. Res. 31, 295–310 (1997)

  27. Bolintinéanu, S., El Maghri, M.: Second-order efficiency conditions and sensitivity of efficient points. J. Optim. Theory Appl. 98, 569–592 (1998)

  28. El Maghri, M., Bernoussi, B.: Pareto optimizing and Kuhn–Tucker stationary sequences. Numer. Funct. Anal. Optim. 28, 287–305 (2007)

Acknowledgements

The authors are grateful to the editor, Dr. Franco Giannessi, and to the reviewers for their valuable suggestions, which helped improve the quality of the paper.

Author information

Corresponding author

Correspondence to Mounir El Maghri.

About this article

Cite this article

El Maghri, M., Elboulqe, Y.: Reduced Jacobian Method. J Optim Theory Appl 179, 917–943 (2018). https://doi.org/10.1007/s10957-018-1362-x
