
A Non-monotone Adaptive Scaled Gradient Projection Method for Orthogonality Constrained Problems

  • Original Paper
  • Published:
International Journal of Applied and Computational Mathematics

Abstract

Optimization problems with orthogonality constraints are classical nonconvex nonlinear problems that arise widely in science and engineering. To solve such problems, we propose an adaptive scaled gradient projection method. The method combines a scaling matrix that depends on the step size with parameters that control the search direction. In addition, we adopt the Barzilai–Borwein (BB) step size together with a nonmonotone line search technique to accelerate the convergence of the proposed algorithm. We prove convergence of the algorithm under this nonmonotone line search, and numerical results demonstrate its efficiency.
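The abstract refers to the BB step size and a nonmonotone line search without stating them. As background only, the following is a minimal sketch of the standard Barzilai–Borwein step sizes and a Grippo-type nonmonotone Armijo condition; the notation (S_{k-1}, Y_{k-1}, D_k, X_k(\tau), M, \delta) is assumed here rather than taken from the paper, and these are not necessarily the exact scaled variants the authors use.

Let S_{k-1} = X_k - X_{k-1} and Y_{k-1} = \nabla f(X_k) - \nabla f(X_{k-1}). The two classical BB step sizes are

\[
  \tau_k^{\mathrm{BB1}} = \frac{\langle S_{k-1},\, S_{k-1}\rangle}{\bigl|\langle S_{k-1},\, Y_{k-1}\rangle\bigr|},
  \qquad
  \tau_k^{\mathrm{BB2}} = \frac{\bigl|\langle S_{k-1},\, Y_{k-1}\rangle\bigr|}{\langle Y_{k-1},\, Y_{k-1}\rangle},
\]

and a Grippo-type nonmonotone Armijo rule accepts a trial point X_k(\tau), generated along the (scaled, projected) descent direction D_k, whenever

\[
  f\bigl(X_k(\tau)\bigr) \;\le\; \max_{0 \le j \le \min(k,\, M)} f(X_{k-j}) \;+\; \delta\, \tau\, \bigl\langle \nabla f(X_k),\, D_k \bigr\rangle,
  \qquad \delta \in (0,1),
\]

where M bounds how many previous function values enter the reference value; M = 0 recovers the usual monotone Armijo condition. Comparing against this running maximum rather than f(X_k) allows the typically nonmonotone BB step to be accepted far more often, which is the usual motivation for pairing the two techniques.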


Data Availability

Not Applicable.


Acknowledgements

The authors are grateful to the anonymous reviewers for their comments and suggestions.

Funding

This work was supported by the National Natural Science Foundation of China (Grant No. 12171042).

Author information

Authors and Affiliations

Authors

Contributions

Q wrote and debugged the code for the algorithm and put forward the main idea of this work. Both authors wrote and revised the manuscript together.

Corresponding author

Correspondence to Qinghua Zhou.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical Approval

Not Applicable.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Ji, Q., Zhou, Q. A Non-monotone Adaptive Scaled Gradient Projection Method for Orthogonality Constrained Problems. Int. J. Appl. Comput. Math 10, 89 (2024). https://doi.org/10.1007/s40819-024-01689-6


  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1007/s40819-024-01689-6
