Abstract
The block Gauss–Seidel algorithm can significantly outperform the simple randomized Gauss–Seidel algorithm for solving overdetermined least-squares problems, since it moves a large block of columns, rather than a single column, into working memory at each iteration. Here, using the maximum residual rule, we construct a two-step Gauss–Seidel (2SGS) algorithm that selects two different columns simultaneously at each iteration. As a natural extension of 2SGS, we further propose a multi-step variant, the maximum residual block Gauss–Seidel (MRBGS) algorithm, for solving overdetermined least-squares problems. We prove that both algorithms converge to the unique solution of the overdetermined least-squares problem whenever the coefficient matrix has full column rank. Numerical experiments on Gaussian models and on 2D image reconstruction problems show that 2SGS is more effective than the greedy randomized Gauss–Seidel algorithm, and that MRBGS clearly outperforms both the greedy and the randomized block Gauss–Seidel algorithms.
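To make the selection rule concrete: a maximum-residual block Gauss–Seidel step picks the block of columns whose correlations with the current residual, i.e. the entries of \(A^T(b - Ax)\), are largest in magnitude, and then minimizes the residual exactly over those coordinates. The following is a minimal NumPy sketch of this idea, not the authors' exact method (the paper's selection and normalization details may differ); with `block_size=2` it mimics the two-column (2SGS-style) update, and larger blocks give the MRBGS-style update.

```python
import numpy as np

def mrbgs(A, b, block_size=2, max_iter=5000, tol=1e-10):
    """Maximum-residual block Gauss-Seidel sketch for min ||Ax - b||_2,
    assuming A has full column rank.  Each iteration updates the block of
    columns with the largest entries of |A^T r| by solving a small
    least-squares subproblem on that block."""
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x                       # current residual
    for _ in range(max_iter):
        g = A.T @ r                     # correlation of each column with r
        if np.linalg.norm(g) < tol:     # normal equations nearly satisfied
            break
        # indices of the block_size columns with maximal |A^T r| entries
        J = np.argsort(np.abs(g))[-block_size:]
        # exact minimization over the selected coordinates:
        # d = argmin_d ||r - A_J d||_2,  then x_J <- x_J + d
        dJ = np.linalg.lstsq(A[:, J], r, rcond=None)[0]
        x[J] += dJ
        r -= A[:, J] @ dJ               # incremental residual update
    return x
```

The incremental residual update avoids recomputing `b - A @ x` from scratch, so each iteration costs roughly one matrix–vector product with `A.T` plus a small `m × block_size` least-squares solve.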
Acknowledgements
This work was supported by the National Natural Science Foundation of China (11371243) and the Key Program of Natural Science of Changzhou College of Information Technology (CXKZ201908Z).
Cite this article
Liu, Y., Jiang, XL. & Gu, CQ. On maximum residual block and two-step Gauss–Seidel algorithms for linear least-squares problems. Calcolo 58, 13 (2021). https://doi.org/10.1007/s10092-021-00404-x
Keywords
- Block Gauss–Seidel algorithm
- Least-squares problems
- Maximum residual block Gauss–Seidel algorithm
- Image reconstruction