A Greedy Randomized Coordinate Descent Method for Solving Large Linear Least Squares Problems

DOI: 10.12677/aam.2024.136267, pp. 2780-2790

Keywords: Least Squares Problem, Greedy Randomized Coordinate Descent Method, Relaxation Parameter


Abstract:

The greedy randomized coordinate descent method (GRCD) is one of the effective iterative methods for solving large linear least squares problems. In this paper, a relaxation parameter is introduced into the GRCD algorithm to construct a parameterized greedy randomized coordinate descent method. The method is proved to converge in expectation when the coefficient matrix of the linear least squares problem has full column rank. Numerical experiments show that, when the relaxation parameter is chosen appropriately, the proposed algorithm outperforms the GRCD method in both iteration counts and computing time.
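To make the idea concrete, the sketch below implements a GRCD-type iteration with a relaxation parameter theta in Python. It follows the standard GRCD framework (column-greedy selection driven by the normal-equation residual s = A^T(b - Ax)); the function name relaxed_grcd, the default theta = 0.5, and the exact form of the relaxed greedy threshold are illustrative assumptions, since the paper's precise parameterized rule is not reproduced on this page.

```python
import numpy as np

def relaxed_grcd(A, b, theta=0.5, max_iter=100_000, tol=1e-10, seed=0):
    """GRCD-type iteration for min_x ||b - A x||_2 with relaxation theta in [0, 1].

    theta controls how aggressive the greedy column selection is; theta = 0.5
    mirrors the classical GRCD threshold (assumed form, for illustration only).
    """
    rng = np.random.default_rng(seed)
    _, n = A.shape
    col_norms2 = np.einsum('ij,ij->j', A, A)   # squared column norms ||A_j||^2
    fro2 = col_norms2.sum()                    # ||A||_F^2
    x = np.zeros(n)
    r = b.astype(float)                        # residual r = b - A x
    for _ in range(max_iter):
        s = A.T @ r                            # normal-equation residual A^T (b - A x)
        s_norm2 = s @ s
        if s_norm2 <= tol:                     # stop once A^T r is numerically zero
            break
        # Relaxed greedy threshold: a theta-weighted combination of the largest
        # normalized entry of s and the Frobenius-norm average. theta = 1 keeps
        # only the best column; theta = 0 admits every column.
        eps = theta * np.max(s * s / col_norms2) / s_norm2 + (1.0 - theta) / fro2
        U = np.flatnonzero(s * s >= eps * s_norm2 * col_norms2)
        p = s[U] ** 2
        j = rng.choice(U, p=p / p.sum())       # sample a column from the greedy set
        step = s[j] / col_norms2[j]            # exact minimizer along coordinate j
        x[j] += step
        r -= step * A[:, j]                    # keep the residual consistent with x
    return x
```

As a usage check, relaxed_grcd(A, b, theta=0.7) on a random overdetermined system such as A = np.random.randn(5000, 100) should agree with np.linalg.lstsq(A, b, rcond=None)[0] to within the tolerance. Larger theta shrinks the greedy set toward the single best column, which, consistent with the abstract's experiments, can reduce iteration counts at a slightly higher cost per step.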

