
OALib Journal
ISSN: 2333-9721



Inexact Acceleration Algorithm Based on ALM

DOI: 10.12677/aam.2025.144137, PP. 33-43

Keywords: Augmented Lagrangian Method (ALM), Nesterov Acceleration, Duality Theory


Abstract:

The Augmented Lagrangian Method (ALM) is a classical and effective first-order technique for solving convex optimization problems with linear equality constraints. The algorithm converges to the optimal solution by alternately updating the primal and dual variables. In practical applications, however, the primal-variable subproblem often cannot be solved exactly. Building on the ALM, dual optimization, and Nesterov's acceleration technique, this paper proposes an inexact version of the ALM and, by analyzing the dual residual via the Karush-Kuhn-Tucker (KKT) conditions, proves that the convergence rate of the proposed algorithm reaches O(1/k²).
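The scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the objective f(x) = ½‖x − c‖² is a toy choice, and the step sizes, momentum rule, and fixed inner-iteration count are assumptions made for the sketch. The primal subproblem is solved inexactly with a few gradient steps, and the Nesterov-style extrapolation is applied to the dual variable, mirroring the alternating primal/dual updates of the ALM.

```python
import numpy as np

def inexact_accelerated_alm(A, b, c, rho=1.0, outer_iters=200, inner_iters=5):
    """Sketch of an inexact ALM with Nesterov-style dual extrapolation.

    Solves  min_x 0.5*||x - c||^2  s.t.  Ax = b  (a toy objective; the
    paper treats general convex f).  The x-subproblem is minimized only
    approximately, with a few warm-started gradient steps.
    """
    m, n = A.shape
    x = np.zeros(n)
    lam = np.zeros(m)          # dual variable (Lagrange multiplier)
    lam_prev = lam.copy()
    t = 1.0                    # Nesterov momentum parameter
    # Lipschitz bound for the augmented Lagrangian's gradient in x
    L = 1.0 + rho * np.linalg.norm(A, 2) ** 2
    for _ in range(outer_iters):
        # Nesterov extrapolation on the dual variable
        t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
        lam_hat = lam + ((t - 1) / t_next) * (lam - lam_prev)
        # inexact primal update: a few gradient steps on L_rho(x, lam_hat)
        for _ in range(inner_iters):
            grad = (x - c) + A.T @ (lam_hat + rho * (A @ x - b))
            x -= grad / L
        # dual ascent step taken from the extrapolated point
        lam_prev, lam = lam, lam_hat + rho * (A @ x - b)
        t = t_next
    return x, lam
```

On the problem min ½‖x‖² subject to x₁ + x₂ = 1, whose solution is x = (0.5, 0.5), the iterates converge to the constrained optimum even though each subproblem is solved with only five gradient steps, illustrating the inexactness the paper analyzes.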


