%0 Journal Article
%T 求解约束极小极大问题的隐式梯度加速方法
%T Accelerated Implicit Gradient-Based Methods for Solving Constrained Minimax Problems
%A 胡清莹
%J Advances in Applied Mathematics
%P 1035-1050
%@ 2324-8009
%D 2025
%I Hans Publishing
%R 10.12677/aam.2025.144225
%X The basic idea of the implicit gradient-based (GBAL) algorithm for solving constrained minimax problems is to handle the inner optimization problem with the augmented Lagrangian method and then update the outer variables iteratively using implicit gradient information. Building on this idea, the paper proposes an accelerated implicit gradient algorithm for constrained minimax problems that improves performance by updating the outer variables with a variant of Nesterov's accelerated gradient method. Theoretical analysis shows that, when the solution mapping of the inner problem is Lipschitz continuous and the objective function is convex in the outer variables, the proposed accelerated algorithm achieves an R-linear convergence rate. Numerical experiments confirm that the accelerated algorithm delivers superior performance in both computational efficiency and convergence behavior.
%K Minimax Optimization
%K Nonlinear Constraints
%K Gradient-Based Methods
%K Nesterov Accelerated Gradient Algorithm
%U http://www.hanspub.org/journal/PaperInformation.aspx?PaperID=113399
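The abstract outlines a two-level scheme: an augmented-Lagrangian solve of the constrained inner maximization, an implicit (Danskin-style) gradient step on the outer variable, and a Nesterov-type momentum update of that variable. The Python sketch below illustrates this general structure on a toy saddle problem; it is not the paper's implementation. The test problem, step sizes, penalty parameter, momentum constant, and all function names (e.g. inner_al_solve) are assumptions made purely for illustration, and the exact GBAL update rule and Nesterov variant analyzed in the paper may differ.

```python
# Illustrative sketch (assumed setup, not the authors' code) of an accelerated
# implicit-gradient scheme: augmented-Lagrangian inner solver + Nesterov-type
# momentum on the outer variable.
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 5
A = rng.standard_normal((n, m))
c, d, r = 1.0, 1.0, 1.0            # strong convexity/concavity constants and ball radius

def f(x, y):                        # toy saddle function: convex in x, concave in y
    return x @ A @ y + 0.5 * c * x @ x - 0.5 * d * y @ y

def grad_x(x, y):
    return A @ y + c * x

def grad_y(x, y):
    return A.T @ x - d * y

def g(y):                           # inner inequality constraint  g(y) = ||y||^2 - r^2 <= 0
    return y @ y - r ** 2

def inner_al_solve(x, y, lam, rho=5.0, steps=100, lr=0.02):
    """Approximately solve  max_y f(x, y)  s.t. g(y) <= 0  by gradient ascent on the
    augmented Lagrangian, followed by the classical multiplier update."""
    for _ in range(steps):
        active = max(0.0, lam + rho * g(y))       # penalty contributes only when positive
        grad = grad_y(x, y) - active * 2.0 * y    # gradient of the augmented Lagrangian in y
        y = y + lr * grad                         # ascent step (inner problem is a maximization)
    lam = max(0.0, lam + rho * g(y))              # multiplier update
    return y, lam

# Outer loop: Nesterov-type accelerated descent on x using the implicit gradient.
# At an (approximately) exact inner maximizer, Danskin's theorem lets us ignore
# dy*/dx, so the gradient of the value function reduces to grad_x f(x, y*(x)).
x = rng.standard_normal(n)
z = x.copy()                                      # extrapolated (look-ahead) point
y = np.zeros(m)
lam = 0.0
eta, beta = 0.05, 0.5                             # step size and momentum (assumed values)

for k in range(200):
    y, lam = inner_al_solve(z, y, lam)            # inner AL solve at the look-ahead point
    x_new = z - eta * grad_x(z, y)                # implicit-gradient step on the outer variable
    z = x_new + beta * (x_new - x)                # Nesterov momentum extrapolation
    x = x_new

print("final ||x|| =", np.linalg.norm(x), "  ||y|| =", np.linalg.norm(y))
```

Warm-starting y and the multiplier across outer iterations, as done above, is a common practical choice; the convexity in x and strong concavity in y of this toy objective mirror the assumptions under which the abstract reports R-linear convergence.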