%0 Journal Article
%T The Two-Step Regularized Newton Method for Non-Convex Optimization Problems
%A 朱俊霖
%J Advances in Applied Mathematics
%P 3651-3664
%@ 2324-8009
%D 2023
%I Hans Publishing
%R 10.12677/AAM.2023.128363
%X In this paper, we propose a two-step regularized Newton algorithm for solving non-convex unconstrained optimization problems within a trust-region framework. Under appropriate conditions, we prove that the method is locally convergent, and under a local error bound condition it achieves a third-order convergence rate. Numerical experiments are also conducted; the results show that the two-step regularized Newton method requires fewer iterations and converges faster than the single-step regularized Newton method, indicating its higher efficiency.
%K Non-Convex Optimization
%K Regularized Newton Method
%K Local Error Bound
%K Trust Region
%U http://www.hanspub.org/journal/PaperInformation.aspx?PaperID=70967