
A Dai-Yuan Memory Gradient Method for Solving Unconstrained Optimization Problems

Keywords: Dai-Yuan conjugate gradient method, memory gradient method, Wolfe line search, convergence


Abstract:

将DaiYuan共轭梯度法的前提条件βk>0改为βk<0,根据搜索方向的下降性要求,得出一个新的记忆梯度法,并做出了收敛性证明.新算法与DaiYuan共轭梯度法联系紧密.数值实验表明了该算法的有效性.

References

[1]  Dai Y H, Yuan Y. A nonlinear conjugate gradient method with a strong global convergence property[J]. SIAM J Optimization, 2000, 10: 177-182.
[2]  Shi Z J. A new memory gradient method under exact line search[J]. Asia-Pacific J Operational Research, 2003, 20: 275-284.
[3]  Ming Qinghe. On the convergence of super-memory gradient algorithms[J]. Journal of Qufu Normal University (Natural Science), 2004, 30(1): 40-42.
[4]  Gong Jinfeng, Yu Xianwei. A memory gradient method with a new parameter[J]. Journal of Southwest China Normal University (Natural Science Edition), 2007, 32(6): 33-36.
[5]  Shi Zhenjun. Convergence of memory gradient methods with Wolfe line search[J]. Acta Mathematicae Applicatae Sinica, 2006, 29(1): 9-18.
[6]  Tang Jingyong, Dong Li, Zhang Xiujun. A new class of memory gradient methods with Wolfe line search[J]. Journal of Shandong University (Natural Science), 2009, 44(7): 33-37.
[7]  Miele A, Cantrell J W. Study on a memory gradient method for the minimization of functions[J]. J Optim Theory Appl, 1969, 3(6): 459-470.
[8]  Cantrell J W. Relation between the memory gradient method and the Fletcher-Reeves method[J]. J Optim Theory Appl, 1969, 4(1): 67-71.
[9]  Tang Jingyong, Shi Zhenjun. A new class of memory gradient methods with strong Wolfe line search[J]. Journal of Qufu Normal University (Natural Science), 2005, 31(2): 24-28.
[10]  Tang Jingyong, Dong Li. A new class of nonmonotone memory gradient method and its global convergence[J]. Mathematical Theory and Applications, 2009, 29(2): 58.
[11]  Zoutendijk G. Nonlinear programming, computational methods[C]//Abadie J. Integer and Nonlinear Programming. Amsterdam: North-Holland, 1970: 37-86.
[12]  Al-Baali M. Descent property and global convergence of the Fletcher-Reeves method with inexact line search[J]. IMA J Numer Anal, 1985, 5: 121-124.
[13]  Gilbert J C, Nocedal J. Global convergence properties of conjugate gradient methods for optimization[J]. SIAM J Optimization, 1992, 2(1): 21-42.
