%0 Journal Article
%T A Global Convergence of LS-CD Hybrid Conjugate Gradient Method
%A Xiangfei Yang
%A Zhijun Luo
%A Xiaoyu Dai
%J Advances in Numerical Analysis
%D 2013
%I Hindawi Publishing Corporation
%R 10.1155/2013/517452
%X The conjugate gradient method is one of the most effective algorithms for solving unconstrained optimization problems. In this paper, a modified conjugate gradient method is presented and analyzed which is a hybridization of the known LS and CD conjugate gradient algorithms. Under some mild conditions, a Wolfe-type line search guarantees the global convergence of the LS-CD method. Numerical results show that the algorithm is efficient.

1. Introduction

Consider the following nonlinear programming problem: $\min_{x \in \mathbb{R}^n} f(x)$, where $\mathbb{R}^n$ denotes the $n$-dimensional Euclidean space and $f : \mathbb{R}^n \to \mathbb{R}$ is a continuously differentiable function. As is well known, the conjugate gradient method is a line search method that takes the form $x_{k+1} = x_k + \alpha_k d_k$, where $d_k$ is a descent direction of $f$ at $x_k$ and $\alpha_k$ is a stepsize obtained by some one-dimensional line search. If $x_k$ is the current iterate, we denote $f(x_k)$, $\nabla f(x_k)$, and $\nabla^2 f(x_k)$ by $f_k$, $g_k$, and $G_k$, respectively. If $G_k$ is available and invertible, then $d_k = -G_k^{-1} g_k$ leads to the Newton method, and $d_k = -g_k$ results in the steepest descent method [1]. The search direction $d_k$ is generally required to satisfy $g_k^T d_k < 0$, which guarantees that $d_k$ is a descent direction of $f$ at $x_k$ [2]. In order to guarantee global convergence, we sometimes require $d_k$ to satisfy a sufficient descent condition as follows: $g_k^T d_k \le -c \|g_k\|^2$, where $c > 0$ is a constant and $\|\cdot\|$ is the Euclidean norm.

In line search methods, the well-known conjugate gradient method has the following form: $d_k = -g_k + \beta_k d_{k-1}$, with $d_0 = -g_0$. Different conjugate gradient algorithms correspond to different choices of the parameter $\beta_k$, which can be defined by
$$\beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \quad \beta_k^{PRP} = \frac{g_k^T y_{k-1}}{\|g_{k-1}\|^2}, \quad \beta_k^{DY} = \frac{\|g_k\|^2}{d_{k-1}^T y_{k-1}}, \quad \beta_k^{CD} = -\frac{\|g_k\|^2}{d_{k-1}^T g_{k-1}}, \quad \beta_k^{LS} = -\frac{g_k^T y_{k-1}}{d_{k-1}^T g_{k-1}}, \quad \beta_k^{HS} = \frac{g_k^T y_{k-1}}{d_{k-1}^T y_{k-1}},$$
where $y_{k-1} = g_k - g_{k-1}$, or by other formulae. The corresponding methods are called the FR (Fletcher-Reeves) [3], PRP (Polak-Ribière-Polyak) [4, 5], DY (Dai-Yuan) [6], CD (conjugate descent) [7], LS (Liu-Storey) [8], and HS (Hestenes-Stiefel) [9] conjugate gradient methods, respectively. Although the above-mentioned conjugate gradient algorithms are equivalent to each other for minimizing strongly convex quadratic functions under exact line search, they perform differently when minimizing nonquadratic functions or when using inexact line searches. For a general objective function, the FR, DY, and CD methods have strong convergence properties, but they may have modest practical performance due to jamming. On the other hand, the PRP, LS, and HS methods may not be convergent in general, but they often have better computational performance. Touati-Ahmed and Storey [10] gave the first hybrid conjugate gradient algorithm; the method combines different conjugate gradient algorithms and was proposed mainly to avoid the jamming phenomenon. Recently, several new hybrid conjugate gradient methods have been given in [11–17]. Based on the new method, we
%U http://www.hindawi.com/journals/ana/2013/517452/
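For concreteness, the following minimal NumPy sketch collects the six classical $\beta_k$ formulas named in the introduction together with the generic direction update $d_k = -g_k + \beta_k d_{k-1}$. The function names and the choice of NumPy are illustrative assumptions on our part, not taken from the paper.

```python
import numpy as np

# Classical conjugate gradient parameters; y denotes y_{k-1} = g_k - g_{k-1}.
def beta_fr(g, g_prev, d_prev):
    # Fletcher-Reeves: ||g_k||^2 / ||g_{k-1}||^2
    return (g @ g) / (g_prev @ g_prev)

def beta_prp(g, g_prev, d_prev):
    # Polak-Ribiere-Polyak: g_k^T y_{k-1} / ||g_{k-1}||^2
    return (g @ (g - g_prev)) / (g_prev @ g_prev)

def beta_dy(g, g_prev, d_prev):
    # Dai-Yuan: ||g_k||^2 / (d_{k-1}^T y_{k-1})
    return (g @ g) / (d_prev @ (g - g_prev))

def beta_cd(g, g_prev, d_prev):
    # Conjugate Descent: -||g_k||^2 / (d_{k-1}^T g_{k-1})
    return -(g @ g) / (d_prev @ g_prev)

def beta_ls(g, g_prev, d_prev):
    # Liu-Storey: -g_k^T y_{k-1} / (d_{k-1}^T g_{k-1})
    return -(g @ (g - g_prev)) / (d_prev @ g_prev)

def beta_hs(g, g_prev, d_prev):
    # Hestenes-Stiefel: g_k^T y_{k-1} / (d_{k-1}^T y_{k-1})
    return (g @ (g - g_prev)) / (d_prev @ (g - g_prev))

def cg_direction(g, g_prev=None, d_prev=None, beta=beta_ls):
    # d_0 = -g_0; afterwards d_k = -g_k + beta_k * d_{k-1}.
    if d_prev is None:
        return -g
    return -g + beta(g, g_prev, d_prev) * d_prev
```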
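The excerpt ends before the paper's own definition of the LS-CD parameter and its Wolfe-type line search, so the sketch below is hypothetical in both respects: it checks the standard weak Wolfe conditions (the paper's "Wolfe-type" conditions may differ in detail), and it clips $\beta_k^{LS}$ into $[0, \beta_k^{CD}]$ in the style of Touati-Ahmed and Storey's PRP-FR hybrid, which may not be the paper's actual LS-CD rule.

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, alpha, rho=1e-4, sigma=0.9):
    # Standard weak Wolfe conditions with 0 < rho < sigma < 1;
    # an illustrative stand-in for the paper's Wolfe-type line search.
    gx_d = grad(x) @ d
    x_new = x + alpha * d
    sufficient_decrease = f(x_new) <= f(x) + rho * alpha * gx_d
    curvature = grad(x_new) @ d >= sigma * gx_d
    return sufficient_decrease and curvature

def beta_ls_cd_hybrid(g, g_prev, d_prev):
    # HYPOTHETICAL illustration: clip the LS parameter into [0, beta_CD].
    # The paper's actual LS-CD combination is defined in its Section 2,
    # which is not included in this excerpt.
    y = g - g_prev
    b_ls = -(g @ y) / (d_prev @ g_prev)
    b_cd = -(g @ g) / (d_prev @ g_prev)
    return max(0.0, min(b_ls, b_cd))

if __name__ == "__main__":
    f = lambda x: x @ x           # simple quadratic test function
    grad = lambda x: 2.0 * x
    x = np.array([1.0, -2.0])
    d = -grad(x)                  # steepest descent trial direction
    print(satisfies_wolfe(f, grad, x, d, alpha=0.25))  # True for this step
```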