%0 Journal Article
%T Sufficient Descent Conjugate Gradient Methods for Solving Convex Constrained Nonlinear Monotone Equations
%A San-Yang Liu
%A Yuan-Yuan Huang
%A Hong-Wei Jiao
%J Abstract and Applied Analysis
%D 2014
%I Hindawi Publishing Corporation
%R 10.1155/2014/305643
%X Two unified frameworks of some sufficient descent conjugate gradient methods are considered. Combined with the hyperplane projection method of Solodov and Svaiter, they are extended to solve convex constrained nonlinear monotone equations. Their global convergence is proven under some mild conditions. Numerical results illustrate that these methods are efficient and can be applied to solve large-scale nonsmooth equations. 1. Introduction. Consider the constrained monotone equations F(x) = 0, x ∈ C, (1) where F : R^n -> R^n is continuous and satisfies the monotonicity condition (F(x) - F(y))^T (x - y) ≥ 0 for all x, y ∈ R^n, and C ⊆ R^n is a nonempty closed convex set. Under these conditions, the solution set of problem (1) is convex [1]. This problem has many applications, such as the power flow equation [2, 3] and some variational inequality problems, which can be converted into (1) by means of fixed point maps or normal maps if the underlying function satisfies some coercive conditions [4]. In recent years, the study of iterative methods for solving problem (1) with C = R^n has received much attention. The pioneering work was introduced by Solodov and Svaiter in [5]: the proposed method, called an inexact Newton method, combines elements of the Newton method, the proximal point method, and a projection strategy, and requires that F be differentiable. Its convergence was proven without any regularity assumptions, and a further study of its convergence properties was given by Zhou and Toh [6]. Then, utilizing the projection strategy in [5], Zhou and Li extended the BFGS methods [7] and the limited memory BFGS methods [8] to solve problem (1) with C = R^n. A significant improvement is that these methods converge globally without requiring the differentiability of F. Conjugate gradient methods are another class of numerical methods [9–15], after spectral gradient methods [16–18], extended to solve problem (1), and the study of this aspect is just catching up. As is well known, conjugate gradient methods are very efficient for solving the large-scale unconstrained nonlinear optimization problem min f(x), x ∈ R^n, where f is smooth, owing to their simple iterations and low memory requirements. In [19], they were divided into three categories: early conjugate gradient methods, descent conjugate gradient methods, and sufficient descent conjugate gradient methods. Early conjugate gradient methods rarely ensure the (sufficient) descent condition g_k^T d_k ≤ -c ||g_k||^2 with c > 0, where g_k is the gradient of f at x_k (the kth iterate) and d_k is a search direction, while the latter two categories always satisfy the descent property. One well-known sufficient descent conjugate gradient method, namely CG_DESCENT, was presented by Hager and Zhang.
%U http://www.hindawi.com/journals/aaa/2014/305643/
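
Note: the following is a minimal, illustrative Python sketch of the kind of method the abstract describes, namely a conjugate-gradient-type search direction combined with the Solodov-Svaiter hyperplane projection step for F(x) = 0, x ∈ C. The direction update (a PRP-style coefficient), the line-search rule, the restart safeguard, and all parameter values are generic assumptions for demonstration only, not the specific schemes proposed in the paper.

    import numpy as np

    def solve_monotone(F, project, x0, beta=1.0, rho=0.5, sigma=1e-4,
                       tol=1e-6, max_iter=1000):
        # Projection-type method for monotone equations (illustrative sketch):
        # CG-style direction + Solodov-Svaiter hyperplane projection step.
        x = np.asarray(x0, dtype=float)
        Fx = F(x)
        d = -Fx                              # first direction: -F(x0)
        for k in range(max_iter):
            if np.linalg.norm(Fx) <= tol:
                return x, k
            # Backtracking line search: find alpha with
            # -F(x + alpha*d)^T d >= sigma * alpha * ||d||^2
            alpha = beta
            while True:
                z = x + alpha * d
                Fz = F(z)
                if -Fz.dot(d) >= sigma * alpha * d.dot(d):
                    break
                alpha *= rho
                if alpha < 1e-12:            # safeguard against a failed search
                    break
            # Project x onto the hyperplane separating x from the solution set,
            # then back onto the feasible set C.
            denom = Fz.dot(Fz)
            if denom > 0.0:
                xi = Fz.dot(x - z) / denom
                x = project(x - xi * Fz)
            Fx_new = F(x)
            # PRP-style coefficient with F playing the role of the gradient.
            beta_k = Fx_new.dot(Fx_new - Fx) / max(Fx.dot(Fx), 1e-30)
            d = -Fx_new + beta_k * d
            # Restart if the sufficient descent property F_k^T d_k < 0 fails.
            if Fx_new.dot(d) > -1e-10 * Fx_new.dot(Fx_new):
                d = -Fx_new
            Fx = Fx_new
        return x, max_iter

    # Usage: F(x) = 2x + sin(x) is monotone; C is the nonnegative orthant,
    # so projection is a simple componentwise clipping at zero.
    F = lambda x: 2.0 * x + np.sin(x)
    project = lambda x: np.maximum(x, 0.0)
    x, iters = solve_monotone(F, project, x0=np.full(5, 3.0))
    print(iters, np.linalg.norm(F(x)))

The example drives ||F(x_k)|| to (near) zero for 2x + sin(x) = 0 on the nonnegative orthant; any other monotone F and closed convex C with a computable projection could be substituted.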