Two unified frameworks of sufficient descent conjugate gradient methods are considered. Combined with the hyperplane projection method of Solodov and Svaiter, they are extended to solve convex constrained nonlinear monotone equations. Their global convergence is proven under mild conditions. Numerical results illustrate that these methods are efficient and can be applied to solve large-scale nonsmooth equations.

1. Introduction

Consider the constrained monotone equations

F(x) = 0,  x ∈ C,  (1)

where F : ℝ^n → ℝ^n is continuous and satisfies the monotonicity condition

(F(x) − F(y))^T (x − y) ≥ 0  for all x, y ∈ ℝ^n,

and C ⊆ ℝ^n is a nonempty closed convex set. Under these conditions, the solution set of problem (1) is convex. This problem has many applications, such as the power flow equation [2, 3] and certain variational inequality problems, which can be converted into (1) by means of fixed-point maps or normal maps when the underlying function satisfies suitable coercivity conditions.

In recent years, the study of iterative methods for solving problem (1) with C = ℝ^n has received much attention. The pioneering work was introduced by Solodov and Svaiter, whose inexact Newton method combines elements of the Newton method, the proximal point method, and a projection strategy, and requires F to be differentiable. Its convergence was proven without any regularity assumptions, and its convergence properties were studied further by Zhou and Toh. Utilizing the same projection strategy, Zhou and Li extended the BFGS method and the limited memory BFGS method to solve problem (1) with C = ℝ^n. A significant improvement is that these methods converge globally without requiring the differentiability of F. Conjugate gradient methods [9–15] are another class of numerical methods, after the spectral gradient methods [16–18], that have been extended to solve problem (1), and the study of this aspect is just catching up.
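The projection strategy referenced above can be sketched in code. Below is a minimal sketch of the Solodov–Svaiter hyperplane projection framework for problem (1); the function names, the simple choice d_k = −F(x_k), and the line search parameters are our own illustrative assumptions, not the specific methods studied in this paper, which build d_k from conjugate gradient formulas instead.

```python
import numpy as np

def solve_monotone(F, proj, x0, sigma=1e-4, rho=0.5, tol=1e-8, max_iter=1000):
    """Hyperplane projection framework for F(x) = 0, x in C (C given by proj)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx  # illustrative steepest-descent-like direction
        # backtracking line search for the Solodov-Svaiter condition:
        #   -F(x + alpha*d)^T d >= sigma * alpha * ||d||^2
        alpha = 1.0
        while -(F(x + alpha * d) @ d) < sigma * alpha * (d @ d) and alpha > 1e-12:
            alpha *= rho
        z = x + alpha * d  # trial point on the ray
        Fz = F(z)
        if np.linalg.norm(Fz) <= tol:
            x = z
            break
        # by monotonicity, F(z) defines a hyperplane separating x from the
        # solution set; project x onto it, then back onto the feasible set C
        x = proj(x - (Fz @ (x - z)) / (Fz @ Fz) * Fz)
    return x

# toy monotone mapping F(x) = x + sin(x) (zero at the origin), C = nonnegative orthant
x_star = solve_monotone(lambda v: v + np.sin(v),
                        lambda v: np.maximum(v, 0.0),
                        np.ones(5))
```

The projection step is what removes the differentiability requirement: only function values of F at x_k and z_k are needed, which is why the framework extends to nonsmooth equations.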
As is well known, conjugate gradient methods are very efficient for solving the large-scale unconstrained optimization problem

min f(x),  x ∈ ℝ^n,

where f : ℝ^n → ℝ is smooth, owing to their simple iterations and low memory requirements. They can be divided into three categories: early conjugate gradient methods, descent conjugate gradient methods, and sufficient descent conjugate gradient methods. Early conjugate gradient methods rarely ensure the descent condition

g_k^T d_k < 0

or the sufficient descent condition

g_k^T d_k ≤ −c‖g_k‖²  for some constant c > 0,

where g_k denotes the gradient of f at the kth iterate x_k and d_k is a search direction, while the latter two categories always satisfy the descent property. One well-known sufficient descent conjugate gradient method, namely CG_DESCENT, was presented by Hager and Zhang.
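The defining feature of CG_DESCENT is that its direction satisfies the sufficient descent condition with c = 7/8 regardless of the line search. A minimal sketch of the Hager–Zhang direction update follows; the function name and the numerical check are ours:

```python
import numpy as np

def hz_direction(g_new, g_old, d_old):
    """Hager-Zhang (CG_DESCENT) direction: d = -g_new + beta * d_old, where
    beta = (y - 2*d_old*||y||^2/(d_old^T y))^T g_new / (d_old^T y), y = g_new - g_old."""
    y = g_new - g_old
    dty = d_old @ y
    beta = (y - 2.0 * d_old * (y @ y) / dty) @ g_new / dty
    return -g_new + beta * d_old

# the bound g^T d <= -(7/8)||g||^2 holds algebraically for any gradients and
# previous direction with d_old^T y != 0, which we spot-check on random data
rng = np.random.default_rng(0)
for _ in range(1000):
    g_new, g_old, d_old = rng.standard_normal((3, 10))
    d_new = hz_direction(g_new, g_old, d_old)
    assert d_new @ g_new <= -7.0 / 8.0 * (g_new @ g_new) + 1e-8
```

The bound follows from the inequality a·(y^T g) ≤ (1/8)‖g‖² + 2a²‖y‖² with a = d_old^T g_new / (d_old^T y), which is why no line search condition is needed to guarantee sufficient descent.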
M. V. Solodov and B. F. Svaiter, “A globally convergent inexact Newton method for systems of monotone equations,” in Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, M. Fukushima and L. Qi, Eds., vol. 22, pp. 355–369, Kluwer Academic Publishers, Dordrecht, The Netherlands, 1999.
Y. Xiao and H. Zhu, “A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing,” Journal of Mathematical Analysis and Applications, vol. 405, no. 1, pp. 310–319, 2013.
W. Cheng, Y. Xiao, and Q.-J. Hu, “A family of derivative-free conjugate gradient methods for large-scale nonlinear systems of equations,” Journal of Computational and Applied Mathematics, vol. 224, no. 1, pp. 11–19, 2009.
Z. Yu, J. Lin, J. Sun, Y. Xiao, L. Liu, and Z. Li, “Spectral gradient projection method for monotone nonlinear equations with convex constraints,” Applied Numerical Mathematics, vol. 59, no. 10, pp. 2416–2423, 2009.
Y. H. Dai, “Nonlinear conjugate gradient methods,” in Wiley Encyclopedia of Operations Research and Management Science, J. J. Cochran, L. A. Cox Jr., P. Keskinocak, J. P. Kharoufeh, and J. C. Smith, Eds., John Wiley & Sons, 2011.
G. Yu, L. Guan, and W. Chen, “Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization,” Optimization Methods & Software, vol. 23, no. 2, pp. 275–293, 2008.
G. Yu, S. Niu, and J. Ma, “Multivariate spectral gradient projection method for nonlinear monotone equations with convex constraints,” Journal of Industrial and Management Optimization, vol. 9, no. 1, pp. 117–129, 2013.