
Sufficient Descent Conjugate Gradient Methods for Solving Convex Constrained Nonlinear Monotone Equations

DOI: 10.1155/2014/305643



Two unified frameworks of sufficient descent conjugate gradient methods are considered. Combined with the hyperplane projection method of Solodov and Svaiter, they are extended to solve convex constrained nonlinear monotone equations. Their global convergence is proven under mild conditions. Numerical results illustrate that these methods are efficient and can be applied to solve large-scale nonsmooth equations.

1. Introduction

Consider the constrained monotone equations

F(x) = 0,  x ∈ C, (1)

where F : R^n → R^n is continuous and satisfies the monotonicity condition

(F(x) − F(y))^T (x − y) ≥ 0  for all x, y ∈ R^n,

and C ⊆ R^n is a nonempty closed convex set. Under these conditions, the solution set of problem (1) is convex [1]. This problem has many applications, such as the power flow equation [2, 3] and some variational inequality problems, which can be converted into (1) by means of fixed point maps or normal maps if the underlying function satisfies certain coercivity conditions [4]. In recent years, the study of iterative methods for solving problem (1) with C = R^n has received much attention. The pioneering work was introduced by Solodov and Svaiter [5], where the proposed method, called the inexact Newton method, combines elements of the Newton method, the proximal point method, and a projection strategy, and requires F to be differentiable. Its convergence was proven without any regularity assumptions, and a further study of its convergence properties was given by Zhou and Toh [6]. Utilizing the projection strategy of [5], Zhou and Li extended the BFGS method [7] and the limited memory BFGS method [8] to solve problem (1) with C = R^n. A significant improvement is that these methods converge globally without requiring the differentiability of F. Conjugate gradient methods [9–15] are another class of numerical methods, after spectral gradient methods [16–18], extended to solve problem (1), and the study of this aspect is just catching up.
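The hyperplane projection idea of Solodov and Svaiter can be sketched in a few lines: take a trial step along a direction, use the monotonicity of F to separate the current iterate from the solution set by the hyperplane through the trial point, and project back onto the feasible set. The following minimal sketch uses the simplest direction d_k = −F(x_k) as a placeholder (the paper replaces it with sufficient descent conjugate gradient directions), and a box set C so that the projection P_C is a clip; the function names and the test problem F(x) = x − 2 are illustrative assumptions, not the paper's method.

```python
import numpy as np

def projection_method(F, x0, proj, sigma=1e-4, beta=1.0, rho=0.5,
                      tol=1e-8, max_iter=500):
    """Hyperplane projection method of Solodov-Svaiter type (sketch).

    F    : continuous monotone mapping, F(x) -> ndarray
    x0   : starting point inside the closed convex set C
    proj : projection operator P_C onto C
    The direction d_k = -F(x_k) below is only a placeholder;
    sufficient descent CG directions would be substituted here.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx                                  # placeholder direction
        # Backtracking line search: find alpha = beta * rho^m with
        #   -F(x + alpha d)^T d >= sigma * alpha * ||d||^2
        alpha = beta
        while -F(x + alpha * d) @ d < sigma * alpha * (d @ d):
            alpha *= rho
        z = x + alpha * d                        # trial point
        Fz = F(z)
        # By monotonicity, the hyperplane {y : Fz^T (y - z) = 0}
        # separates x from the solution set; project x onto it,
        # then back onto C.
        lam = Fz @ (x - z) / (Fz @ Fz)
        x = proj(x - lam * Fz)
    return x

# Illustrative problem: F(x) = x - 2 (monotone), C = [1, 5]^2,
# so the constrained solution is (2, 2).
proj_box = lambda y: np.clip(y, 1.0, 5.0)
x_star = projection_method(lambda x: x - 2.0, np.array([4.0, 3.0]), proj_box)
```

A key feature, emphasized in [5], is that the projection step needs only continuity and monotonicity of F, which is why the later BFGS and conjugate gradient extensions dispense with differentiability.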
As is well known, conjugate gradient methods are very efficient for solving the large-scale unconstrained optimization problem min f(x), x ∈ R^n, where f is smooth, owing to their simple iterations and low memory requirements. In [19], they were divided into three categories: early conjugate gradient methods, descent conjugate gradient methods, and sufficient descent conjugate gradient methods. Early conjugate gradient methods rarely ensure the (sufficient) descent condition g_k^T d_k ≤ −c ||g_k||^2 with c > 0, where g_k is the gradient of f at the kth iterate x_k and d_k is a search direction, while the latter two categories always satisfy the descent property. One well-known sufficient descent conjugate gradient method, namely CG_DESCENT, was presented by Hager and Zhang [20, 21].
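To make the sufficient descent property concrete, the sketch below implements the CG_DESCENT direction of Hager and Zhang [20, 21], whose update satisfies g_k^T d_k ≤ −(7/8) ||g_k||^2 whenever d_{k−1}^T y_{k−1} ≠ 0, independently of the line search. The function name and the random test vectors are illustrative assumptions; only the update formula and the 7/8 bound come from [20].

```python
import numpy as np

def hz_direction(g_new, d, y):
    """Hager-Zhang (CG_DESCENT) search direction (sketch).

    g_new : gradient at the new iterate
    d     : previous search direction
    y     : gradient difference g_new - g_old
    Requires d @ y != 0.  The returned direction satisfies the
    sufficient descent condition
        g_new^T d_new <= -(7/8) * ||g_new||^2
    for any line search (Theorem 1.1 in [20]).
    """
    dy = d @ y
    beta = (y - 2.0 * d * (y @ y) / dy) @ g_new / dy
    return -g_new + beta * d

# Numerical check of the 7/8 bound on random data.
rng = np.random.default_rng(0)
g_old, g_new = rng.standard_normal(5), rng.standard_normal(5)
d_old = -g_old                       # first direction: steepest descent
d_new = hz_direction(g_new, d_old, g_new - g_old)
# sufficient descent holds: g_new @ d_new <= -(7/8) * ||g_new||^2
```

The fact that this bound holds regardless of the step length is precisely what makes such directions attractive for the derivative-free projection framework above, where no merit function is available to enforce descent through the line search.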


[1]  J. M. Ortega and W. C. Rheinboldt, Iterative Solution of Nonlinear Equations in Several Variables, Academic Press, New York, NY, USA, 1970.
[2]  M. E. El-Hawary, Optimal Power Flow: Solution Techniques, Requirement and Challenges, IEEE Service Center, Piscataway, NJ, USA, 1996.
[3]  A. J. Wood and B. F. Wollenberg, Power Generations, Operations, and Control, John Wiley & Sons, New York, NY, USA, 1996.
[4]  Y.-B. Zhao and D. Li, “Monotonicity of fixed point and normal mappings associated with variational inequality and its application,” SIAM Journal on Optimization, vol. 11, no. 4, pp. 962–973, 2001.
[5]  M. V. Solodov and B. F. Svaiter, “A globally convergent inexact Newton method for systems of monotone equations,” in Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, M. Fukushima and L. Qi, Eds., vol. 22, pp. 355–369, Kluwer Academic Publishers, Dordrecht, The Netherlands, 1999.
[6]  G. Zhou and K. C. Toh, “Superlinear convergence of a Newton-type algorithm for monotone equations,” Journal of Optimization Theory and Applications, vol. 125, no. 1, pp. 205–221, 2005.
[7]  W.-J. Zhou and D.-H. Li, “A globally convergent BFGS method for nonlinear monotone equations without any merit functions,” Mathematics of Computation, vol. 77, no. 264, pp. 2231–2240, 2008.
[8]  W. Zhou and D. Li, “Limited memory BFGS method for nonlinear monotone equations,” Journal of Computational Mathematics, vol. 25, no. 1, pp. 89–96, 2007.
[9]  Y. Xiao and H. Zhu, “A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing,” Journal of Mathematical Analysis and Applications, vol. 405, no. 1, pp. 310–319, 2013.
[10]  W. Cheng, “A PRP type method for systems of monotone equations,” Mathematical and Computer Modelling, vol. 50, no. 1-2, pp. 15–20, 2009.
[11]  Q.-R. Yan, X.-Z. Peng, and D.-H. Li, “A globally convergent derivative-free method for solving large-scale nonlinear monotone equations,” Journal of Computational and Applied Mathematics, vol. 234, no. 3, pp. 649–657, 2010.
[12]  Q. Li and D.-H. Li, “A class of derivative-free methods for large-scale nonlinear monotone equations,” IMA Journal of Numerical Analysis, vol. 31, no. 4, pp. 1625–1635, 2011.
[13]  M. Ahookhosh, K. Amini, and S. Bahrami, “Two derivative-free projection approaches for systems of large-scale nonlinear monotone equations,” Numerical Algorithms, vol. 64, no. 1, pp. 21–42, 2013.
[14]  M. Li, “A LS type method for solving large-scale nonlinear monotone equations,” Numerical Functional Analysis and Optimization, 2013.
[15]  W. Cheng, Y. Xiao, and Q.-J. Hu, “A family of derivative-free conjugate gradient methods for large-scale nonlinear systems of equations,” Journal of Computational and Applied Mathematics, vol. 224, no. 1, pp. 11–19, 2009.
[16]  W. La Cruz and M. Raydan, “Nonmonotone spectral methods for large-scale nonlinear systems,” Optimization Methods & Software, vol. 18, no. 5, pp. 583–599, 2003.
[17]  L. Zhang and W. Zhou, “Spectral gradient projection method for solving nonlinear monotone equations,” Journal of Computational and Applied Mathematics, vol. 196, no. 2, pp. 478–484, 2006.
[18]  Z. Yu, J. Lin, J. Sun, Y. Xiao, L. Liu, and Z. Li, “Spectral gradient projection method for monotone nonlinear equations with convex constraints,” Applied Numerical Mathematics, vol. 59, no. 10, pp. 2416–2423, 2009.
[19]  Y. H. Dai, “Nonlinear conjugate gradient methods,” in Wiley Encyclopedia of Operations Research and Management Science, J. J. Cochran, L. A. Cox Jr, P. Keskinocak, J. P. Kharoufeh, and J. C. Smith, Eds., John Wiley & Sons, 2011.
[20]  W. W. Hager and H. Zhang, “A new conjugate gradient method with guaranteed descent and an efficient line search,” SIAM Journal on Optimization, vol. 16, no. 1, pp. 170–192, 2005.
[21]  W. W. Hager and H. Zhang, “Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent,” ACM Transactions on Mathematical Software, vol. 32, no. 1, pp. 113–137, 2006.
[22]  G. Yu, L. Guan, and W. Chen, “Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization,” Optimization Methods & Software, vol. 23, no. 2, pp. 275–293, 2008.
[23]  W. Cheng and Q. Liu, “Sufficient descent nonlinear conjugate gradient methods with conjugacy condition,” Numerical Algorithms, vol. 53, no. 1, pp. 113–131, 2010.
[24]  E. Polak and G. Ribière, “Note sur la convergence de méthodes de directions conjuguées,” Revue Française d'Informatique et de Recherche Opérationnelle, Série Rouge, vol. 3, no. 16, pp. 35–43, 1969.
[25]  B. T. Polyak, “The conjugate gradient method in extremal problems,” USSR Computational Mathematics and Mathematical Physics, vol. 9, no. 4, pp. 94–112, 1969.
[26]  Y.-H. Dai and C.-X. Kou, “A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search,” SIAM Journal on Optimization, vol. 23, no. 1, pp. 296–320, 2013.
[27]  W. W. Hager and H. Zhang, “The limited memory conjugate gradient method,” SIAM Journal on Optimization, vol. 23, no. 4, pp. 2150–2168, 2013.
[28]  M. R. Hestenes and E. Stiefel, “Methods of conjugate gradients for solving linear systems,” Journal of Research of the National Bureau of Standards, vol. 49, no. 6, pp. 409–436, 1952.
[29]  C. Wang, Y. Wang, and C. Xu, “A projection method for a system of nonlinear monotone equations with convex constraints,” Mathematical Methods of Operations Research, vol. 66, no. 1, pp. 33–46, 2007.
[30]  G. Yu, S. Niu, and J. Ma, “Multivariate spectral gradient projection method for nonlinear monotone equations with convex constraints,” Journal of Industrial and Management Optimization, vol. 9, no. 1, pp. 117–129, 2013.
[31]  N. Yamashita and M. Fukushima, “Modified Newton methods for solving a semismooth reformulation of monotone complementarity problems,” Mathematical Programming, vol. 76, no. 3, pp. 469–491, 1997.
[32]  E. D. Dolan and J. J. Moré, “Benchmarking optimization software with performance profiles,” Mathematical Programming, vol. 91, no. 2, pp. 201–213, 2002.

