Mathematics 2012
Detecting and correcting the loss of independence in nonlinear conjugate gradient

Abstract: It is well known that search directions in nonlinear conjugate gradient (CG) can sometimes become nearly dependent, causing a dramatic slow-down in the convergence rate. We provide a theoretical analysis of this loss of independence. The analysis applies to the case of a strictly convex objective function and is motivated by earlier work of Nemirovsky and Yudin. Loss of independence can affect several of the well-known variants of nonlinear CG, including Fletcher-Reeves, Polak-Ribière (nonnegative variant), and Hager-Zhang. Based on our analysis, we propose a relatively inexpensive computational test for detecting loss of independence. We also propose a method for correcting it when it is detected, which we call "subspace optimization." Although the correction method is somewhat expensive, our experiments show that in some cases, usually the most ill-conditioned ones, it yields a method much faster than any of these three variants. Even though our theory covers only strongly convex objective functions, we provide computational results indicating that the detection and correction mechanisms may also hold promise for nonconvex optimization.
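For readers unfamiliar with the setting, the sketch below shows a plain nonlinear CG loop with the Fletcher-Reeves update, together with a simple cosine-based check that flags when consecutive search directions become nearly parallel. This is only an illustration under stated assumptions: the threshold `dep_tol`, the backtracking line search, and the steepest-descent restart are placeholders of my own choosing, not the detection test or the "subspace optimization" correction proposed in the paper.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, max_iter=500, tol=1e-8, dep_tol=0.999):
    """Nonlinear CG (Fletcher-Reeves beta) with an illustrative check for
    near-dependence of consecutive search directions."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple Armijo backtracking line search (illustrative, not the paper's).
        alpha, fx, slope = 1.0, f(x), g.dot(d)
        while alpha > 1e-16 and f(x + alpha * d) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)          # Fletcher-Reeves formula
        d_new = -g_new + beta * d
        # Illustrative dependence check: if the new direction is almost parallel
        # to the previous one, or is not a descent direction, restart with
        # steepest descent (a crude stand-in for a real correction step).
        cos = abs(d_new.dot(d)) / (np.linalg.norm(d_new) * np.linalg.norm(d) + 1e-300)
        if cos > dep_tol or g_new.dot(d_new) >= 0.0:
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

# Example on an ill-conditioned convex quadratic (hypothetical test problem).
A = np.diag(np.linspace(1.0, 1e4, 50))
f = lambda x: 0.5 * x.dot(A @ x)
grad = lambda x: A @ x
x_min = fletcher_reeves_cg(f, grad, np.ones(50))
```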