Search Results: 1 - 10 of 100 matches
All listed articles are free for downloading (OA Articles)
Page 1 /100
Preliminary verification of online optimization of the luminosity of BEPCII by using the robust conjugate direction search method  [PDF]
Hong-Fei Ji, Yi Jiao, Sheng Wang, Da-Heng Ji, Cheng-Hui Yu, Yuan Zhang, Xiao-Biao Huang
Physics , 2015,
Abstract: The robust conjugate direction search (RCDS) method has high tolerance to noise in beam experiments, and it is an efficient experimental technique for online optimization with multi-dimensional variables. In our study, this method is applied to BEPCII, an electron-positron collider, to optimize the luminosity, which can be treated as a function of multiple variables. Several variables are considered, including the horizontal displacement, horizontal angular deviation, vertical displacement, and vertical angular deviation at the interaction point. To verify the feasibility and practicability of online optimization at the collider, the objective function, optimization time, and experimental applications require careful consideration. Results from numerical simulation and online optimization experiments with RCDS are presented. The effectiveness of this method for online optimization at a collider is preliminarily verified.
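For intuition, the RCDS pattern (cycling line minimizations over a direction set updated Powell-style, with each line scan fit through several noisy samples) can be sketched as follows. This is a simplified illustration, not the authors' implementation: the parabolic fit stands in for RCDS's noise-robust bracketing, maximizing luminosity corresponds here to minimizing a noisy cost, and all function and parameter names are invented for the example.

```python
import numpy as np

def parabola_line_search(f, x, d, step, n_samples=7):
    """Sample the noisy objective along x + t*d and fit a parabola;
    fitting several samples gives some tolerance to noise."""
    ts = np.linspace(-step, step, n_samples)
    ys = np.array([f(x + t * d) for t in ts])
    a, b, _ = np.polyfit(ts, ys, 2)      # coefficients, highest power first
    if a <= 0:                           # no usable curvature: take best sample
        t_best = ts[np.argmin(ys)]
    else:
        t_best = np.clip(-b / (2 * a), -step, step)
    return x + t_best * d

def conjugate_direction_search(f, x0, n_iters=30, step=0.5):
    """Powell-style conjugate direction search for noisy objectives."""
    x = np.array(x0, dtype=float)
    n = len(x)
    dirs = list(np.eye(n))               # start from coordinate directions
    for k in range(n_iters):
        if k % (2 * n) == 0:             # periodic reset avoids degenerate sets
            dirs = list(np.eye(n))
        x_start = x.copy()
        for d in dirs:
            x = parabola_line_search(f, x, d, step)
        new_dir = x - x_start            # Powell update: replace oldest direction
        if np.linalg.norm(new_dir) > 1e-12:
            dirs.pop(0)
            dirs.append(new_dir / np.linalg.norm(new_dir))
    return x
```

In the experiment the objective would be the measured luminosity as a function of the interaction-point orbit parameters; any noisy function of a few variables plays that role here.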
An Implicit Smooth Conjugate Projection Gradient Algorithm for Optimization with Nonlinear Complementarity Constraints  [PDF]
Cong Zhang, Limin Sun, Zhibin Zhu, Minglei Fang
Applied Mathematics (AM) , 2015, DOI: 10.4236/am.2015.610152
Abstract: This paper discusses a special class of mathematical programs with equilibrium constraints. First, by using a generalized complementarity function, the discussed problem is transformed into a family of general nonlinear optimization problems containing an additional variable μ. Furthermore, combining this with the idea of a penalty function, an auxiliary problem with inequality constraints is presented. Then, by providing an explicit search direction, we establish a new conjugate projection gradient method for optimization with nonlinear complementarity constraints. Under suitable conditions, the proposed method is proved to be globally convergent with a superlinear convergence rate.
A conjugate direction method for approximating the analytic center of a polytope
Kojima Masakazu, Megiddo Nimrod, Mizuno Shinji
Journal of Inequalities and Applications , 1998,
Abstract: The analytic center of an n-dimensional polytope with a nonempty interior is defined as the unique minimizer of the logarithmic potential function over the interior of the polytope. It is shown that one cycle of a conjugate direction method, applied to the potential function at any interior point sufficiently close to the analytic center, generates a point that is closer still.
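Concretely, for a polytope {x : Ax ≤ b} the analytic center minimizes the log-barrier potential -Σ_i log(b_i - a_iᵀx). The paper analyzes a conjugate direction cycle; as a point of reference, a damped Newton iteration (a standard alternative, with illustrative names, not the paper's method) computes the same point:

```python
import numpy as np

def analytic_center(A, b, x0, n_iters=50):
    """Damped Newton iteration on the barrier -sum(log(b - A x)).
    x0 must be strictly interior: b - A @ x0 > 0 componentwise."""
    x = np.array(x0, dtype=float)
    for _ in range(n_iters):
        s = b - A @ x                          # slacks, must stay positive
        grad = A.T @ (1.0 / s)                 # gradient of the barrier
        H = A.T @ np.diag(1.0 / s**2) @ A      # Hessian of the barrier
        dx = np.linalg.solve(H, -grad)
        t = 1.0
        while np.any(b - A @ (x + t * dx) <= 0):   # damp to stay interior
            t *= 0.5
        x = x + t * dx
    return x
```

For the unit box 0 ≤ x ≤ 1 in two dimensions, this returns the geometric center (0.5, 0.5), as symmetry suggests.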
Modified nonlinear conjugate gradient method with sufficient descent condition for unconstrained optimization  [cached]
Liu Jinkui, Wang Shaoheng
Journal of Inequalities and Applications , 2011,
Abstract: In this paper, an efficient modified nonlinear conjugate gradient method for solving unconstrained optimization problems is proposed. An attractive property of the modified method is that the direction generated at each step is always a descent direction, independent of the line search. The global convergence of the modified method is established under the general Wolfe line search condition. Numerical results on the unconstrained test problems of Moré and Garbow (ACM Trans Math Softw 7, 17-41, 1981) show that the modified method is efficient and stable compared with the well-known Polak-Ribière-Polyak method, the CG-DESCENT method, and the DSP-CG method, so it can be widely used in scientific computation. Mathematics Subject Classification (2010) 90C26 · 65H10
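The pattern described, a conjugate gradient direction with a built-in descent safeguard, can be sketched generically as follows. This uses a PRP+ choice of β with restarts and simple Armijo backtracking as illustrative stand-ins for the paper's modified formula and Wolfe line search; the names are invented for the example.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, n_iters=200):
    """Nonlinear CG with a PRP+ beta, a descent safeguard (restart when
    the direction is not descending), and Armijo backtracking."""
    x = np.array(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(n_iters):
        if g @ d > -1e-12 * (d @ d):         # not a descent direction: restart
            d = -g
        t, fx = 1.0, f(x)                    # Armijo backtracking line search
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-14:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PRP+ coefficient
        d = -g_new + beta * d
        x, g = x_new, g_new
        if np.linalg.norm(g) < 1e-8:
            break
    return x
```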
Steepest Descent Preconditioning for Nonlinear GMRES Optimization  [PDF]
Hans De Sterck
Mathematics , 2011,
Abstract: Steepest descent preconditioning is considered for the recently proposed nonlinear generalized minimal residual (N-GMRES) optimization algorithm for unconstrained nonlinear optimization. Two steepest descent preconditioning variants are proposed. The first employs a line search, while the second employs a predefined small step. A simple global convergence proof is provided for the N-GMRES optimization algorithm with the first steepest descent preconditioner (with line search), under mild standard conditions on the objective function and the line search processes. Steepest descent preconditioning for N-GMRES optimization is also motivated by relating it to standard non-preconditioned GMRES for linear systems in the case of a quadratic optimization problem with symmetric positive definite operator. Numerical tests on a variety of model problems show that the N-GMRES optimization algorithm is able to very significantly accelerate convergence of stand-alone steepest descent optimization. Moreover, performance of steepest-descent preconditioned N-GMRES is shown to be competitive with standard nonlinear conjugate gradient and limited-memory Broyden-Fletcher-Goldfarb-Shanno methods for the model problems considered. These results serve to theoretically and numerically establish steepest-descent preconditioned N-GMRES as a general optimization method for unconstrained nonlinear optimization, with performance that appears promising compared to established techniques. In addition, it is argued that the real potential of the N-GMRES optimization framework lies in the fact that it can make use of problem-dependent nonlinear preconditioners that are more powerful than steepest descent (or, equivalently, N-GMRES can be used as a simple wrapper around any other iterative optimization process to seek acceleration of that process), and this potential is illustrated with a further application example.
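A much-simplified sketch of the scheme on a quadratic f(x) = ½xᵀQx - cᵀx, where the gradient is linear so the least-squares acceleration step is exact: the preconditioner takes one predefined small steepest descent step, and the N-GMRES step then combines past iterates to minimize the linearized residual. Full-history storage and the fixed step are simplifications (the paper also treats a line-search preconditioner and windowed histories); names are illustrative.

```python
import numpy as np

def ngmres_quadratic(Q, c, x0, n_iters=50, sd_step=0.01):
    """N-GMRES-style acceleration on f(x) = 0.5 x^T Q x - c^T x, whose
    gradient g(x) = Q x - c is linear, so the residual of the combined
    iterate is exactly a linear combination of stored gradients."""
    x = np.array(x0, dtype=float)
    xs, gs = [x.copy()], [Q @ x - c]
    for _ in range(n_iters):
        xb = x - sd_step * (Q @ x - c)       # preconditioner: small SD step
        gb = Q @ xb - c
        # Choose alpha minimizing || gb + sum_i alpha_i (gb - g_i) ||_2,
        # the residual of x_new = xb + sum_i alpha_i (xb - x_i).
        M = np.column_stack([gb - gi for gi in gs])
        alpha, *_ = np.linalg.lstsq(M, -gb, rcond=None)
        x = xb + np.column_stack([xb - xi for xi in xs]) @ alpha
        g = Q @ x - c
        xs.append(x.copy()); gs.append(g.copy())
        if np.linalg.norm(g) < 1e-10:
            break
    return x
```

On this linear-gradient problem the acceleration behaves like GMRES itself, which is exactly the motivation the abstract describes.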
An improved BFGS search direction using exact line search for solving unconstrained optimization problems
A. Z. M. Sofi, M. Mamat, I. Mohd
Applied Mathematical Sciences , 2013,
Abstract: BFGS is one of the Hessian update formulas in the well-known quasi-Newton method. In this paper, we introduce a parametric hybrid search direction for the BFGS algorithm using the conjugate gradient minimization technique. Under suitable conditions on the parameter, we prove that the proposed parametric hybrid search direction is globally convergent when the exact line search is used. At the end of this paper, we present numerical results for the proposed BFGS algorithm in terms of the number of iterations, the number of function evaluations, and the CPU time on several unconstrained optimization test problems.
A Reduced Preconditioned Conjugate Gradient Path Method for Linear Equality Constrained Optimization
Lin Tao, Zhu Detong
系统科学与数学 (Journal of Systems Science and Mathematical Sciences) , 2007,
Abstract: A reduced preconditioned conjugate gradient path method with a nonmonotone technique for the linear equality constrained optimization problem is proposed. By using the generalized elimination method, the subproblem is transformed into an equivalent unconstrained optimization problem in the null space of the constraint matrix. We develop preconditioners based on an extended system. By employing the reduced preconditioned conjugate gradient path search strategy, we obtain the iterative direction, as well as the iterative step, by solving the quadratic model. Based on the good properties of the conjugate gradient path, the global convergence of the proposed algorithm is proved, and a fast local superlinear convergence rate is established under reasonable conditions. Furthermore, numerical results indicate that the algorithm is feasible and effective.
A Conjugate Gradient Method for Unconstrained Optimization Problems  [PDF]
Gonglin Yuan
International Journal of Mathematics and Mathematical Sciences , 2009, DOI: 10.1155/2009/329623
Abstract: A hybrid method combining the FR conjugate gradient method and the WYL conjugate gradient method is proposed for unconstrained optimization problems. The presented method possesses the sufficient descent property under the strong Wolfe-Powell (SWP) line search rule with a relaxed restriction on the parameter (< 1). Under suitable conditions, global convergence is established for nonconvex functions with both the SWP line search rule and the weak Wolfe-Powell (WWP) line search rule. Numerical results show that this method is better than the FR method and the WYL method.
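For reference, the two coefficients being hybridized are β^FR = ‖g_k‖²/‖g_{k-1}‖² and, as commonly stated, β^WYL = g_kᵀ(g_k - (‖g_k‖/‖g_{k-1}‖)g_{k-1})/‖g_{k-1}‖². The sketch below combines them by a simple truncated minimum and uses Armijo backtracking; this is an illustrative stand-in, not the paper's exact hybridization rule or its Wolfe-Powell line searches.

```python
import numpy as np

def beta_fr(g_new, g_old):
    """Fletcher-Reeves coefficient."""
    return (g_new @ g_new) / (g_old @ g_old)

def beta_wyl(g_new, g_old):
    """Wei-Yao-Liu coefficient, as commonly stated."""
    r = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return g_new @ (g_new - r * g_old) / (g_old @ g_old)

def hybrid_cg(f, grad, x0, n_iters=300):
    """Illustrative hybrid CG: beta = max(0, min(beta_WYL, beta_FR))."""
    x = np.array(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(n_iters):
        if g @ d >= 0:                       # safeguard: restart on non-descent
            d = -g
        t, fx = 1.0, f(x)                    # Armijo backtracking
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-14:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < 1e-8:
            return x_new
        beta = max(0.0, min(beta_wyl(g_new, g), beta_fr(g_new, g)))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```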
On the connection between the conjugate gradient method and quasi-Newton methods on quadratic problems  [PDF]
Anders Forsgren, Tove Odland
Mathematics , 2014, DOI: 10.1007/s10589-014-9677-5
Abstract: It is well known that the conjugate gradient method and a quasi-Newton method, using any well-defined update matrix from the one-parameter Broyden family of updates, produce identical iterates on a quadratic problem with positive-definite Hessian. This equivalence does not, however, hold for an arbitrary quasi-Newton method. We define precisely the conditions on the update matrix in the quasi-Newton method that give rise to this behavior. We show that the crucial facts are that the range of each update matrix lies in the last two dimensions of the Krylov subspaces defined by the conjugate gradient method, and that the quasi-Newton condition is satisfied. In a framework based on a sufficient condition for obtaining mutually conjugate search directions, we show that the one-parameter Broyden family is complete. We derive a one-to-one correspondence between the Broyden parameter and the non-zero scaling of the search direction obtained from the corresponding quasi-Newton method relative to the one obtained in the conjugate gradient method. In addition, we show that the update matrices from the one-parameter Broyden family are almost always well-defined on a quadratic problem with positive-definite Hessian. The only exception is when the symmetric rank-one update is used and the unit steplength is taken in the same iteration; in this case it is the Broyden parameter that becomes undefined.
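The stated equivalence is easy to verify numerically: on a positive-definite quadratic, BFGS started from H₀ = I with exact line search reproduces the conjugate gradient iterates. A minimal sketch (BFGS standing in for the Broyden family; names illustrative):

```python
import numpy as np

def cg_iterates(Q, c, x0, k):
    """First k conjugate gradient iterates for f(x) = 0.5 x^T Q x - c^T x."""
    x = np.array(x0, dtype=float)
    g = Q @ x - c
    d = -g
    xs = [x.copy()]
    for _ in range(k):
        alpha = (g @ g) / (d @ Q @ d)        # exact step along d
        x = x + alpha * d
        g_new = Q @ x - c
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves (exact here)
        d = -g_new + beta * d
        g = g_new
        xs.append(x.copy())
    return xs

def bfgs_iterates(Q, c, x0, k):
    """First k BFGS iterates (H0 = I, exact line search) on the same quadratic."""
    n = len(x0)
    H = np.eye(n)                            # inverse-Hessian approximation
    x = np.array(x0, dtype=float)
    g = Q @ x - c
    xs = [x.copy()]
    for _ in range(k):
        d = -H @ g
        alpha = -(g @ d) / (d @ Q @ d)       # exact step for a quadratic
        s = alpha * d
        x = x + s
        g_new = Q @ x - c
        y = g_new - g
        rho = 1.0 / (y @ s)
        I = np.eye(n)
        H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
            + rho * np.outer(s, s)
        g = g_new
        xs.append(x.copy())
    return xs
```

Running both from the same starting point on a random positive-definite quadratic gives iterate sequences that agree to rounding error, as the abstract asserts.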

Bao Jifeng, Zhu Detong
计算数学 (Mathematica Numerica Sinica) , 2009,
Abstract: In this paper, we propose a new affine scaling interior discrete conjugate gradient path approach for solving bound constrained nonlinear optimization. We obtain the iterative direction by solving the quadratic model via a constructed preconditioned conjugate gradient path. By combining this with an interior backtracking line search, we obtain the next iterate. Global convergence and a local superlinear convergence rate of the proposed algorithm are established under reasonable conditions. Finally, we present some numerical resul...

Copyright © 2008-2017 Open Access Library. All rights reserved.