A Mixed Line Search Smoothing Quasi-Newton Method for Solving Linear Second-Order Cone Programming Problem

DOI: 10.1155/2013/230717

Abstract: Firstly, we give the Karush-Kuhn-Tucker (KKT) optimality conditions of the primal problem and briefly introduce the Jordan algebra. On the basis of the Jordan algebra, we extend the smoothing Fischer-Burmeister (F-B) function to the Jordan algebra and smooth the complementarity condition, so that the first-order optimality conditions can be reformulated as a nonlinear system. Secondly, we use a mixed line search quasi-Newton method to solve this nonlinear system. Finally, we prove the global and locally superlinear convergence of the algorithm.

1. Introduction

Linear second-order cone programming (SOCP) problems are convex optimization problems which minimize a linear function over the intersection of an affine linear manifold with the Cartesian product of second-order cones. Linear programming (LP), linear second-order cone programming (SOCP), and semidefinite programming (SDP) all belong to symmetric cone programming. LP is a special case of SOCP, and SOCP is a special case of SDP. SOCP can be solved by the corresponding algorithms for SDP, but SOCP also has effective solution methods of its own. Nesterov and Todd [1, 2] carried out early research on primal-dual interior point methods. Recently, solution methods for SOCP have developed quickly, and many scholars have concentrated on SOCP.

The primal and dual standard forms of the linear SOCP are given by
\[
(\mathrm{P})\quad \min\ c^{T}x \quad \text{s.t.}\ Ax=b,\ x\in K, \qquad
(\mathrm{D})\quad \max\ b^{T}y \quad \text{s.t.}\ A^{T}y+s=c,\ s\in K,
\]
where $K = K^{n_1}\times K^{n_2}\times\cdots\times K^{n_r}$ and each second-order cone is
\[
K^{n_i} = \{\, x^{i}=(x^{i}_{1};\bar{x}^{i})\in\mathbb{R}\times\mathbb{R}^{n_i-1} : \|\bar{x}^{i}\|\le x^{i}_{1} \,\},
\]
where $\|\cdot\|$ refers to the standard Euclidean norm. In this paper, the vectors $x$, $c$, and $s$ and the matrix $A$ are partitioned conformally, namely
\[
x=(x^{1};\ldots;x^{r}),\quad c=(c^{1};\ldots;c^{r}),\quad s=(s^{1};\ldots;s^{r}),\quad A=(A_{1},\ldots,A_{r}),
\]
with $x^{i},c^{i},s^{i}\in\mathbb{R}^{n_i}$ and $A_{i}\in\mathbb{R}^{m\times n_i}$.

Besides the interior point method, semismooth and smoothing Newton methods can also be used to solve SOCP. In [3], the Karush-Kuhn-Tucker (KKT) optimality conditions of the primal-dual problem were reformulated as a semismooth nonlinear system, which was solved by a Newton method combined with the central path. In [4], the KKT optimality conditions of the primal-dual problem were reformulated as a smooth system of nonlinear equations, which was then solved by combining a Newton method with the central path. References [3, 4] proved global and locally quadratic convergence of their algorithms.

2. Preliminaries and Algorithm

In this section, we introduce the Jordan algebra and give the nonlinear system that comes from the Karush-Kuhn-Tucker (KKT) optimality conditions. At last, we introduce two kinds of derivative-free line search rules.

Associated with each vector $x=(x_{1};\bar{x})\in\mathbb{R}\times\mathbb{R}^{n-1}$, there is an arrow-shaped matrix $\mathrm{Arw}(x)$, which is defined as follows:
\[
\mathrm{Arw}(x)=\begin{pmatrix} x_{1} & \bar{x}^{T} \\ \bar{x} & x_{1}I \end{pmatrix}.
\]
The Euclidean Jordan algebra is associated with second-order cones. For now we assume that all vectors consist of a single block ($r=1$). For two vectors $x=(x_{1};\bar{x})$ and $y=(y_{1};\bar{y})$, define the following multiplication:
\[
x\circ y=\big(x^{T}y;\ x_{1}\bar{y}+y_{1}\bar{x}\big).
\]
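To make the Jordan-algebra objects above concrete, the following is a minimal Python sketch of the single-block operations (arrow-shaped matrix, Jordan product, spectral square root) together with a smoothed Fischer-Burmeister function of the form $\phi_{\mu}(x,s)=x+s-(x\circ x+s\circ s+2\mu^{2}e)^{1/2}$, which is the usual second-order-cone extension of the F-B function; the exact smoothing function used in the paper may differ, and the names below (arw, jordan_prod, soc_sqrt, phi_mu) are illustrative only.

```python
import numpy as np

def arw(x):
    """Arrow-shaped matrix Arw(x) for a single-block vector x = (x1; xbar)."""
    x1, xbar = x[0], x[1:]
    A = x1 * np.eye(x.size)
    A[0, 1:] = xbar
    A[1:, 0] = xbar
    return A

def jordan_prod(x, y):
    """Jordan product x ∘ y = (x^T y; x1*ybar + y1*xbar)."""
    return np.concatenate(([x @ y], x[0] * y[1:] + y[0] * x[1:]))

def soc_sqrt(z, eps=1e-14):
    """Square root of z w.r.t. the Jordan product, via the spectral decomposition
    z = lam1*c1 + lam2*c2; requires z in the second-order cone (lam1 >= 0)."""
    z1, zbar = z[0], z[1:]
    nz = np.linalg.norm(zbar)
    # When zbar = 0 the eigenvalues coincide and the direction u is irrelevant.
    u = zbar / nz if nz > eps else np.zeros_like(zbar)
    lam1, lam2 = z1 - nz, z1 + nz               # spectral values
    c1 = 0.5 * np.concatenate(([1.0], -u))      # spectral vectors
    c2 = 0.5 * np.concatenate(([1.0],  u))
    return np.sqrt(max(lam1, 0.0)) * c1 + np.sqrt(lam2) * c2

def phi_mu(x, s, mu):
    """Smoothed F-B function on one SOC block:
    phi_mu(x, s) = x + s - sqrt(x∘x + s∘s + 2*mu^2*e), with identity e = (1; 0)."""
    e = np.zeros_like(x)
    e[0] = 1.0
    w = jordan_prod(x, x) + jordan_prod(s, s) + 2.0 * mu**2 * e
    return x + s - soc_sqrt(w)

# Example on K^3: x and s lie on the boundary of the cone and are complementary.
x = np.array([1.0,  1.0, 0.0])
s = np.array([1.0, -1.0, 0.0])
print(np.allclose(arw(x) @ s, jordan_prod(x, s)))  # True: Arw(x) s = x ∘ s
print(phi_mu(x, s, mu=0.0))                        # ~0: complementarity holds
print(phi_mu(x, s, mu=1e-2))                       # small shift from the smoothing parameter
```

For $\mu>0$ the argument of the square root lies in the interior of the cone, so $\phi_{\mu}$ is smooth, and as $\mu\to 0$ it recovers the nonsmooth SOC F-B function whose zeros characterize the complementarity condition $x\in K$, $s\in K$, $x\circ s=0$.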