OALib Journal
ISSN: 2333-9721

A Spline Smoothing Newton Method for Distance Regression with Bound Constraints

DOI: 10.1155/2013/393482

Abstract:

Orthogonal distance regression is arguably the most common criterion for fitting a model to data with errors in the observations. When angular information is available about the measured data points, however, it is not appropriate to force the distances to be orthogonal. We consider here a natural generalization of a particular formulation of that problem, in which the least squares (l2) norm is replaced by the l∞ norm. This criterion may be more appropriate in the context of accept/reject decisions for manufactured parts. For l∞ distance regression with bound constraints, we give a smoothing Newton method that uses a cubic spline and an aggregate function to smooth the max function. The main spline smoothing technique replaces the max function with a smooth cubic spline, so that only a few components of the max function need to be computed; it thus also acts as an active set technique and is more efficient for problems with large amounts of measured data. Numerical tests in comparison with some other methods show that the new method is very efficient.

1. Introduction

For fitting curves or surfaces to observed or measured data, a common criterion is orthogonal distance regression (ODR). Given the data pairs (x_i, y_i), i = 1, ..., n, where x_i is the independent variable and y_i is the dependent variable, suppose

    y_i ≈ f(x_i; β),  i = 1, ..., n,                                    (1)

where β is a vector of parameters to be determined. We assume that ε_i is the random error associated with y_i and that δ_i is the random error associated with x_i. To be precise, we relate the quantities x_i, y_i, ε_i, and δ_i by

    y_i = f(x_i + δ_i; β) − ε_i,  i = 1, ..., n.                        (2)

As shown in Boggs et al. [1], this gives rise to the ODR problem

    min over β, δ of  Σ_{i=1}^{n} [ (f(x_i + δ_i; β) − y_i)² + δ_i² ].   (3)

The ODR problem can be solved by the Gauss-Newton or Levenberg-Marquardt methods (see [1, 2]). The general form of the bound constrained ODR problem can be expressed as

    min over β, δ of  Σ_{i=1}^{n} [ (f(x_i + δ_i; β) − y_i)² + δ_i² ]
    subject to  l ≤ β ≤ u,                                              (4)

where l and u are vectors of length p that provide the lower and upper bounds on β, respectively. Zwolak et al. give an algorithm to handle (4) in [3].
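To make the ODR formulation concrete, the following is a minimal sketch that solves the unconstrained ODR problem for a straight-line model by treating both β and the shifts δ as unknowns of one nonlinear least squares problem, via scipy.optimize.least_squares. The linear model, the noise levels, and the initial guess are illustrative assumptions; this generic Gauss-Newton-type solve is not the paper's spline smoothing Newton method.

```python
import numpy as np
from scipy.optimize import least_squares

def f(x, beta):
    """Illustrative model: a straight line f(x; beta) = beta0 + beta1 * x."""
    return beta[0] + beta[1] * x

def odr_residuals(z, x, y):
    """Stack the vertical errors f(x_i + delta_i; beta) - y_i with the
    horizontal errors delta_i; least_squares then minimizes their
    squared sum, i.e. sum_i (f(x_i + delta_i; beta) - y_i)^2 + delta_i^2."""
    beta, delta = z[:2], z[2:]
    return np.concatenate([f(x + delta, beta) - y, delta])

rng = np.random.default_rng(0)
n = 50
t = np.linspace(0.0, 1.0, n)
y = 1.0 + 2.0 * t + 0.01 * rng.standard_normal(n)   # errors in y
x = t + 0.01 * rng.standard_normal(n)               # errors in x as well

z0 = np.concatenate([[0.0, 1.0], np.zeros(n)])      # initial beta and delta
sol = least_squares(odr_residuals, z0, args=(x, y))
beta_hat = sol.x[:2]
print(beta_hat)   # should be close to the true parameters (1, 2)
```

Eliminating ε_i via the relation y_i = f(x_i + δ_i; β) − ε_i is what turns the constrained formulation into this plain least squares problem over (β, δ).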
It is not appropriate to force the distances to be orthogonal when angular information is available about the measured data points, as in the rotated cone fitting problem and the rotated paraboloid fitting problem. Then (4) becomes a distance regression problem (5) in rotated coordinates, where Q is an orthogonal matrix and l and u are vectors of length p. When the least squares norm is not appropriate, problem (5) can be generalized to use other measures in a variety of ways. Most generalizations of formulation (5) replace the l2 norms with other norms. We consider here the l∞ norm, which may be more appropriate in the context of accept/reject decisions for manufactured parts (see [4]). In this paper, we consider the following l∞ distance regression problem with bound constraints. Let
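The aggregate function mentioned above is the standard exponential (log-sum-exp) smoothing of the max function; the sketch below, with an assumed smoothing parameter p, shows how it approximates max_j g_j from above with error at most log(m)/p for m components.

```python
import numpy as np

def aggregate_max(g, p):
    """Aggregate (log-sum-exp) smoothing of max_j g_j:
        (1/p) * log( sum_j exp(p * g_j) ).
    Satisfies  max_j g_j <= aggregate_max(g, p) <= max_j g_j + log(m)/p,
    so the approximation tightens as p grows."""
    g = np.asarray(g, dtype=float)
    gmax = g.max()                                  # shift for numerical stability
    return gmax + np.log(np.exp(p * (g - gmax)).sum()) / p

g = [0.3, -1.2, 0.7, 0.69]
for p in (1.0, 10.0, 100.0):
    print(p, aggregate_max(g, p))   # approaches max(g) = 0.7 as p grows
```

Because the smoothed function is C-infinity, Newton-type methods apply directly, which is the motivation for using it (together with the spline technique) in place of the nonsmooth max function.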

References

[1]  P. T. Boggs, R. H. Byrd, and R. B. Schnabel, “A stable and efficient algorithm for nonlinear orthogonal distance regression,” SIAM Journal on Scientific and Statistical Computing, vol. 8, pp. 1052–1078, 1987.
[2]  R. Strebel, D. Sourlier, and W. Gander, “A comparison of orthogonal least squares fitting in coordinate metrology,” in Recent Advances in Total Least Squares and Errors-in-Variables Techniques, S. Van Huffel, Ed., pp. 249–258, SIAM, Philadelphia, Pa, USA, 1997.
[3]  J. W. Zwolak, P. T. Boggs, and L. T. Watson, “Algorithm 869: ODRPACK95: a weighted orthogonal distance regression code with bound constraints,” ACM Transactions on Mathematical Software, vol. 33, no. 4, Article ID 1268782, 2007.
[4]  I. Al-Subaihi and G. A. Watson, “Fitting parametric curves and surfaces by l∞ distance regression,” BIT Numerical Mathematics, vol. 45, no. 3, pp. 443–461, 2005.
[5]  N. Z. Shor, Minimization Methods for Non-Differentiable Functions, Springer, New York, NY, USA, 1985.
[6]  W. Murray and M. L. Overton, “A projected Lagrangian algorithm for nonlinear minimax optimization,” SIAM Journal on Scientific and Statistical Computing, vol. 1, pp. 345–370, 1980.
[7]  R. S. Womersley and R. Fletcher, “An algorithm for composite nonsmooth optimization problems,” Journal of Optimization Theory and Applications, vol. 48, no. 3, pp. 493–523, 1986.
[8]  J. L. Zhou and A. L. Tits, “An SQP algorithm for finely discretized continuous minimax problems and other minimax problems with many objective functions,” SIAM Journal on Optimization, vol. 6, no. 2, pp. 461–487, 1996.
[9]  A. Frangioni, “Generalized bundle methods,” SIAM Journal on Optimization, vol. 13, no. 1, pp. 117–156, 2003.
[10]  M. Gaudioso and M. F. Monaco, “A bundle type approach to the unconstrained minimization of convex nonsmooth functions,” Mathematical Programming, vol. 23, no. 2, pp. 216–226, 1982.
[11]  K. C. Kiwiel, Methods of Descent for Nondifferentiable Optimization, vol. 1133 of Lecture Notes in Mathematics, Springer, Berlin, Germany, 1985.
[12]  J. Zowe, “Nondifferentiable optimization: a motivation and a short introduction into the sub-gradient and the bundle concept,” in Computational Mathematical Programming, K. Schittkowski, Ed., vol. 15 of NATO ASI Series, pp. 323–356, Springer, New York, NY, USA, 1985.
[13]  J. W. Bandler and C. Charalambous, “Practical least pth optimization of networks,” IEEE Transactions on Microwave Theory and Techniques, vol. 20, no. 12, pp. 834–840, 1972.
[14]  C. Charalambous, “Acceleration of the least pth algorithm for minimax optimization with engineering applications,” Mathematical Programming, vol. 17, no. 1, pp. 270–297, 1979.
[15]  C. Gígola and S. Gomez, “A regularization method for solving the finite convex min-max problem,” SIAM Journal on Numerical Analysis, vol. 27, pp. 1621–1634, 1990.
[16]  D. Q. Mayne and E. Polak, “Nondifferentiable optimization via adaptive smoothing,” Journal of Optimization Theory and Applications, vol. 43, no. 4, pp. 601–613, 1984.
[17]  E. Polak, J. O. Royset, and R. S. Womersley, “Algorithms with adaptive smoothing for finite minimax problems,” Journal of Optimization Theory and Applications, vol. 119, no. 3, pp. 459–484, 2003.
[18]  R. A. Polyak, “Smooth optimization methods for minimax problems,” SIAM Journal on Control and Optimization, vol. 26, no. 6, pp. 1274–1286, 1988.
[19]  S. Xu, “Smoothing method for minimax problems,” Computational Optimization and Applications, vol. 20, no. 3, pp. 267–279, 2001.
[20]  I. Zang, “A smoothing-out technique for min-max optimization,” Mathematical Programming, vol. 19, pp. 61–77, 1980.
[21]  Y. Xiao and B. Yu, “A truncated aggregate smoothing Newton method for minimax problems,” Applied Mathematics and Computation, vol. 216, no. 6, pp. 1868–1879, 2010.
[22]  G. H. Zhao, Z. R. Wang, and H. N. Mou, “Uniform approximation of min/max functions by smooth splines,” Journal of Computational and Applied Mathematics, vol. 236, pp. 699–703, 2011.
[23]  L. Dong and B. Yu, “A spline smoothing Newton method for finite minimax problems,” preprint.
[24]  E. Polak, Optimization: Algorithms and Consistent Approximations, Springer, New York, NY, USA, 1997.
[25]  L. Dong, B. Yu, and G. H. Zhao, “A smoothing spline homotopy method for nonconvex nonlinear programming,” preprint.
[26]  B. Yu, G. C. Feng, and S. L. Zhang, “The aggregate constraint homotopy method for nonconvex nonlinear programming,” Nonlinear Analysis: Theory, Methods and Applications, vol. 45, no. 7, pp. 839–847, 2001.
[27]  Y. Xiao, B. Yu, and D. L. Wang, “Truncated smoothing newton method for fitting rotated cones,” Journal of Mathematical Research and Exposition, vol. 30, pp. 159–166, 2010.
[28]  H. Späth, “Least squares fitting with rotated paraboloids,” Mathematical Communications, vol. 6, no. 2, pp. 173–179, 2001.
