Three New Optimal Fourth-Order Iterative Methods to Solve Nonlinear Equations

DOI: 10.1155/2013/957496


Abstract:

We present new modifications of Newton's method for solving nonlinear equations. The convergence analysis shows that these methods have fourth-order convergence. Each of the three methods uses three functional evaluations per iteration; thus, according to Kung-Traub's conjecture, they are optimal methods. Building on these ideas, we extend the analysis to functions with multiple roots. Several numerical examples are given to illustrate that the presented methods perform better than Newton's classical method and other recently published fourth-order methods.

1. Introduction

One of the most important problems in numerical analysis is solving nonlinear equations. To solve these equations, we can use iterative methods such as Newton's method and its variants. Newton's classical method for a single nonlinear equation $f(x) = 0$, where $\alpha$ is a simple root, is written as
$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}, \quad n = 0, 1, 2, \ldots,$$
which converges quadratically in some neighborhood of $\alpha$. Taking Newton's method as a starting point, many modifications of it have recently been published. In [1], Noor and Khan presented an optimal fourth-order method that uses three functional evaluations. In [2], Cordero et al. proposed an optimal fourth-order method that also uses three functional evaluations. In [3], Chun presented a third-order iterative formula, built from an arbitrary second-order iterative function, that uses three functional evaluations. In [4], Li et al. presented a fifth-order iterative formula that uses five functional evaluations.

The main goal and motivation in developing new methods is to obtain better computational efficiency; in other words, it is advantageous to attain the highest possible order of convergence with a fixed number of functional evaluations per iteration. In the case of multipoint methods without memory, this demand is closely connected with the optimal order considered in Kung-Traub's conjecture.

Kung-Traub's Conjecture (see [5]). Multipoint iterative methods (without memory) requiring $d$ functional evaluations per iteration have order of convergence at most $2^{d-1}$.

Multipoint methods which satisfy Kung-Traub's conjecture (still unproved) are usually called optimal methods; consequently, $2^{d-1}$ is the optimal order. The computational efficiency of an iterative method of order $p$, requiring $d$ functional evaluations per iteration, is most frequently measured by the Ostrowski-Traub efficiency index [6], $E = p^{1/d}$.

In the case of multiple roots, the quadratically convergent modified Newton's method [7] is
$$x_{n+1} = x_n - m\,\frac{f(x_n)}{f'(x_n)},$$
where $m$ is the multiplicity of the root. For this case, there are several methods
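As a complement to the discussion above, the following minimal Python sketch illustrates the classical Newton iteration, the modified Newton iteration for a root of known multiplicity $m$, and the Ostrowski-Traub efficiency index $E = p^{1/d}$. It is not an implementation of the paper's new fourth-order methods (their formulas are not reproduced above); the test functions, starting points, tolerances, and iteration limits are illustrative assumptions only.

# Minimal sketch: classical Newton, modified Newton for a multiple root,
# and the Ostrowski-Traub efficiency index. Illustrative only; not the
# paper's new fourth-order methods.

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Classical Newton's method: x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x = x - fx / df(x)
    return x

def modified_newton(f, df, x0, m, tol=1e-12, max_iter=50):
    """Modified Newton for a root of multiplicity m:
    x_{n+1} = x_n - m*f(x_n)/f'(x_n) (quadratically convergent)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x = x - m * fx / df(x)
    return x

def efficiency_index(p, d):
    """Ostrowski-Traub efficiency index E = p**(1/d), where p is the
    convergence order and d the functional evaluations per iteration."""
    return p ** (1.0 / d)

if __name__ == "__main__":
    # Simple root: f(x) = x**3 - 2, root 2**(1/3) (illustrative example).
    f = lambda x: x**3 - 2
    df = lambda x: 3 * x**2
    print("simple root    :", newton(f, df, 1.5))

    # Double root: g(x) = (x - 1)**2, multiplicity m = 2.
    g = lambda x: (x - 1) ** 2
    dg = lambda x: 2 * (x - 1)
    print("multiple root  :", modified_newton(g, dg, 2.0, m=2))

    # Newton (order 2, 2 evaluations) vs. an optimal fourth-order method
    # (order 4, 3 evaluations), as discussed in the introduction.
    print("E(Newton)      :", efficiency_index(2, 2))   # ~1.414
    print("E(4th, 3 eval.):", efficiency_index(4, 3))   # ~1.587

Comparing $E = 2^{1/2} \approx 1.414$ for Newton's method with $E = 4^{1/3} \approx 1.587$ for an optimal fourth-order method using three functional evaluations illustrates the efficiency gain that motivates the methods proposed in this paper.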

References

[1]  M. A. Noor and W. A. Khan, “Fourth-order iterative method free from second derivative for solving nonlinear equations,” Applied Mathematical Sciences, vol. 6, no. 93–96, pp. 4617–4625, 2012.
[2]  A. Cordero, J. L. Hueso, E. Martínez, and J. R. Torregrosa, “New modifications of Potra-Pták's method with optimal fourth and eighth orders of convergence,” Journal of Computational and Applied Mathematics, vol. 234, no. 10, pp. 2969–2976, 2010.
[3]  C. Chun, “A geometric construction of iterative formulas of order three,” Applied Mathematics Letters, vol. 23, no. 5, pp. 512–516, 2010.
[4]  Z. Li, C. Peng, T. Zhou, and J. Gao, “A new Newton-type method for solving nonlinear equations with any integer order of convergence,” Journal of Computational Information Systems, vol. 7, no. 7, pp. 2371–2378, 2011.
[5]  H. T. Kung and J. F. Traub, “Optimal order of one-point and multipoint iteration,” Journal of the Association for Computing Machinery, vol. 21, pp. 643–651, 1974.
[6]  A. M. Ostrowski, Solution of Equations and Systems of Equations, Academic Press, New York, NY, USA, 1966.
[7]  A. Ralston and P. Rabinowitz, A First Course in Numerical Analysis, McGraw-Hill, 1978.
[8]  E. Halley, “A new, exact and easy method of finding the roots of equations generally and that without any previous reduction,” Philosophical Transactions of the Royal Society of London, vol. 18, pp. 136–148, 1694.
[9]  E. Hansen and M. Patrick, “A family of root finding methods,” Numerische Mathematik, vol. 27, no. 3, pp. 257–269, 1977.
[10]  N. Osada, “An optimal multiple root-finding method of order three,” Journal of Computational and Applied Mathematics, vol. 51, no. 1, pp. 131–133, 1994.
[11]  H. D. Victory and B. Neta, “A higher order method for multiple zeros of nonlinear functions,” International Journal of Computer Mathematics, vol. 12, no. 3-4, pp. 329–335, 1983.
[12]  R. F. King, “A family of fourth order methods for nonlinear equations,” SIAM Journal on Numerical Analysis, vol. 10, pp. 876–879, 1973.
[13]  C. Chun and B. Neta, “A third-order modification of Newton's method for multiple roots,” Applied Mathematics and Computation, vol. 211, no. 2, pp. 474–479, 2009.
[14]  A. Cordero and J. R. Torregrosa, “A class of Steffensen type methods with optimal order of convergence,” Applied Mathematics and Computation, vol. 217, no. 19, pp. 7653–7659, 2011.
[15]  Q. Zheng, J. Li, and F. Huang, “An optimal Steffensen-type family for solving nonlinear equations,” Applied Mathematics and Computation, vol. 217, no. 23, pp. 9592–9597, 2011.
[16]  M. S. Petković, J. Džunić, and B. Neta, “Interpolatory multipoint methods with memory for solving nonlinear equations,” Applied Mathematics and Computation, vol. 218, no. 6, pp. 2533–2541, 2011.
[17]  J. Džunić, M. S. Petković, and L. D. Petković, “Three-point methods with and without memory for solving nonlinear equations,” Applied Mathematics and Computation, vol. 218, no. 9, pp. 4917–4927, 2012.
