On Convergence Rates of Root-Seeking Methods
03-11-2018, 08:01 PM (This post was last modified: 03-11-2018 08:10 PM by emece67.)
Post: #33
RE: On Convergence Rates of Root-Seeking Methods
(03-11-2018 02:26 PM)Claudio L. Wrote:  Single starting-point:
* Newton-Raphson
<add others here!>

I'll add the Ostrowski-Traub method. It achieves 4th-order convergence with only 3 function evaluations per iteration, while remaining quite simple.

When I wrote a complex solver for the wp34s (it is now the complex solver in its library) I ran tests with several different single-starting-point solvers, up to 1024 digits, and IMHO this method gave the best overall performance (see this post here and the next one in the same thread). It has an efficiency index of 1.587, higher than Halley (1.442) and Newton (1.414), and although there are methods with a better efficiency index (Kung-Traub reaches 1.682), they are more complex and, in practice, often slower because of that complexity.
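For reference, here is a minimal sketch of the Ostrowski-Traub iteration in Python. The function names, stopping test and example are my own, not taken from the wp34s code:

```python
def ostrowski(f, df, x, tol=1e-12, max_iter=50):
    # Each iteration costs 3 evaluations -- f(x), f'(x) and f(y) --
    # yet the step is 4th-order: efficiency index 4**(1/3) = 1.587.
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        dfx = df(x)
        y = x - fx / dfx                         # Newton half-step
        fy = f(y)
        x = y - fy * fx / ((fx - 2 * fy) * dfx)  # Ostrowski correction
    return x

# Example: the real root of x^3 - 2
root = ostrowski(lambda x: x**3 - 2, lambda x: 3 * x**2, 1.0)
```

Note how the correction step reuses f(x) and f'(x) from the Newton half-step, so the only extra cost over plain Newton is the single evaluation f(y).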

In any case, keep in mind that many of these single-starting-point methods rely on computing the derivative. Often this derivative is estimated with a forward difference:
\[f'(x)\approx{f(x+\Delta x)-f(x)\over\Delta x}\]
and, if you do not use a small enough \(\Delta x\), you end up with, say, only 6 correct digits in the derivative estimate, which is not adequate to go from a previous 500-correct-digit root estimate to a new 2000-correct-digit one. My advice: if you are working with \(d\) digits, use relative increments in \(x\) around \(10^{-d/2}\). In my tests, the way you estimate the derivative may have more impact on the execution time than the algorithm used.
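As an illustration of that rule of thumb, here is a small sketch using Python's mpmath for arbitrary precision; the helper name and the handling of x near zero are my own assumptions:

```python
from mpmath import mp, mpf

mp.dps = 1000  # work with d = 1000 digits

def approx_derivative(f, x):
    # Relative step of about 10^(-d/2): small enough that the forward
    # difference truncation error stays below ~d/2 digits, large enough
    # that cancellation in f(x+h) - f(x) does not wipe out the result.
    scale = abs(x) if x != 0 else mpf(1)
    h = scale * mpf(10) ** (-mp.dps // 2)
    return (f(x + h) - f(x)) / h
```

With this choice, each derivative estimate carries roughly d/2 correct digits, which is consistent with the doubling (or better) of correct digits per iteration that these high-order methods deliver.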

Regards.