On Convergence Rates of Root-Seeking Methods
03-05-2018, 03:52 AM
(This post was last modified: 03-05-2018 03:54 AM by Paul Dale.)
Post: #18
RE: On Convergence Rates of Root-Seeking Methods
(03-02-2018 01:00 PM)Mike (Stgt) Wrote: to compute square roots up to 100'000 digits and more?

Newton's method is quite widely used. A good initial estimate is required; the usual double-precision hardware sqrt function is a start and might even be good enough. It might be worth looking into Halley's method, which has cubic convergence but needs the second derivative, which is a constant here. However, I doubt the extra computation would be worthwhile. I think higher-order Householder methods will fail because all of the higher-order derivatives are zero. You could also look at Karatsuba and FFT square roots; I remember them being Newton's method with improvements to the basic arithmetic operations.

Mike (Stgt) Wrote: And what for 3rd and 4th roots?

For cube roots and above, I'd use Newton's method because the arithmetic stays simple. Higher-order methods might produce a benefit, but again I doubt the extra computation is worthwhile.

Mike (Stgt) Wrote: For the latter, would a sqrt(sqrt()) be faster than a special 4th root process?

I doubt it. Compare iterating a = \( \frac{1}{4} \left( 3a + \frac{z}{a^3} \right) \) once with iterating a = \( \frac{1}{2} \left( a + \frac{z}{a} \right) \) twice. Crudely, the 4th-root step trades a divide and an addition for the two extra multiplies needed to form \( a^3 \), and the divide is likely the more expensive. I'm assuming the divide by 2 or 4 is free; if not, it is worse for the double square root. Either way, I think sqrt(sqrt()) would be faster than using a general y^x operation.

Mike (Stgt) Wrote: Would a CORDIC algorithm for so many digits outreach simple Newton (iterating a = .5 * (a + z / a))?

Not a chance. CORDIC converges linearly in the number of digits, Newton quadratically. CORDIC might, however, be usable to provide a sufficiently accurate starting point for Newton's method.

Pauli
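A minimal sketch of the scheme described above: Newton's iteration a ← (a + z/a)/2 seeded from the hardware double sqrt, with the working precision doubled each step (quadratic convergence roughly doubles the number of correct digits per iteration, so there is no point carrying full precision early on). It uses Python's `decimal` module purely for illustration; the function name `newton_sqrt` and the guard-digit count are my own choices, and a real 100'000-digit implementation would sit on a bignum library with FFT multiplication.

```python
import math
from decimal import Decimal, getcontext

def newton_sqrt(z, digits):
    """Square root of z to `digits` significant digits via Newton's method."""
    a = Decimal(repr(math.sqrt(z)))    # hardware sqrt: ~16 correct digits
    zd = Decimal(z)
    prec = 16
    while prec < digits:
        prec = min(2 * prec, digits)   # precision doubles with the iteration
        getcontext().prec = prec + 4   # a few guard digits
        a = (a + zd / a) / 2           # Newton step for f(x) = x^2 - z
    getcontext().prec = digits
    return +a                          # unary + rounds to the target precision
```

For example, `newton_sqrt(2, 50)` reaches 50 digits in only two iterations beyond the hardware seed.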
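A small check of the 4th-root trade-off discussed above, in the same `decimal` sketch style: the direct Newton step a ← (3a + z/a³)/4 versus two plain square-root steps applied as sqrt(sqrt(z)). The helper names and seed values are illustrative assumptions; both routes converge quadratically and agree to working precision.

```python
from decimal import Decimal, getcontext

def fourth_root_step(a, z):
    """One Newton step for f(x) = x^4 - z: a <- (3a + z/a^3)/4."""
    return (3 * a + z / (a * a * a)) / 4

def sqrt_step(a, z):
    """One Newton step for f(x) = x^2 - z: a <- (a + z/a)/2."""
    return (a + z / a) / 2

getcontext().prec = 40
z = Decimal(10)

# Direct 4th root: two multiplies per step to form a^3, one divide.
a = Decimal("1.8")                  # rough seed for 10 ** 0.25
for _ in range(6):
    a = fourth_root_step(a, z)

# sqrt(sqrt()): one divide and one addition per step, twice over.
s = Decimal("3.2")                  # rough seed for sqrt(10)
for _ in range(6):
    s = sqrt_step(s, z)
b = Decimal("1.8")                  # rough seed for sqrt(sqrt(10))
for _ in range(6):
    b = sqrt_step(b, s)
# a and b now agree to roughly the working precision.
```

Which route wins in practice depends on the relative cost of a full-precision divide versus two multiplies, exactly the trade-off weighed in the post.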