New Root-Seeking Algorithms
04-09-2017, 09:33 PM
Post: #21
RE: New Root-Seeking Algorithms
By happenstance, I came across more articles (previously unknown to me) on zero-finding. One is a set of notes by Bill Kahan.

https://people.eecs.berkeley.edu/~wkahan...lRoots.pdf

I'll probably implement something like the Illinois algorithm with a double speed secant method. I've been using root-finding of non-linear equations to get percentage points of probability curves (the t-distribution and F-distribution mostly).
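For reference, the Illinois variant of regula falsi is only a few lines. Here is a generic sketch (the function names, tolerances, and loop structure are my own choices, not taken from Kahan's notes):

```python
import math

def illinois(f, a, b, tol=1e-12, max_iter=100):
    """Regula falsi with the Illinois modification: when the same
    endpoint is kept twice in a row, its function value is halved so
    the bracket keeps shrinking (avoids the one-sided stall)."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("root must be bracketed: f(a)*f(b) < 0")
    for _ in range(max_iter):
        # false-position (secant) point of the current bracket
        c = b - fb * (b - a) / (fb - fa)
        fc = f(c)
        if abs(fc) < tol:
            return c
        if fc * fb < 0:
            a, fa = b, fb          # b and c now bracket the root
        else:
            fa *= 0.5              # a kept again: down-weight its f value
        b, fb = c, fc
    return b

root = illinois(lambda x: math.cos(x) - x, 0.0, 1.0)
# converges to x ≈ 0.739085, where cos(x) = x
```

The halving step is what distinguishes Illinois from plain false position; without it, one endpoint can stay fixed and convergence degrades to linear.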
04-10-2017, 05:01 AM
Post: #22
RE: New Root-Seeking Algorithms
(04-09-2017 09:33 PM)ttw Wrote:  By happenstance, I came across more articles (previously unknown to me) on zero-finding. One is a set of notes by Bill Kahan.

https://people.eecs.berkeley.edu/~wkahan...lRoots.pdf

I'll probably implement something like the Illinois algorithm with a double speed secant method. I've been using root-finding of non-linear equations to get percentage points of probability curves (the t-distribution and F-distribution mostly).

Thanks for the PDF. Kahan articles are always welcome!!
04-13-2017, 01:23 PM
Post: #23
RE: New Root-Seeking Algorithms
I am currently working on the set of modified Newton's methods mentioned in the links given in an earlier message in this thread, plus other similar methods that I stumbled upon on the web. So far I have about 39 methods and their variants. Some algorithms give a single equation, while others give families of equations.

When I am done, I will publish the Excel file on my web site. The VBA code contains references to the articles (which I also plan to include in a ZIP file), and I will include a Word doc file with a very short comment on each of the various algorithms.

Namir

Still truckin'
04-15-2017, 10:58 PM
Post: #24
RE: New Root-Seeking Algorithms
Hello All,

I have posted on my web site a ZIP file that contains the following:

1) An Excel file that tests about 59 algorithms (and variants) of the modified Newton's method. The file has several worksheets that test various functions (some with different initial guesses) and accessible VBA code that shows the implementation of the various modified Newton methods.
2) A PDF file that contains a short summary of the results.
3) A set of PDF files containing the articles I used to obtain the equations used in the various modified Newton methods.

To download the ZIP file click here.

Enjoy!

Namir
04-16-2017, 01:06 PM (This post was last modified: 04-16-2017 01:07 PM by Namir.)
Post: #25
RE: New Root-Seeking Algorithms
I rearranged the contents of the ZIP file that you can download from my last message. I created a subfolder for the articles and left my PDF comments file and the main Excel file in the ZIP's root folder. This new arrangement makes it easy to spot the PDF that contains my brief comments on the results.

Namir
04-20-2017, 05:05 AM
Post: #26
RE: New Root-Seeking Algorithms
http://www.math.cornell.edu/~hubbard/New...tiones.pdf

How To Find All Roots of Complex Polynomials by Newton's Method

The paper uses the dynamics of Newton's method from differing starting points to guarantee convergence to each root.
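The idea can be sketched as: run plain Newton iteration from a set of starting points on a circle surrounding the roots, and collect the distinct limits. This toy Python version (my own construction, not code from the paper) recovers all three cube roots of unity:

```python
import cmath

def newton(p, dp, z, tol=1e-12, max_iter=60):
    """Plain Newton iteration in the complex plane."""
    for _ in range(max_iter):
        d = dp(z)
        if d == 0:
            return None
        step = p(z) / d
        z -= step
        if abs(step) < tol:
            return z
    return None

p  = lambda z: z**3 - 1        # roots: the three cube roots of unity
dp = lambda z: 3 * z**2

# starting points on a circle of radius 1.5, offset so none land on the
# basin boundaries along the rays at 60, 180, and 300 degrees
starts = [1.5 * cmath.exp(2j * cmath.pi * (k + 0.5) / 12) for k in range(12)]

roots = []
for z0 in starts:
    r = newton(p, dp, z0)
    if r is not None and all(abs(r - q) > 1e-6 for q in roots):
        roots.append(r)
# roots now holds all three cube roots of unity
```

The paper's contribution is making this rigorous: a universal set of starting points, depending only on the degree, from which every root is reached.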
11-25-2020, 09:44 PM (This post was last modified: 11-27-2020 04:00 PM by robve.)
Post: #27
RE: New Root-Seeking Algorithms
(04-15-2017 10:58 PM)Namir Wrote:  Hello All,

I have posted on my web site a ZIP file that contains the following:
[...]

Great work to tabulate the results. Hats off.

I ran a few extra tests with the first zip file, i.e. the sheet with the Newton, Halley, Ostrowski, Super Secant, and Hyper Secant methods.
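For anyone who wants to reproduce part of the comparison outside Excel, here is the standard fourth-order Ostrowski method in Python (this is the textbook two-step form; the spreadsheet's variant may differ in detail):

```python
import math

def ostrowski(f, df, x, tol=1e-14, max_iter=50):
    """Ostrowski's two-step, fourth-order method:
         y = x - f(x)/f'(x)
         x = y - (f(y)/f'(x)) * f(x)/(f(x) - 2 f(y))"""
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        dfx = df(x)
        if dfx == 0:
            break                      # flat spot: cannot take a step
        y  = x - fx / dfx
        fy = f(y)
        denom = fx - 2.0 * fy
        if denom == 0:
            break
        x = y - (fy / dfx) * (fx / denom)
    return x

r = ostrowski(lambda x: x * x - 2, lambda x: 2 * x, 1.5)
# r ≈ 1.41421356..., i.e. sqrt(2)
```

Ostrowski gets fourth-order convergence from only two function evaluations and one derivative evaluation per step, which is why it often beats Newton in these tables.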

The most difficult functions that I added to the tests are sign(x)*sqrt(abs(x)) and sqrt(abs(x)). The second of these two is impossible to handle with bisection and secant methods, because those methods (and methods such as Brent-Dekker) require bracketing the root by two points x0 and x1, such that f(x0)*f(x1) < 0 (i.e. have opposite signs). For other methods that do not require bracketing, starting at any nonzero guess for x, the methods typically do not converge but bounce around the root x=0 without getting any closer. I came up with these as the most extreme examples of the problem illustrated in Numerical Recipes 2nd ed Ch 9.4 Fig. 9.4.3. The second function is non-differentiable, which adds another twist.
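The bouncing can be made precise for the first function: for f(x) = sign(x)*sqrt(|x|) we have f'(x) = 1/(2*sqrt(|x|)) for x ≠ 0, so a Newton step gives x - f(x)/f'(x) = x - 2x = -x. The iterate flips sign with no change in magnitude, so Newton oscillates forever. A tiny Python check (my own demo, not taken from the spreadsheet):

```python
import math

def f(x):
    return math.copysign(math.sqrt(abs(x)), x)   # sign(x)*sqrt(|x|)

def df(x):
    return 1.0 / (2.0 * math.sqrt(abs(x)))       # derivative, valid for x != 0

x = 1.0
trace = [x]
for _ in range(6):
    x = x - f(x) / df(x)   # algebraically: x -> x - 2x = -x
    trace.append(x)
# trace is [1.0, -1.0, 1.0, -1.0, ...]: Newton never approaches the root
```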

I was surprised to find that the Hyper Secant method actually converged for sign(x)*sqrt(abs(x)) starting with the initial guess x=1. It produces x=-0.005306656. The question is: is this result intended? Perhaps the convergence check is not accurate? I would expect the method to continue until x=0 within 1E-9.

A strange result for function sqrt(abs(x)) is produced when starting with the guess x=1. In this case both the Super Secant and Hyper Secant methods loop with the value x=65535 for which f(x) is also 65535. This happens after about 100 iterations. It is odd to see both the Super Secant and Hyper Secant methods lock onto 65535. Probably a VB bug.

Running Halley on a pocket computer (because it's BCD) does produce a series of x values that get successively closer to the root x=0, for both of these functions, when starting with a nonzero guess for x. Newton and Ostrowski completely fail to show any direction for x, but Ostrowski does successively get closer to the root of the function sign(x)*sqrt(abs(x)). Also a non-CAS calculator like the Casio fx-115ES PLUS 2nd ed produces a value close to x=0, but takes significant time to do so. But it did not crash.

I found a minor issue in the VB code of Halley: the loop does not terminate. Adding a check for R>1000 fixed this.
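The fix amounts to bounding the iteration count. A minimal Python sketch of Halley's method with such a guard (I use analytic derivatives here for brevity; the VBA code approximates them numerically):

```python
def halley(f, df, d2f, x, tol=1e-10, max_iter=1000):
    """Halley's method with an iteration cap so the loop always
    terminates, even on functions where it cannot converge."""
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        dfx, d2fx = df(x), d2f(x)
        denom = 2.0 * dfx * dfx - fx * d2fx
        if denom == 0.0:
            break                      # give up rather than divide by zero
        x -= 2.0 * fx * dfx / denom
    return x                           # best estimate when the cap is hit

r = halley(lambda x: x**3 - 2, lambda x: 3 * x * x, lambda x: 6 * x, 1.0)
# r ≈ 2**(1/3)
```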

Thanks for posting this and keep up the good work.

EDIT: I neglected to mention that I am using a smaller h value on my machines, which was also mentioned in previous replies as an improvement. I found that h=sqrt(MachEps)*|x| works very well for |x|>1 and h=sqrt(MachEps) for |x|<1, where MachEps is the machine epsilon, e.g. h=3E-5 for MachEps=1E-9. You can also fix h to this value before the loop, which is fine as long as x does not change by orders of magnitude inside the loop; that should not happen for reasonable initial guesses of x. To prevent the next guess from flying off the scale, you might want to check that |f(x)| < m*|f'(x)| for a large value of m (so that the step |f(x)/f'(x)| stays below m), such as m=10^(log10(maxFloat)/2), e.g. m=1E50 or less for maxFloat=1E99. Note that the value of f'(x) is an approximation of the derivative (Quasi-Newton, Halley, etc.) or the secant slope.
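As an illustration of that h choice, here is a central-difference derivative with the step scaled to x (the wrapper names are mine; I use the double-precision MachEps ≈ 2.2E-16 rather than the pocket computer's 1E-9):

```python
import math

MACH_EPS = 2.2e-16   # IEEE double-precision machine epsilon

def num_deriv(f, x):
    """Central difference with the step scaled to x:
    h = sqrt(MachEps)*|x| for |x| > 1, h = sqrt(MachEps) otherwise."""
    h = math.sqrt(MACH_EPS) * max(1.0, abs(x))
    return (f(x + h) - f(x - h)) / (2.0 * h)

d = num_deriv(math.sin, 1.0)
# d ≈ cos(1) ≈ 0.5403
```

Scaling h by |x| matters because a fixed small h loses all significant digits in f(x+h)-f(x-h) once |x| is large.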

- Robert

"I count on old friends to remain rational"