New Root-Seeking Algorithms
04-02-2017, 09:28 PM
Post: #1
New Root-Seeking Algorithms
I am happy to post two new root-seeking algorithms: the Super Secant and Hyper Secant methods. Both build on the legacy secant method (itself a rough approximation of Newton's method) and refine the guess more than once per iteration. Click here to download a ZIP file that contains a PDF document and an Excel file showing how these algorithms perform compared with the Newton, Halley, and Ostrowski methods.
Enjoy! Namir
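Since the new methods build on the secant method, here is a minimal Python sketch of the plain secant iteration for orientation; the Super Secant and Hyper Secant refinements themselves are described in the PDF, so this is only the baseline, not the new algorithms.
Code:
# Classic secant iteration: each step uses the two most recent guesses.
def secant(f, x0, x1, toler=1e-10, max_iter=100):
    f0, f1 = f(x0), f(x1)
    for i in range(max_iter):
        if f1 == f0:                               # avoid division by zero
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)       # secant step
        if abs(x2 - x1) < toler:
            return x2, i + 1
        x0, f0 = x1, f1
        x1, f1 = x2, f(x2)
    return x1, i + 1

# Example: root of x**3 - 2x - 5 near 2.0945515
print(secant(lambda x: x**3 - 2*x - 5, 2.0, 3.0))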
04-02-2017, 09:54 PM
Post: #2
RE: New Root-Seeking Algorithms
Nice work! I have to read it slowly, I just had a glance.
May I make some remarks about formatting and packaging? Both are meant for future work, since you publish a lot.
For the format: could you use justified text ("alignment justified")? It is more pleasant to read.
For the packaging: the file is obviously intended for someone with a computer, but reading a PDF from other platforms is not so rare. Would it be possible to offer both the package (Excel + PDF) and the PDF duplicated on its own, so one can at least get to the PDF directly? If you want to keep the single package, there is also the possibility to embed files in a PDF; a free tool that does this is the pretty neat pdftk server (here the manual).
These are just suggestions, not criticisms. What is important is the content of the file.
Wikis are great, Contribute :)
04-02-2017, 11:12 PM
Post: #3
RE: New Root-Seeking Algorithms
(04-02-2017 09:54 PM)pier4r Wrote: Nice work! I have to read it slowly, I just had a glance.
I write the document in MS Word and, when I am ready to publish, I save it as a PDF. So you are suggesting I use "alignment justified" in Word?
Namir
04-02-2017, 11:15 PM
Post: #4
RE: New Root-Seeking Algorithms
(04-02-2017 09:54 PM)pier4r Wrote: Nice work! I have to read it slowly, I just had a glance.
I tested "alignment justified" on the Word document and I agree with you that it looks nicer!!
Namir
04-03-2017, 09:23 PM
Post: #5
RE: New Root-Seeking Algorithms
Have you tried any of these new methods on an HP-67?
Regards, Bob
04-03-2017, 09:32 PM
Post: #6
RE: New Root-Seeking Algorithms
Excellent work, Namir!
I'm looking forward to hearing your presentation in September.
04-04-2017, 08:43 AM
(This post was last modified: 04-04-2017 08:49 AM by emece67.)
Post: #7
RE: New Root-Seeking Algorithms
Thanks again, Namir, for your work.
As you know, I'm quite partial to the Ostrowski method, so when I read your paper I was surprised by its mediocre performance and decided to check what the problem was.
In your code the derivative is approximated as the ratio of two increments, but the constant you use (the 0.01 in the computation of h as h = 0.01 * (1 + Abs(X))) is far too large for the Ostrowski method; it needs a much smaller one. Changing it to h = 3.0e-7 * (1 + Abs(X)) (a nice value if the floats are 64-bit, as I think they are in VB), you will see (*) that the Ostrowski numbers in the tables turn red in all test cases except 2: Custom1 (it no longer fails; it is now 17-53, the same number of iterations but 2 more function evaluations than your Hyper-Secant method) and equation 6 with x0 = 1 (now 15-45, second behind Halley). Perhaps the other methods in this comparison would also benefit from such a change in the computation of h.
Your approach in the Hyper-Secant method looks really interesting to me.
Regards.
(*) I performed these computations in Python with a precision of 15 digits; in VB the results may be different. In any case, my Python code returned the very same results (number of iterations and total function calls) for all test cases in your table, so I am confident about my statement about the change in the computation of h. Also, the Python code does some sanity checks (so as not to divide by 0 and so on), anticipating problems such as the derivative going to 0. I'm not sure at all whether the VB code may have problems of that kind when the constant is changed.
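To make the suggestion concrete, here is a minimal Python sketch of the classic two-step Ostrowski iteration with a forward-difference derivative, where the only change from the setup described above is the constant in h (3.0e-7 instead of 0.01). This is a generic reconstruction, not Namir's VB code nor the Python code mentioned above.
Code:
# Classic two-step Ostrowski iteration with a numerical (forward-difference) slope.
def ostrowski(f, x, toler=1e-10, max_iter=100):
    calls = 0
    for i in range(1, max_iter + 1):
        fx = f(x); calls += 1
        h = 3.0e-7 * (1.0 + abs(x))            # step for the numerical derivative
        d = (f(x + h) - fx) / h; calls += 1     # forward-difference slope
        if d == 0.0:                            # sanity check: flat slope
            break
        y = x - fx / d                          # Newton half-step
        fy = f(y); calls += 1
        denom = fx - 2.0 * fy
        if denom == 0.0:                        # sanity check: avoid division by zero
            x = y
            break
        x_new = y - fy * fx / (denom * d)       # Ostrowski correction
        if abs(x_new - x) < toler or fy == 0.0:
            x = x_new
            break
        x = x_new
    return x, i, calls

# Example: root of x**3 - 2x - 5, starting from x = 2
print(ostrowski(lambda x: x**3 - 2*x - 5, 2.0))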
04-04-2017, 09:22 AM
Post: #8
RE: New Root-Seeking Algorithms
I've been playing with root finding methods for about 50 years or so, so I thought I'd post some links to newer papers on the subject.
https://arxiv.org/pdf/1702.03174.pdf (Using multistep integrators to find roots)
https://arxiv.org/pdf/1505.05573.pdf (Newton's method in function spaces)
https://arxiv.org/pdf/1501.05033.pdf (All roots of polynomials)
https://arxiv.org/pdf/1410.2202.pdf (Newton Ellipsoid method)
https://arxiv.org/pdf/1309.4734.pdf (Inexact Newton's method)
https://arxiv.org/pdf/1112.6263.pdf (Root finding in Boolean Algebra)
https://arxiv.org/pdf/1110.3430.pdf (Errors in the inexact Newton method)
https://arxiv.org/pdf/1109.2503.pdf (Quaternion equations)
https://arxiv.org/pdf/1004.3412.pdf (Brent's method)
https://arxiv.org/pdf/1308.4217.pdf (Another all-roots method)
04-04-2017, 09:33 AM
Post: #9
RE: New Root-Seeking Algorithms
(04-04-2017 09:22 AM)ttw Wrote: I've been playing with root finding methods for about 50 years or so, so I thought I'd post some links to newer papers on the subject.
Thanks for sharing!
Wikis are great, Contribute :)
04-04-2017, 10:11 AM
Post: #10
RE: New Root-Seeking Algorithms
04-04-2017, 10:14 AM
Post: #11
RE: New Root-Seeking Algorithms
(04-04-2017 09:22 AM)ttw Wrote: I've been playing with root finding methods for about 50 years or so, so I thought I'd post some links to newer papers on the subject.
Wow!! Thanks for the links. I will check each and every one of them.
Namir
04-04-2017, 01:13 PM
(This post was last modified: 04-04-2017 01:17 PM by Namir.)
Post: #12
RE: New Root-Seeking Algorithms
(04-04-2017 08:43 AM)emece67 Wrote: Thanks again, Namir, for your work.
Cesar,
Thank you so much for your comments. I used h = 0.01 * (1 + |x|) for fear that much smaller values would cause computational errors; by this I mean that the limited accuracy of vintage calculators may give a slope of zero if h is far too small. Obviously I am wrong. Replacing 0.01 with a smaller value for all the methods should be interesting. I think I am going to compare how repeatedly reducing 0.01 by a factor of 10 affects at least the Newton method (and maybe Halley and Ostrowski too), looking at the number of iterations and the number of function calls needed to reach a refined guess for the root, for a given tolerance value.
Namir
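A rough Python sketch of that experiment (my own illustration: Newton's method with a forward-difference derivative; the test function is a sample, not one of the equations from the paper):
Code:
# Shrink the constant c in h = c*(1 + |x|) by a factor of 10 each run and
# record iterations and total function calls.
def newton_fd(f, x, c, toler=1e-10, max_iter=100):
    calls = 0
    for i in range(1, max_iter + 1):
        fx = f(x); calls += 1
        h = c * (1.0 + abs(x))
        d = (f(x + h) - fx) / h; calls += 1     # forward-difference slope
        if d == 0.0:                            # flat numerical slope: give up
            break
        x_new = x - fx / d
        if abs(x_new - x) < toler:
            return x_new, i, calls
        x = x_new
    return x, i, calls

f = lambda x: x**3 - 2*x - 5                    # sample test function
for c in [10.0**-k for k in range(2, 9)]:       # 0.01 down to 1e-8
    root, iters, calls = newton_fd(f, 2.0, c)
    print(f"c = {c:.0e}   root = {root:.12f}   iterations = {iters}   f-calls = {calls}")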
04-04-2017, 02:20 PM
Post: #13
RE: New Root-Seeking Algorithms
Another possibility is to use variable step lengths. I don't have a quick rule of thumb, but there are some in the various references. The idea is to automatically adjust the step length as the computation proceeds. This is often done with the Levenberg-Marquardt method for multidimensional problems.
If error estimation is easy, one just increases the step length until things stop working, then backs off. (There are good discussions on the Wiki for the Nelder-Mead creeping simplex optimization method.) I used to lengthen by 3 and shrink by 2, as the actual numbers don't matter; one eventually gets 3^n/2^m with n stretches and m shrinks. I also found a new reference: http://citeseerx.ist.psu.edu/viewdoc/dow...1&type=pdf
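One possible reading of the stretch-by-3 / shrink-by-2 rule, sketched in Python and applied here to damping a Newton step. The post states the rule in general terms; using it for step damping, and accepting a step only when |f| decreases, are assumptions of this sketch, not something stated above.
Code:
def damped_newton(f, df, x, toler=1e-12, max_iter=100):
    lam = 1.0                                   # current fraction of the Newton step
    fx = f(x)
    for i in range(max_iter):
        slope = df(x)
        if slope == 0.0:
            break
        trial = x + lam * (-fx / slope)
        ftrial = f(trial)
        if abs(ftrial) < abs(fx):               # the step "worked": accept and stretch by 3
            x, fx = trial, ftrial
            lam = min(3.0 * lam, 1.0)           # capped at a full Newton step here
        else:                                   # it didn't: back off by 2
            lam /= 2.0
        if abs(fx) < toler:
            break
    return x, i + 1

# Example with f(x) = x**3 - 2x - 5 and its derivative
print(damped_newton(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, 2.0))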
04-04-2017, 02:39 PM
(This post was last modified: 04-04-2017 02:43 PM by emece67.)
Post: #14
RE: New Root-Seeking Algorithms
You will also need to modify the stopping criterion, as simply relying on the difference between the last two root estimates is not adequate. Convergence is sometimes so fast that at the 2nd or 3rd iteration the function already evaluates to 0, even though Abs(X - LastX) is still greater than Toler.
I ended up with: Code:
I can confirm that the other methods also benefit from decreasing the constant in the computation of h (your Hyper-Secant included; I have not tested the Super-Secant).
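A minimal combined stopping test along the lines just described might look like this in Python (my sketch under those assumptions, not the code actually used above):
Code:
# Stop when successive estimates agree OR the function value is already
# numerically zero; ftoler is an assumed small tolerance on |f(x)|.
def converged(x, last_x, fx, toler=1e-10, ftoler=1e-15):
    return abs(x - last_x) <= toler or abs(fx) <= ftoler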
04-05-2017, 09:47 PM
Post: #15
RE: New Root-Seeking Algorithms
Hi Namir,
here is an HP-41 program that uses quadratic interpolation to find a root of f(x) = 0.
It takes 3 guesses in registers X Y Z and returns a root x in the X-register and f(x) in the Y-register (which should be a "small" number) if flag F02 is clear. It should also find double roots. If F02 is set, "SLV2" tries to find an extremum. In both cases, R00 = function name is to be initialized.
Here is the listing:
01 LBL "SLV2"
02 STO 01
03 RDN
04 STO 02
05 X<>Y
06 STO 03
07 XEQ IND 00
08 STO 06
09 RCL 02
10 XEQ IND 00
11 STO 05
12 LBL 01
13 VIEW 01
14 RCL 01
15 XEQ IND 00
16 STO 04
17 RCL 02
18 RCL 03
19 -
20 *
21 ENTER
22 STO 07
23 RCL 01
24 RCL 03
25 -
26 STO 08
27 STO 10
28 ST* Z
29 RCL 05
30 *
31 ST* 08
32 -
33 RCL 02
34 RCL 01
35 -
36 STO 09
37 ST- 10
38 ST* Z
39 RCL 06
40 *
41 ST* 09
42 -
43 STO 06
44 *
45 RCL 07
46 RCL 10
47 *
48 RCL 08
49 -
50 RCL 09
51 +
52 2
53 /
54 STO 10
55 X^2
56 +
57 FC? 02
58 X<0?
59 GTO 02
60 SQRT
61 RCL 10
62 SIGN
63 *
64 RCL 10
65 +
66 GTO 03
67 LBL 02
68 RCL 10
69 CHS
70 RCL 06
71 LBL 03
72 X#0?
73 /
74 RCL 01
75 +
76 X<> 01
77 X<> 02
78 STO 03
79 RCL 04
80 X<> 05
81 STO 06
82 RCL 01
83 RCL 02
84 X#Y?
85 GTO 01
86 RCL 04
87 X<>Y
88 END
( 113 bytes / SIZE 011 )
Number of function evaluations = 2 + number of iterations.
Best wishes,
JM.
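As a readable companion to SLV2 above, here is a Python sketch of the textbook quadratic-interpolation (Muller-style) root step through three guesses. It is NOT a line-by-line translation of the HP-41 program: the extremum mode (flag 02) is omitted, and a negative discriminant is simply clamped to zero here, which is a simplification of how SLV2 branches.
Code:
from math import sqrt, copysign

def muller_step(f, x0, x1, x2):
    f0, f1, f2 = f(x0), f(x1), f(x2)
    d1 = (f1 - f0) / (x1 - x0)
    d2 = (f2 - f1) / (x2 - x1)
    a = (d2 - d1) / (x2 - x0)                   # curvature of the interpolating parabola
    b = d2 + a * (x2 - x1)                      # its slope at x2
    disc = max(b * b - 4.0 * a * f2, 0.0)       # clamp: real arithmetic only
    denom = b + copysign(sqrt(disc), b)         # pick the root closest to x2
    if denom == 0.0:
        return x2
    return x2 - 2.0 * f2 / denom

def muller(f, x0, x1, x2, toler=1e-12, max_iter=100):
    for _ in range(max_iter):
        x3 = muller_step(f, x0, x1, x2)
        if abs(x3 - x2) < toler:
            return x3
        x0, x1, x2 = x1, x2, x3
    return x2

# Example: root of x**3 - 2x - 5 from the guesses 1, 2, 3
print(muller(lambda x: x**3 - 2*x - 5, 1.0, 2.0, 3.0))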
04-05-2017, 10:00 PM
Post: #16
RE: New Root-Seeking Algorithms
(04-05-2017 09:47 PM)JMBaillard Wrote: Hi Namir,
Thank you JM. A few years ago I presented, at one of the HHC conferences, an algorithm that used inverse quadratic Lagrangian interpolation to find the root of a function. I came to realize that there was some "politics" among mathematicians: for some reason many avoided methods like inverse quadratic Lagrangian interpolation for finding roots.
Namir
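For readers unfamiliar with the term, here is a minimal Python illustration of one inverse quadratic (Lagrangian) interpolation step: fit x as a quadratic in f through three points and evaluate it at f = 0. This only illustrates the technique itself, not the algorithm presented at HHC.
Code:
# The three f values must be pairwise distinct, or the Lagrange weights blow up.
def inverse_quadratic_step(x0, x1, x2, f0, f1, f2):
    return (x0 * f1 * f2 / ((f0 - f1) * (f0 - f2))
          + x1 * f0 * f2 / ((f1 - f0) * (f1 - f2))
          + x2 * f0 * f1 / ((f2 - f0) * (f2 - f1)))

# Example with f(x) = x**3 - 2x - 5 and the points 1, 2, 3
f = lambda x: x**3 - 2*x - 5
print(inverse_quadratic_step(1.0, 2.0, 3.0, f(1.0), f(2.0), f(3.0)))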
04-06-2017, 01:51 AM
Post: #17
RE: New Root-Seeking Algorithms
There are a couple more root finders that I have used. One is Brent's algorithm (inverse quadratic interpolation with bisection) and the other is the "Illinois" algorithm (which I heard of long before the published work). There is another modification of Newton's method that raises its effective rate (to Sqrt(8)); the idea is to evaluate f(x) and f'(x) at different x's (http://www.sciencedirect.com/science/art...30). There's also Chebychev's method, which is (like Halley's) based on a Taylor series; Chebychev used the series and Halley used the continued fraction for the series. Also a guy named Galindo did rather well with bunches of tests and algorithms.
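A minimal Python sketch of the "Illinois" modification of regula falsi mentioned above: ordinary false position, except that when the same bracket endpoint is kept on two successive iterations its stored function value is halved, which prevents one-sided stagnation. This is a generic textbook version, not code from any of the papers linked in this thread.
Code:
def illinois(f, a, b, toler=1e-12, max_iter=100):
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("root must be bracketed")
    side = 0                                    # which endpoint was retained last time
    for _ in range(max_iter):
        x = (a * fb - b * fa) / (fb - fa)       # false-position point
        fx = f(x)
        if abs(fx) < toler:
            return x
        if fa * fx < 0:                         # root is in [a, x]
            b, fb = x, fx
            if side == -1:
                fa *= 0.5                       # Illinois trick
            side = -1
        else:                                   # root is in [x, b]
            a, fa = x, fx
            if side == 1:
                fb *= 0.5
            side = 1
    return x

# Example: bracketed root of x**3 - 2x - 5 in [2, 3]
print(illinois(lambda x: x**3 - 2*x - 5, 2.0, 3.0))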
04-06-2017, 03:17 AM
Post: #18
RE: New Root-Seeking Algorithms
(04-06-2017 01:51 AM)ttw Wrote: There are a couple more root finders that I have used. One is Brent's algorithm (inverse quadratic interpolation with bisection) and the other is the "Illinois" algorithm (which I heard of long before the published work). There is another modification of Newton's method that raises its effective rate (to Sqrt(8)); the idea is to evaluate f(x) and f'(x) at different x's (http://www.sciencedirect.com/science/art...30). There's also Chebychev's method, which is (like Halley's) based on a Taylor series; Chebychev used the series and Halley used the continued fraction for the series. Also a guy named Galindo did rather well with bunches of tests and algorithms.
Your link is confusing my browser! Can you please send one that works? I would like to read the material you are pointing to.
04-06-2017, 05:11 AM
Post: #19
RE: New Root-Seeking Algorithms
(04-06-2017 03:17 AM)Namir Wrote: Your link is confusing my browser! Can you please send one that works? I would like to read the material you are pointing to.
The URL ran into the following text. It should be: http://www.sciencedirect.com/science/art...5913002930
04-06-2017, 09:05 AM
Post: #20
RE: New Root-Seeking Algorithms
(04-06-2017 05:11 AM)DMaier Wrote: (04-06-2017 03:17 AM)Namir Wrote: Your link is confusing my browser! Can you please send one that works? I would like to read the material you are pointing to.
Thank you very much for the link. The article itself has more PDF links to other free articles. I also google-searched for the pdf-for-purchase articles mentioned in the reference section and found even more PDF articles to download. So thank you for the link, as it led to a bonanza of articles!
Namir