Newton and Halley's methods with enhanced derivatives estimation
09-30-2017, 08:39 PM
Post: #1
Newton and Halley's methods with enhanced derivatives estimation
I present versions of Newton's method and Halley's method that use enhanced estimates of the first and second derivatives, based on five-point central-difference schemes. Each listing prompts you for the initial guess for the root and for the tolerance value. Line 20 in each listing defines the function whose root you seek.
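
For reference, these are the five-point central-difference approximations that line 90 (and, in the Halley listing, line 100) implements:

$$f'(x) \approx \frac{-f(x+2h) + 8f(x+h) - 8f(x-h) + f(x-2h)}{12h}$$

$$f''(x) \approx \frac{-f(x+2h) + 16f(x+h) - 30f(x) + 16f(x-h) - f(x-2h)}{12h^2}$$

Both are accurate to order h^4, whereas the simpler central-difference formulas are only order h^2.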

Here is the enhanced Newton's method

Code:
10 REM NEWTON'S METHOD WITH ENHANCED DERIVATIVE
15 REM NEXT IS THE DEFINITION OF F(X)=0
20 DEF FNX(X)=EXP(X)-3*X^2
30 INPUT "GUESS? ";X
40 INPUT "TOLER? ";T
50 H = 0.001*(1+ABS(X))
60 F0 = FNX(X)
70 F1 = FNX(X+H) @ F2 = FNX(X+2*H)
80 F9 = FNX(X-H) @ F8 = FNX(X-2*H)
90 D1 = (-F2 + 8 * F1 - 8 * F9 + F8) / 12 / H 
100 D = F0 / D1
110 X = X - D
120 DISP X @ WAIT 1
130 IF ABS(D)>=T THEN 50
140 DISP "ROOT = ";X
150 END

Here is the enhanced Halley's method

Code:
10 REM HALLEY'S METHOD WITH ENHANCED DERIVATIVE
15 REM NEXT IS THE DEFINITION OF F(X)=0
20 DEF FNX(X)=EXP(X)-3*X^2
30 INPUT "GUESS? ";X
40 INPUT "TOLER? ";T
50 H = 0.001*(1+ABS(X))
60 F0 = FNX(X)
70 F1 = FNX(X+H) @ F2 = FNX(X+2*H)
80 F9 = FNX(X-H) @ F8 = FNX(X-2*H)
90 D1 = (-F2 + 8 * F1 - 8 * F9 + F8) / 12 / H 
100 D2 = (-F2 + 16 * F1 - 30 * F0 + 16 * F9 - F8) / 12 / H ^ 2
110 D = F0 / D1 / (1 - F0 * D2 / D1 / 2 / D1)
120 X = X - D
130 DISP X @ WAIT 1
140 IF ABS(D)>=T THEN 50
150 DISP "ROOT = ";X
160 END
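
For those following along, lines 110 and 120 of the Halley listing implement the usual Halley update, with the difference approximations D1 and D2 standing in for the derivatives:

$$x_{n+1} = x_n - \frac{f(x_n)/f'(x_n)}{1 - \dfrac{f(x_n)\, f''(x_n)}{2\, f'(x_n)^2}}$$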

You are welcome to add an iteration counter to view the number of iterations needed. Moreover, such a counter can also help prevent indefinite looping in case your guess for the function you are using fails to converge.
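
For instance, a counter K and an upper limit could be spliced into the Newton listing along these lines (just a sketch, not tested on a real machine; the cap of 100 and the chosen line numbers are arbitrary, and line 130 replaces the original test):

Code:
45 K = 0
105 K = K + 1
130 IF ABS(D)>=T AND K<100 THEN 50
135 DISP "ITERATIONS = ";K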

Enjoy!

Namir
10-06-2017, 09:55 AM
Post: #2
RE: Newton and Halley's methods with enhanced derivatives estimation
(09-30-2017 08:39 PM)Namir Wrote:  I present versions of Newton's method and Halley's method that use enhanced estimates of the first and second derivatives
(...)
You are welcome to add an iteration counter to view the number of iterations needed.

What? Still no answers after six days ?-) Then let me add a first comment.

First of all, thank you very much for your suggestion of enhanced methods with more accurate derivatives. I tried the Newton method in Excel VBA, and here is what I got:

Indeed the derivative is much more accurate than with the classic method based on two points at x and x+h. The approximation sequence matches the one based on the exact derivative very nicely. This way one iteration less is often required, and there may even be cases where two or more are saved.

But... the number of function calls is much higher than with the classic two-point approach.
Here is an example with your f(x) = exp(x) – 3 · x², based on a tolerance of 1 E–10.

Code:
               Newton method with x0 = -2 and
exact derivative  5-point derivative  2-point derivative
--------------------------------------------------------
-1,0223043336      -1,0223043336       -1,0223090585
-0,5948746572      -0,5948746572       -0,5948783091
-0,4711156755      -0,4711156755       -0,4711167598
-0,4590772551      -0,4590772551       -0,4590773196
-0,4589622780      -0,4589622780       -0,4589622784
-0,4589622675      -0,4589622675       -0,4589622675
-0,4589622675      -0,4589622675       -0,4589622675
--------------------------------------------------------
 14 fn calls        35 fn calls         14 fn calls

               Newton method with x0 = +2 and
exact derivative  5-point derivative  2-point derivative
--------------------------------------------------------
 1,0000000000       1,0000000000        0,9999969874
 0,9141552818       0,9141552818        0,9141554524
 0,9100176658       0,9100176658        0,9100176888
 0,9100075725       0,9100075725        0,9100075726
 0,9100075725       0,9100075725        0,9100075725
                                        0,9100075725
--------------------------------------------------------
 10 fn calls        25 fn calls         12 fn calls

The 2-point derivative was based on an IMHO more suitable definition of h = x/10000 (or 0,00001 for x=0).

You can see that the first two columns agree very nicely, i.e. the 5-point derivative essentially delivers the same results as the exact derivative. So yes, the number of iterations can be slightly reduced with a more accurate derivative, but (and this is a big BUT) the number of function calls and thus the required execution time will increase significantly! Since the function has to be called 2,5 times as often as with a classic 2-point derivative (and the calculation of the derivative itself also requires a bit more effort), the gain of one or two fewer iterations does not outweigh the much longer time each approximation step takes. In the above example the simple 2-point method is more than twice as fast.
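
For comparison, the classic 2-point loop mentioned above, with the h = x/10000 rule, could look roughly like this in the same BASIC style as the original listings (a sketch only, not run on a real machine):

Code:
50 H = X/10000 @ IF X=0 THEN H = .00001
60 F0 = FNX(X)
70 D1 = (FNX(X+H)-F0)/H
80 D = F0/D1
90 X = X-D
100 IF ABS(D)>=T THEN 50
110 DISP "ROOT = ";X

Each pass needs only the two calls FNX(X) and FNX(X+H).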

So... where's the benefit of a more exact derivative?

Dieter
10-06-2017, 10:52 AM
Post: #3
RE: Newton and Halley's methods with enhanced derivatives estimation
I did not claim that the enhanced derivative estimates were more efficient in terms of the number of function calls. In fact, if one is to use this approach at all, then Halley's method is preferable to Newton's method.
10-06-2017, 11:53 AM (This post was last modified: 10-06-2017 12:35 PM by Dieter.)
Post: #4
RE: Newton and Halley's methods with enhanced derivatives estimation
(10-06-2017 10:52 AM)Namir Wrote:  I did not claim that the enhanced derivative estimates were more efficient in terms of the number of function calls. In fact, if one is to use this approach at all, then Halley's method is preferable to Newton's method.

OK, but if the enhanced method is significantly less efficient than the classic 2-point method, why should it be used at all? Maybe I overlooked something here, which is why I'm asking about the benefit of this method. Or, more precisely: what was your motivation here? There must be a reason: people usually do things for a reason. ;-)

Re. the Halley method: I tried the usual 3-point method for the second derivative and compared it with the results of the enhanced method and those based on the exact first and second derivatives. The results are the same as for Newton: the enhanced method nicely matches the results based on exact derivatives, and here and there these two converge in one approximation step less than the standard 3-point method. But since the enhanced derivative requires 5 function calls per iteration vs. just 3 with the standard method, it would have to converge much, much faster (e.g. in only 6 steps where the standard method requires 10). But it doesn't: the number of saved iterations is marginal. So again I do not see any advantage here. But maybe I am missing the essential point, which is why I'm asking.
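
(For readers unfamiliar with it: the standard 3-point variant referred to here presumably reuses the same three samples f(x–h), f(x), f(x+h) for both derivatives,

$$f'(x) \approx \frac{f(x+h) - f(x-h)}{2h}, \qquad f''(x) \approx \frac{f(x+h) - 2f(x) + f(x-h)}{h^2},$$

hence the 3 function calls per iteration.)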

BTW, regarding the total number of function calls, the standard Newton and Halley methods are very similar (OK, unless you start close to a function max/min, where Halley performs better). Comparing the two versions with enhanced derivatives, the Halley method has a significant advantage, just as you said. But it is still less efficient than a standard Newton or Halley approach.

Here are some figures for f(x) = exp(x) – 3 x² and tolerance=1E–10, based on exact, enhanced and "standard" evaluation of the derivatives.

Code:
     Total number of function calls
-----------------------------------
     Newton method    Halley method
x0   exact enh std    exact enh std
-----------------------------------
-3     14   35  14      15   25  15
-1     12   30  12      12   20  12
 0     14   35  14      15   25  15
 1      8   20  10       9   15   9
 3     20   50  20      15   25  15
 5     14   35  14      15   25  15

The line for x0=3 shows that the Newton method first leads away from the root at x=3,733 while the Halley method directly approaches the solution.

Dieter
10-06-2017, 05:39 PM
Post: #5
RE: Newton and Halley's methods with enhanced derivatives estimation
I've found an example which causes your first program to loop forever on the HP-75C. But then my tolerance factor might not be adequate:


Code:

10 REM NEWTON'S METHOD WITH ENHANCED DERIVATIVE
15 REM NEXT IS THE DEFINITION OF F(X)=0
17 INPUT "PV? "; P @ INPUT "PMT? "; M @ N=2
20 DEF FNX(X) = 1/(1-X*P/M)^(1/N)-X-1
30 INPUT "GUESS? ";X
40 T=.00000000001
50 H = 0.001*(1+ABS(X))
60 F0 = FNX(X)
70 F1 = FNX(X+H) @ F2 = FNX(X+2*H)
80 F9 = FNX(X-H) @ F8 = FNX(X-2*H)
90 D1 = (-F2+8*F1-8*F9+F8)/12/H 
100 D = F0/D1
110 X = X-D
120 DISP X @ WAIT 1
130 IF ABS(D)>=T THEN 50
140 DISP "ROOT = ";X @ WAIT 2
150 E=(SQR(M+4*P)-3*SQR(M))/(SQR(M)-SQR(M+4*P))
160 DISP "EXACT: ";E
170 END

RUN

PV? 100
PMT? 55
GUESS? 0.1


The program loops around 0.06596460. The exact 16-digit result is 6.596460097781873E-2.

Gerson.
10-06-2017, 07:27 PM (This post was last modified: 10-06-2017 07:39 PM by Dieter.)
Post: #6
RE: Newton and Halley's methods with enhanced derivatives estimation
(10-06-2017 05:39 PM)Gerson W. Barbosa Wrote:  I've found an example which causes your first program to loop forever on the HP-75C.

This may always happen since, due to limited numeric accuracy, the approximation may oscillate around a certain value with the last digit(s) varying by a few units.

(10-06-2017 05:39 PM)Gerson W. Barbosa Wrote:  But then my tolerance factor might not be adequate:

Setting a suitable exit condition indeed is crucial. Since the Newton method has quadratic and the Halley method even cubic convergence, a desired relative error of 1E–12 does not mean that the last approximation has to agree in 12 significant digits with the previous one. If the last two values differ by a relative error of 1E–8, this leads to an error estimate of 1E–16 for Newton and 1E–24 for Halley's method. In theory, that is. ;-)
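
In other words, writing e_n for the error of the n-th approximation, the standard convergence relations (constants omitted) are

$$e_{n+1} \sim e_n^{2} \ \text{(Newton)}, \qquad e_{n+1} \sim e_n^{3} \ \text{(Halley)},$$

and since the last step |x_(n+1) – x_n| is roughly e_n, a step of relative size 1E–8 already suggests a final error near 1E–16 resp. 1E–24.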

That's why I prefer an implementation like the one used in some WP34s firmware routines that require an iterative approach (e.g. the Lambert W function): if an accuracy near 1E–30 is required, the iteration might exit at, say, 1E–20 (Newton) or even 1E–15 (Halley). If you like, you may add one final iteration, just to be sure.

An application similar to yours (finding the interest rate in a TVM problem for the HP65, cf. the HP65/67 Software Library) exits at 1E–7 on a 10-digit calculator. Due to the limited accuracy (the HP65 does not offer ln(1+x) or e^x–1) further digits will be rounded off anyway as soon as 1+i is calculated. So knowing the properties and limitations of the equation at hand can be essential.

In your example for a 12-digit machine I would try T=1E–8. Could you give it a try and see how well the returned solution agrees with the true result? Maybe after one additional final iteration?

(10-06-2017 05:39 PM)Gerson W. Barbosa Wrote:  The program loops around 0.06596460. The exact 16-digit result is 6.596460097781873E-2.

With Excel's 15-digit precision it's the enhanced Halley method which, if the set tolerance is too small, oscillates around

0,0659646009778178
0,0659646009778200
0,0659646009778178
0,0659646009778200
...

As you can see, even if x varies in its last three digits the value of f(x) is essentially the same, so more than 14 decimals or 13 significant digits do not make much sense. That's why for n-digit precision I would try a relative tolerance near 10^(–2n/3). Add one final iteration if it makes you feel better. But that's just the way I would do it – you should do your own tests. ;-)

Dieter
10-06-2017, 08:40 PM (This post was last modified: 10-06-2017 08:43 PM by Gerson W. Barbosa.)
Post: #7
RE: Newton and Halley's methods with enhanced derivatives estimation
(10-06-2017 07:27 PM)Dieter Wrote:  In your example for a 12-digit machine I would try T=1E–8. Could you give it a try and see how well the returned solution agrees with the true result? Maybe after one additional final iteration?

T = 1E-8

Initial guess: 0.1

0.0659646010039

After one additional iteration:

0.0659646009050

Exact 12-digit result:

0.0659646009778

Thank you very much for your valuable lessons!

Gerson.
10-06-2017, 09:50 PM (This post was last modified: 10-06-2017 09:53 PM by Dieter.)
Post: #8
RE: Newton and Halley's methods with enhanced derivatives estimation
(10-06-2017 08:40 PM)Gerson W. Barbosa Wrote:  T = 1E-8

In the original program T is the allowed absolute error, so my suggestion to set T=1E–8 was, er... "not yet perfect". #-)
It's the relative error that should be compared to this value. So the respective code line should rather read

130 IF ABS(D)>=ABS(T*X) THEN 50

Looking at the results you posted...

(10-06-2017 08:40 PM)Gerson W. Barbosa Wrote:  Initial guess: 0.1

0.0659646010039

After one additional iteration:

0.0659646009050

Exact 12-digit result:

0.0659646009778

...the iteration seems to converge more slowly than expected, so 1E–8 looks like a bit too much. On the other hand it may be just right here: I entered your FNX(X) formula into my 35s (same 12-digit precision), and indeed the function result rounds to zero even if x varies in the last 3 or 4 (!) digits. So anything around 0,065964601 is considered a solution, e.g. both 0,06596460094 and 0,06596460103 return f(x)=0. That's one of the reasons why T should not be chosen too small: even if the last two approximations agree in merely 8 significant digits, additional iterations may not yield a more accurate result since the last digits of x may vary without changing the function result.

Dieter
10-07-2017, 06:09 AM
Post: #9
RE: Newton and Halley's methods with enhanced derivatives estimation
(10-06-2017 09:50 PM)Dieter Wrote:  It's the relative error that should be compared to this value. So the respective code line should better read

130 IF ABS(D)>=ABS(T*X) THEN 50

BTW, this may cause an infinite loop if X=0: in this case the test is always true. So at least the ">=" should be replaced with ">". Maybe someone has a better idea how to handle this case.
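
One possible compromise (just a suggestion, not something from the original programs) is to blend an absolute and a relative tolerance, e.g.

130 IF ABS(D)>=T*(1+ABS(X)) THEN 50

so that the test behaves like a relative tolerance for large |X| and like an absolute tolerance of T near X=0.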

And FTR: with 0,1 for both initial guesses the 35s returns 0,0659646010297. If the final "–X–1" in the equation is changed to "–1–X" the result changes to 0,0659646009800. But as already mentioned there are multiple values around 0,065964601 for which the equation will evaluate to zero on a 12-digit calculator.

Dieter
10-07-2017, 01:53 PM (This post was last modified: 10-07-2017 02:06 PM by Gerson W. Barbosa.)
Post: #10
RE: Newton and Halley's methods with enhanced derivatives estimation
(10-06-2017 07:27 PM)Dieter Wrote:  An application similar to yours (finding the interest rate in a TVM problem for the HP65, cf. the HP65/67 Software Library) exits at 1E–7 on a 10-digit calculator. Due to the limited accuracy (the HP65 does not offer ln(1+x) or e^x–1) further digits will be rounded off anyway as soon as 1+i is calculated. So knowing the properties and limitations of the equation at hand can be essential.

Better results using those functions and your suggestions:

Code:
5 DESTROY ALL
10 F=0
17 INPUT "PV? ";P @ INPUT "PMT? ";M @ N=2
20 DEF FNX(X)=EXPM1(-(LOGP1(-X*P/M)/N))-X
30 INPUT "GUESS? ";X
40 T=.00000001
50 H=.001*(1+ABS(X))
60 F0=FNX(X)
70 F1=FNX(X+H) @ F2=FNX(X+2*H)
80 F9=FNX(X-H) @ F8=FNX(X-2*H)
90 D1=(-F2+8*F1-8*F9+F8)/12/H
100 D=F0/D1
110 X=X-D
120 DISP X
125 IF F=1 THEN 150
130 IF ABS(D)>=T THEN 50
140 DISP "ROOT = ";X
145 F=1 @ DISP "       "; @ GOTO 50
150 E=(SQR(M+4*P)-3*SQR(M))/(SQR(M)-SQR(M+4*P))
160 DISP "EXACT: ";E
170 END

>RUN
PV? 100
PMT? 55
GUESS? .1
7.57351446226E-2
6.72097906242E-2
6.59898111158E-2
6.59646116921E-2
6.59646009784E-2
6.59646009725E-2
ROOT = 6.59646009725E-2
       6.59646009765E-2
EXACT: 6.59646009789E-2     (78)


PS: On the HP-71B.
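
For the record, line 20 appears to be an algebraically equivalent rewriting of the equation from post #5: with y = xP/M,

$$\frac{1}{(1-y)^{1/N}} - 1 = \exp\!\left(-\frac{\ln(1-y)}{N}\right) - 1 = \operatorname{expm1}\!\left(-\frac{\operatorname{log1p}(-y)}{N}\right),$$

which is presumably where the accuracy gain comes from, since EXPM1 and LOGP1 avoid the cancellation of the direct form when their arguments are small.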
10-07-2017, 06:11 PM
Post: #11
RE: Newton and Halley's methods with enhanced derivatives estimation
Thank you Namir! I always love your presentations!
10-07-2017, 09:18 PM
Post: #12
RE: Newton and Halley's methods with enhanced derivatives estimation
(10-07-2017 06:11 PM)Eddie W. Shore Wrote:  Thank you Namir! I always love your presentations!

You are welcome Eddie!

Namir