Reducing the Number of Slope Evaluations for Newton's Method
11-11-2024, 04:48 PM (This post was last modified: 11-13-2024 03:39 PM by Namir.)
Post: #1
I introduce two HP Prime programs that find the roots of nonlinear functions using Newton’s method. What is different about these programs is that they reduce the number of function calls used in calculating the slope. The programs calculate the slope using the forward divided difference method:

Code:
f'(x) = (f(x+h) - f(x))/h                                                           (1)

Where h = 0.001*(1 + |x|). Thus, calculating the slope requires two calls to the nonlinear function f(x)=0. If you re-implement the code to use a separate function to calculate the slope, that function should also count as a function call. So, either way a regular Newton method iteration requires two function calls.
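As a sketch, one iteration of this scheme can be written in Python (a hypothetical translation, not the Prime code; `f` stands for any callable implementing f(x)):

```python
import math

def newton_step(f, x):
    """One Newton iteration using the forward divided difference (1)."""
    h = 0.001 * (1 + abs(x))       # step size scaled by |x|
    fx = f(x)                      # first function call
    slope = (f(x + h) - fx) / h    # second function call
    return x - fx / slope
```

Note that f(x) is evaluated once and reused in the slope, so each iteration costs exactly two calls.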

Function Root1

The first function, Root1, reduces the number of function calls by using a counter k, a maximum counter max, and the following steps:

• The first two iterations refine the guess for the root x by using equation (1) which requires two function calls per iteration. The values for k and max are initially set to zero and then to 1 after the first iteration.
• During the third iteration k has the value of 1 and max has the value of 2. Since max-k is 1, the root-seeking iteration skips the step to explicitly calculate a new slope and uses the previous value of the slope. This scheme saves one call to function f(x).
• During the fourth iteration k has the value of 2 and max has the value of 2. Since max-k is 0, the root-seeking iteration calculates an updated slope value by calling f(x) twice. The variable max is set to 3 and the variable k is set to 1.
• The next two iterations will skip updating the slope and use the current slope value.
• The seventh iteration will have max-k = 0 and evaluates a new slope value.
• The remaining iterations continue with incrementing max by 1 when k equals max, and resetting k to 1. Otherwise, the iterations simply increment k by 1 until k equals max.
• The iterations stop when the absolute value of the correction term (the amount by which the guess changes) is less than the root tolerance value.

Here is the HP Prime code with the test function f(x)=exp(x) – 3*x^2:

Code:
EXPORT fx(x)
BEGIN
  RETURN exp(x)-3*x*x;
END;

EXPORT Root1(x,toler)
BEGIN
  LOCAL h, f, fh, deriv, diff;
  LOCAL count, k, max;
  diff:=2*toler;
  count:=0;
  k:=0;
  max:=0;
  WHILE ABS(diff)>toler DO
      f := fx(x);
      count := count + 1;
      IF k == max THEN
        max := max + 1;
        k := 1;
        h := 0.001 * (1 + ABS(x));
        fh := fx(x + h);
        count := count + 1;
        deriv := (fh - f) / h;
      ELSE
        k := k + 1;
      END;
    
      diff := f / deriv;
      x := x - diff; 
  END;
 
  RETURN [x, count];

END;
The function Root1 has the parameters x and toler. The parameter x is the initial guess for the root. The parameter toler is the root tolerance value. The function returns x, the refined root value, and count, the number of function calls.

The function Root1 works better when the initial guess for the root is close to the actual root value. On the HP Prime you can easily graph the function and locate the target root to supply a close initial guess.

Here are two sample calls to function Root1:

Code:
Root1(3,0.00000001) -> [3.73307902876,20]
Root1(4,0.00000001) -> [3.73307902865,9]
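
For readers without a Prime at hand, here is a rough Python analogue of Root1's bookkeeping (a sketch, not the author's code; iteration and call counts may differ slightly from the Prime's 12-digit arithmetic):

```python
import math

def root1(f, x, toler):
    """Newton's method that recomputes the slope only when k catches up to max."""
    count, k, mx = 0, 0, 0
    deriv = None
    diff = 2 * toler
    while abs(diff) > toler:
        fx = f(x); count += 1
        if k == mx:                        # time to refresh the slope
            mx += 1; k = 1
            h = 0.001 * (1 + abs(x))
            deriv = (f(x + h) - fx) / h    # one extra function call
            count += 1
        else:                              # reuse the previous slope
            k += 1
        diff = fx / deriv
        x -= diff
    return x, count

root, calls = root1(lambda t: math.exp(t) - 3*t*t, 4.0, 1e-8)
```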

Function Root2
The second function, Root2, uses a different approach. It keeps updating the slope on every iteration until the absolute relative difference between two successive slope values falls within a slope tolerance. Once this condition is reached, the remaining iterations reuse the last calculated slope until they obtain the sought refined guess for the root.

Here is the HP Prime code with the test function f(x)=exp(x) – 3*x^2:

Code:
EXPORT fx2(x)
BEGIN
  RETURN exp(x)-3*x*x;
END;

EXPORT Root2(x,toler,fxtoler)
BEGIN
  LOCAL h, f, fh, deriv, diff;
  LOCAL count, lastDeriv, bStopDerivCalc;
  diff:=2*toler;
  count:=0;
  fxtoler:=fxtoler/100;
  bStopDerivCalc:=0;
  lastDeriv:=10^99;
  WHILE ABS(diff)>toler DO
      f := fx2(x);
      count := count + 1;
      IF bStopDerivCalc == 0 THEN
        h := 0.001 * (1 + ABS(x));
        fh := fx2(x + h);
        count := count + 1;
        deriv := (fh - f) / h;
        IF ABS((deriv-lastDeriv)/deriv) < fxtoler THEN
          bStopDerivCalc:=1;
        END;
        lastDeriv:=deriv;
      END;
    
      diff := f / deriv;
      x := x - diff; 
  END;
 
  RETURN [x, count];

END;

The function Root2 has parameters x, toler, and fxtoler. The parameter x is the initial guess for the root. The parameter toler is the root tolerance value. The parameter fxtoler is the percentage of tolerance for the absolute relative slope values. The function returns x, the refined root value, and count, the number of function calls.


Here are two sample calls to function Root2:

Code:
Root2(4,0.0000001,5) -> [3.73307902863,10]
Root2(3,0.0000001,5) -> [3.73307902866,18]
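
Again, a rough Python analogue of Root2's slope-freezing logic (a sketch under the same assumptions as the Root1 analogue above; not the author's code):

```python
import math

def root2(f, x, toler, fxtoler):
    """Newton's method that freezes the slope once successive slopes agree."""
    count = 0
    fxtoler /= 100.0                       # fxtoler is given as a percentage
    frozen = False
    last_deriv = float('inf')              # stands in for the Prime's 10^99
    deriv = None
    diff = 2 * toler
    while abs(diff) > toler:
        fx = f(x); count += 1
        if not frozen:
            h = 0.001 * (1 + abs(x))
            deriv = (f(x + h) - fx) / h
            count += 1
            if abs((deriv - last_deriv) / deriv) < fxtoler:
                frozen = True              # stop updating the slope
            last_deriv = deriv
        diff = fx / deriv
        x -= diff
    return x, count

root, calls = root2(lambda t: math.exp(t) - 3*t*t, 4.0, 1e-7, 5)
```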
11-12-2024, 12:43 AM
Post: #2
RE: Reducing the Number of Slope Evaluations for Newton's Method
For comparison, if Root1's re-use of the old slope is *disabled*, the result is not much different.
Function calls per iteration are higher, but it now takes fewer iterations.

Root1(3, 1e-8)      --> [3.73307902866, 20]
Root1(4, 1e-8)      --> [3.73307902863, 12]
11-12-2024, 01:04 AM (This post was last modified: 11-15-2024 12:55 AM by Albert Chan.)
Post: #3
RE: Reducing the Number of Slope Evaluations for Newton's Method
Another comparison, using the central difference formula for the slope (and for f(x) itself!).
It may also re-use the old slope, but only if the new x is within x ± h, a safer 'interpolated' slope.

(HP71B) Newton's method

1 DESTROY ALL @ C=0
2 DEF FNF(X)=EXP(X)-3*X^2
3 INPUT "Guess, Accuracy = ";X,A
10 FOR I=1 TO 30 @ DISP X
20 E=.0005*(1+ABS(X))
30 F1=FNF(X-E) @ F2=FNF(X+E)
40 R=2*E/(F2-F1) ! reciprocal slope
50 D=(F1+F2)/2*R ! secant root = X-D
60 C=C+2 @ IF ABS(D)<E THEN D=D+FNF(X-D)*R @ C=C+1
70 X=X-D @ IF ABS(D)<A THEN EXIT I
80 NEXT I
90 DISP X @ DISP "FNF calls =";C

>run
Guess, Accuracy = 3, 1e-8
 3
 6.31540065127
 5.4741132932
 4.75161204345
 4.20110692107
 3.86870519365
 3.747908432
 3.73327360951
 3.73307902676
 3.7330790286
FNF calls = 20

>run
Guess, Accuracy = 4, 1e-8
 4
 3.78435658259
 3.73537396758
 3.73307902732
 3.7330790286
FNF calls = 10

If the old-slope correction is not treated as a separate iteration (*):
For (3,1e-8): f calls per iteration = 20 / 9 ≈ 2.222
For (4,1e-8): f calls per iteration = 10 / 4 ≈ 2.500

More f calls per iteration, but fewer iterations to do.

(*) If the old-slope correction were treated separately, we may save some iterations.
https://www.hpmuseum.org/forum/thread-20...#pid195583
11-12-2024, 01:35 AM
Post: #4
RE: Reducing the Number of Slope Evaluations for Newton's Method
Using the central difference will require three function calls per iteration. My goal is to reduce the function calls.

Namir
11-12-2024, 02:13 AM
Post: #5
RE: Reducing the Number of Slope Evaluations for Newton's Method
(11-12-2024 01:35 AM)Namir Wrote:  Using the central difference will require three function calls per iteration. My goal is to reduce the function calls.

You can do it with 2 calls, for both f(x) and f'(x), see my HP71B code.

f1, f2 = f(x-h), f(x+h)

f(x) = (f2+f1)/2 + O(h^2)
f'(x) = (f2-f1)/(2h) + O(h^2)

dx = f(x)/f'(x) ≈ (f2+f1)/2 * (2h)/(f2-f1) = (f2+f1)/(f2-f1) * h
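
A minimal Python sketch of this two-call step (hypothetical names; the h choice follows the HP71B listing):

```python
import math

def central_step(f, x):
    """One Newton step: f(x) and f'(x) both from the same two calls."""
    h = 0.0005 * (1 + abs(x))
    f1, f2 = f(x - h), f(x + h)        # the only two function calls
    # f(x)  ~= (f2 + f1)/2,  f'(x) ~= (f2 - f1)/(2h), both with O(h^2) error
    dx = (f2 + f1) / (f2 - f1) * h     # f(x)/f'(x); the factors of 2 cancel
    return x - dx
```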
11-12-2024, 10:21 AM (This post was last modified: 11-12-2024 10:46 AM by C.Ret.)
Post: #6
RE: Reducing the Number of Slope Evaluations for Newton's Method
Hi,

Thanks to both of you for this interesting topic about Newton's Method to find the roots of a function.
Indeed, having to determine the derivative can require more evaluations of the function itself. But the idea is that a good determination of the slope will lead more directly to the root.

Please let me share below my version of Namir's code that I modified so that the definition of the function is in the F1 slot of the Function Application.
In doing so, I said to myself that on an HP Prime, once the function is entered in the F1 slot, it is very easy to symbolically determine the derivative and store it in the F2 slot to facilitate the progress of the algorithm. We then save a good chunk of code and variables. However, my counter counts one for each pair of evaluations of F1(x) and F2(x); in a way, it runs half as fast as Namir's.


Code:
EXPORT Newton(x,tol)
BEGIN
  LOCAL dx:=2*tol, cnt:=0;
  F2 := REPLACE(STRING(CAS("diff(F1(z),z)")),"z","X");
  WHILE ABS(dx)>tol DO
      dx := F1(x) / F2(x);
      cnt := 1+cnt;
      IF cnt>99 THEN BREAK(2); END;
      x := x - dx;
  END;
  RETURN [ x, cnt ];
END;


I don't know if all this makes sense; on the HP Prime, other root-seeking methods exist.
One advantage of my code is that it is possible to add graphic objects in order to give a pedagogical illustration of the method.

Actually, the main advantage is that it is easy to change from one function to another: just edit the F1 slot.

[Image: attachment.php?aid=14286] [Image: attachment.php?aid=14287]

Best regards.
C.Ret


11-12-2024, 12:27 PM (This post was last modified: 11-13-2024 04:54 AM by Namir.)
Post: #7
RE: Reducing the Number of Slope Evaluations for Newton's Method
C.Ret, nice HP Prime code using CAS. Your statement:

cnt := 1 + cnt;

Should be:

cnt := 2 + cnt;

because it counts calls to the function AND its derivative.

Your implementation is for a typical Newton's method.

The reduction in function calls makes more sense when the function has a high computational cost, such as a summation series or product with a high number of terms/factors.

Consider the function:

Code:
EXPORT fx(x)
BEGIN
  LOCAL y, i, n, y0;
  n := 10^7;
  y0 := 9;
  y := 0;
  FOR i FROM 1 TO n DO
   y := y + 1/i/x;
  END;
  RETURN y0 - y;
END;

And solve for x to find the value of x that makes the 10^7-term summation equal to 8. The answer is x = 1.855, and each function call does take a while to run its ten million loop iterations! This is an example where reducing the number of function calls pays off.
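
To see the behavior at a smaller scale, here is a hypothetical Python analogue with n = 10^5 instead of 10^7 (a sketch using the same forward-difference Newton step, without the call-saving bookkeeping):

```python
import math

def make_f(n, y0=9.0):
    """Costly test function: f(x) = y0 - sum_{i=1..n} 1/(i*x)."""
    def f(x):
        return y0 - sum(1.0 / (i * x) for i in range(1, n + 1))
    return f

n = 10**5
f = make_f(n)
x = 2.0
for _ in range(100):
    h = 0.001 * (1 + abs(x))
    fx = f(x)
    dx = fx * h / (f(x + h) - fx)      # forward-difference Newton step
    x -= dx
    if abs(dx) < 1e-10:
        break
# since sum 1/(i*x) = H(n)/x, the exact root is H(n)/y0
```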

Namir
11-13-2024, 05:06 PM
Post: #8
RE: Reducing the Number of Slope Evaluations for Newton's Method
(11-12-2024 12:27 PM)Namir Wrote:  And solve for, x to find the value of x that makes the 10^7 summations equal to 8 (typo, should be 9, from code). The answer is x = 1.855 and it does take a while for each function call to perform ten million loop iterations!
This is an example where reducing the number of function calls pays off.

1st, we should make the curve as straight as possible: the 1/x shape should be flipped.

Σ(1/(i*x), i=1..n) = y0      → 1/Σ(1/(i*x), i=1..n) = 1/y0

This patch would speed up convergence greatly.

< RETURN (y0 - y);
> RETURN (y0 - y) / (y0*y);

2nd, x can be pulled out; the remaining sum is just a harmonic number.
We should not have f(x) keep recalculating the same constant.

x / H(n) = 1/y0
x = H(n) / y0

3rd, there is no need to sum 10 million terms; we can estimate H(n) for huge n:

(08-28-2020 09:26 PM)Albert Chan Wrote:  \(\qquad\qquad\exp( \psi(x+1/2)) = x
+\frac{1}{24 \cdot x
+\Large\frac{1}{\frac{10}{37} \cdot x
+\frac{1}{\frac{689976}{74381} \cdot x \;
+\; \cdots}}} \)

H(n) = Ψ(n+1) + gamma = Ψ((x=n+0.5) + 0.5) + gamma

lua> n = 1e7
lua> x = n + 0.5
lua> gamma = 0.5772156649015329
lua> Hn = log(x + 1/(24*x + 3.7/x)) + gamma
lua> Hn
16.69531136585985
lua> Hn / 9
1.8550345962066501
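
The truncated continued fraction can be sanity-checked in Python against a direct sum (a sketch; 3.7/x is 1/((10/37)·x), folding in the next level of the fraction, as in the lua code):

```python
import math

GAMMA = 0.5772156649015329   # Euler-Mascheroni constant

def harmonic_est(n):
    """H(n) ~= psi(n+1) + gamma, using exp(psi(x+1/2)) ~= x + 1/(24x + 3.7/x)."""
    x = n + 0.5
    return math.log(x + 1.0 / (24.0 * x + 3.7 / x)) + GAMMA

exact = sum(1.0 / i for i in range(1, 1001))   # H(1000) by direct summation
```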