Lagrangian Interpolation
03-08-2015, 07:58 PM
(This post was last modified: 12-27-2023 12:56 AM by Thomas Klemm.)
Post: #6
RE: Lagrangian Interpolation
This program uses the method mentioned in my previous post.
Memory Map

\( \begin{matrix} I & : & x \\ R_0 & : & x_1 \\ R_1 & : & x_2 \\ R_2 & : & x_3 \\ R_3 & : & y_1 \\ R_4 & : & y_2 \\ R_5 & : & y_3 \\ R_6 & : & w_1 \\ R_7 & : & w_2 \\ R_8 & : & w_3 \\ R_{S4} & : & \Sigma x \\ R_{S8} & : & \Sigma xy \\ \end{matrix} \)

Code:

001 31 25 11 : LBL A

Explanation

Lines 001-016 are the same as in Namir's original program.

Lines 017-036 calculate the weights \(w_j = \frac{1}{\prod_{i=1, i \neq j}^{3}(x_j-x_i)}\). However, this formula is slightly rearranged so that we only need the three differences \(x_1-x_2\), \(x_2-x_3\) and \(x_3-x_1\):

\(w_1=\frac{-1}{(x_1-x_2)(x_3-x_1)}\)

\(w_2=\frac{-1}{(x_2-x_3)(x_1-x_2)}\)

\(w_3=\frac{-1}{(x_3-x_1)(x_2-x_3)}\)

As you can see, each difference is reused in two of the weights. Nothing fancy, it's just to get the sign correct. That's why we start with \(-1\) in lines 017-021.

Lines 043-063 recall \(y_j\), calculate \(\frac{w_j}{x-x_j}\), and then use \(\Sigma+\) to accumulate both \(\sum_{j=1}^{3} \frac{w_j}{x-x_j}\) and \(\sum_{j=1}^{3} \frac{w_j}{x-x_j}y_j\) in one single step.

Lines 065-067 finally recall these sums and calculate \(\frac{\Sigma xy}{\Sigma x}\), the interpolated value.

Cheers
Thomas

For those with an HP-15C here's the corresponding program:

Code:

001 - 42,21,11 LBL A

Added a card of the program that can be used with Jacques Laporte's HP-67 Microcode Simulator.
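For readers without the calculator at hand, here is a sketch of the same barycentric scheme in Python. The function and variable names (`lagrange3`, `d12`, `num`, `den`) are my own; the structure follows the post: three reused differences, weights started from \(-1\) for the sign, and a single accumulation pass producing both \(\Sigma x\) and \(\Sigma xy\).

```python
def lagrange3(x, xs, ys):
    """Evaluate the quadratic Lagrange interpolant at x using the
    barycentric form described above (three data points)."""
    x1, x2, x3 = xs
    y1, y2, y3 = ys

    # Only three differences are needed; each one appears in two weights.
    d12 = x1 - x2
    d23 = x2 - x3
    d31 = x3 - x1

    # Start from -1 so the signs come out right (cf. lines 017-021).
    w1 = -1.0 / (d12 * d31)
    w2 = -1.0 / (d23 * d12)
    w3 = -1.0 / (d31 * d23)

    num = 0.0  # plays the role of Sigma xy
    den = 0.0  # plays the role of Sigma x
    for wj, xj, yj in ((w1, x1, y1), (w2, x2, y2), (w3, x3, y3)):
        t = wj / (x - xj)  # one term w_j / (x - x_j)
        den += t           # both sums accumulated in one pass,
        num += t * yj      # like the Sigma+ step on the HP-67
    return num / den
```

For example, interpolating the points \((0,0)\), \((1,1)\), \((2,4)\) of \(y=x^2\) with `lagrange3(3.0, (0.0, 1.0, 2.0), (0.0, 1.0, 4.0))` reproduces \(3^2 = 9\).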