Peeking at different interpolation algorithms

10-21-2017, 03:00 PM
(This post was last modified: 10-21-2017 08:38 PM by Namir.)
Post: #1




Peeking at different interpolation algorithms
I have been looking at various difference and divided-difference interpolation algorithms. Beyond the well-known Newton forward-difference, backward-difference, forward-divided-difference, and backward-divided-difference methods, little else in the way of difference methods is covered in most numerical analysis books. The difference methods rely on equally spaced x values, which lets them simplify the equations and calculation schemes. The divided-difference methods handle non-equidistant values of x.

In either case, you build a table of derivative estimates. The table entries look like a triangle lying on its side, with a rectangular base comprising the (x, y) values. The triangle is made up of columns: the first column approximates the first derivatives, the second column approximates the second derivatives, and so on, with each column one row shorter than the column to its left. These derivative estimates use differences (in the case of equidistant x values) or divided differences (in the case of non-equidistant x values). The Newton forward difference and divided-difference schemes use the top table values of the derivatives in the interpolation polynomial. Conversely, the Newton backward difference and divided-difference schemes use the bottom table values.

I stumbled on a numerical analysis book sitting on my bookshelf (Elements of Numerical Analysis by Radhey S. Gupta) that covers several more difference methods for interpolation: the Gauss forward, Gauss backward, Stirling, Bessel, Everett, and Steffensen methods. Without going into details (you can read about the Gauss and Stirling methods on the Internet), all of these methods tap into the various derivative estimates along and around the central axis of the difference table (remember, the table is a triangle lying on its side).
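The triangular table described above can be sketched in a few lines of Python. This is a minimal sketch, not code from the post; the function name divided_difference_table and the x^2 sample data are mine:

```python
# Build the triangular divided-difference table described above.
# Column k holds the k-th order divided differences; each column is
# one entry shorter than the one before, giving the sideways triangle.
def divided_difference_table(xs, ys):
    table = [list(ys)]          # column 0: the y values themselves
    for k in range(1, len(xs)):
        prev = table[-1]
        col = [(prev[i + 1] - prev[i]) / (xs[i + k] - xs[i])
               for i in range(len(prev) - 1)]
        table.append(col)
    return table

# Equidistant sample of f(x) = x**2; the second divided differences
# are constant, as expected for a quadratic.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [x * x for x in xs]
table = divided_difference_table(xs, ys)
print(table[1])   # first divided differences: [1.0, 3.0, 5.0]
print(table[2])   # second divided differences: [1.0, 1.0]
```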
Each of these methods suggests a particular scheme for picking the derivative values out of the difference table. Using Excel, I was able to build a Gauss forward divided-difference version (by studying its simpler difference counterpart) to interpolate values for data with non-equidistant x values. I tested data for both equidistant and non-equidistant values of x. The results are interesting:

1. The errors are smaller when we interpolate for x near the mid-table values. The errors increase the more the interpolated x deviates from the mid-table values.
2. The above is true for both equidistant and non-equidistant values of x. In the equidistant case, the error increases as the spacing between neighboring x's increases. In the non-equidistant case, the error increases with the average spacing of the x values.

So should we ditch Newton's difference and divided-difference methods in favor of these newfound little darlings of interpolation algorithms? The answer is no, not necessarily. Why? These algorithms use mid-table values and may not do well if you are interpolating at x values that are not close to the mid-table x values. This is their Achilles' heel!

Consider a large table of (x, y) values at non-equidistant values of x. If you want to interpolate for xi, look up in the table and locate the first two values of x that make up an interval containing xi. Then select a few more x values (along with their associated y values) farther down the table, build the Newton forward divided-difference table, and finally perform the interpolation. You should get good results, depending on the number of points you selected and the information represented by the (x, y) values (i.e. the true function that is sampled by the table's data). If xi is near or at the end of the table, use the backward divided differences instead: select two values of x that make up an interval containing xi, select a few more (x, y) values farther up the table, build the backward divided-difference table, and perform the interpolation.

So, armed with the Newton forward and backward difference and divided-difference methods, as well as the Lagrange method (my favorite!), you are good for most of your interpolation needs. There are other algorithms, like Neville and Thiele interpolation, that you can explore and use.
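The lookup-then-interpolate procedure described above can be sketched as follows. This is my own illustrative sketch (the names newton_interp and n_points are not from the post), selecting points starting at the bracketing interval and computing the Newton divided-difference coefficients in place:

```python
import bisect

# Newton forward divided-difference interpolation: locate the interval
# containing xi, take a few points from there down the table, build the
# divided-difference coefficients, and evaluate Horner-style.
def newton_interp(xs, ys, xi, n_points=4):
    # index of the left endpoint of the bracketing interval, clamped
    # so that n_points values are available
    i = max(0, min(bisect.bisect_right(xs, xi) - 1, len(xs) - n_points))
    x, y = xs[i:i + n_points], ys[i:i + n_points]
    coef = list(y)                      # becomes f[x0..xk] in place
    for k in range(1, n_points):
        for j in range(n_points - 1, k - 1, -1):
            coef[j] = (coef[j] - coef[j - 1]) / (x[j] - x[j - k])
    result = coef[-1]                   # nested (Horner) evaluation
    for k in range(n_points - 2, -1, -1):
        result = result * (xi - x[k]) + coef[k]
    return result

# Non-equidistant sample of a cubic; 4 points reproduce it exactly
# (up to rounding), so y(0.8) should be close to 0.8**3 - 1.6 + 1.
xs = [0.0, 0.5, 1.2, 2.0, 3.1]
ys = [x**3 - 2*x + 1 for x in xs]
print(newton_interp(xs, ys, 0.8))
```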

08-02-2019, 09:27 PM
(This post was last modified: 08-04-2019 01:43 PM by Albert Chan.)
Post: #2




RE: Peeking at different interpolation algorithms
With divided difference, there is no requirement that the table be sorted.
Just do Forward Divided Difference, with the entries closest to the interested region up on top.

Example, from https://jp.mathworks.com/matlabcentral/f...onformula (note: the link had the entries entered wrong, below are the corrected points)

Points: (-2, 18.4708), (-1, 17.8144), (0, 17.107), (1, 16.3432), (2, 15.5154), find y(0.25):

Assume calculations with 6 sig. digits, and we want the best interpolated values in the middle:

Code:
 p   y(p)       Divided Difference Table
 0   17.107
 1   16.3432   -0.7638
-1   17.8144   -0.7356    -0.0282
 2   15.5154   -0.766333  -0.030733  -0.0012665
-2   18.4708   -0.73885   -0.027483  -0.00108333  -0.000091585

y(p) = 17.107 + (p-0)*(-0.7638 + (p-1)*(-0.0282 + (p+1)*(-0.0012665 + (p-2)*(-0.000091585))))

→ y(0.25) ≈ 16.9216

The above formula is the same (except for slight rounding errors) as the Gauss Forward Formula:

Code:
 p   y(p)       Difference Table
-2   18.4708
-1   17.8144   -0.6564
 0   17.107    -0.7074   -0.051
 1   16.3432   -0.7638   -0.0564   -0.0054
 2   15.5154   -0.8278   -0.064    -0.0076   -0.0022

y(p) = \(17.107 + \binom{p}{1}(-0.7638) + \binom{p}{2}(-0.0564) + \binom{p+1}{3}(-0.0076) + \binom{p+1}{4}(-0.0022) \)
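The unsorted Forward Divided Difference above can be checked numerically. A minimal sketch (carrying full precision rather than the post's 6 significant digits, which is why the last coefficients differ slightly):

```python
# Forward Divided Difference with the entries ordered 0, 1, -1, 2, -2
# (closest to the region of interest on top), as in the post.
xs = [0.0, 1.0, -1.0, 2.0, -2.0]
ys = [17.107, 16.3432, 17.8144, 15.5154, 18.4708]

# In-place Newton coefficients: coef[k] becomes f[x0..xk],
# the diagonal of the divided-difference table.
coef = list(ys)
for k in range(1, len(xs)):
    for j in range(len(xs) - 1, k - 1, -1):
        coef[j] = (coef[j] - coef[j - 1]) / (xs[j] - xs[j - k])

# Nested evaluation of the interpolating polynomial at p = 0.25.
p = 0.25
y = coef[-1]
for k in range(len(xs) - 2, -1, -1):
    y = y * (p - xs[k]) + coef[k]
print(round(y, 4))   # → 16.9216, matching the post
```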

08-04-2019, 01:28 PM
(This post was last modified: 08-04-2019 01:44 PM by Albert Chan.)
Post: #3




RE: Peeking at different interpolation algorithms
Perhaps an easier way to produce Forward Divided Difference polynomial:
New Formulas and Methods for Interpolation, Numerical Differentiation and Numerical Integration

Redoing post #2 with the New Divided Difference Table, again assuming 6 sig. digit calculations:

Code:
 p   y(p)       New Divided Difference Table
 0   17.107
 1   16.3432   -0.7638
-1   17.8144   -0.7074   -0.0282
 2   15.5154   -0.7958   -0.032    -0.00126667
-2   18.4708   -0.6819   -0.0273   -0.0009       -0.0000916675

Top entries are "locked", and used to compute the slopes (divided differences):

1st column = [ (y-17.107) / (x-0) for (x,y) in [[1,16.3432],[-1,17.8144],[2,15.5154],[-2,18.4708]] ]
2nd column = [ (y+0.7638) / (x-1) for (x,y) in [[-1,-0.7074],[2,-0.7958],[-2,-0.6819]] ]
...

y(p) = 17.107 + (p-0)*(-0.7638 + (p-1)*(-0.0282 + (p+1)*(-0.00126667 + (p-2)*(-0.0000916675))))

This interpolation scheme was used in Forman Acton's Numerical Methods that Work. The book was originally published in 1970, so perhaps the above is not that "New".

https://www.hpmuseum.org/forum/thread11...#pid102623
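The "locked top entry" scheme above mechanizes neatly: each pass locks the top of the current column, and differences every remaining row against it. A minimal sketch of that loop (my own code, not from the post):

```python
# Column-by-column scheme: the top entry of each column is "locked"
# and becomes a Newton coefficient; the rest of the column is formed
# by differencing every remaining row against that locked entry.
xs = [0.0, 1.0, -1.0, 2.0, -2.0]
ys = [17.107, 16.3432, 17.8144, 15.5154, 18.4708]

pairs = list(zip(xs, ys))
coefs = []
for _ in range(len(pairs)):
    x0, v0 = pairs[0]            # locked top entry of this column
    coefs.append(v0)
    pairs = [(x, (v - v0) / (x - x0)) for x, v in pairs[1:]]

# Coefficients are the locked entries: 17.107, -0.7638, -0.0282, ...
print([round(c, 7) for c in coefs])
```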

08-10-2019, 08:49 PM
Post: #4




RE: Peeking at different interpolation algorithms
(08-02-2019 09:27 PM)Albert Chan Wrote: With divided difference, there is no requirement that the table be sorted.

We can build the interpolation formula on the fly, for any path, without looking it up. Just think of the sequence of interpolation X values, then assume the list was originally sorted that way, with the coefficients read off the diagonal.

For the above Gauss Forward Formula, X's = 0, 1, -1, 2, -2:

y(p) = 17.107 + (p-0)/1*(-0.7638 + (p-1)/2*(-0.0564 + (p+1)/3*(-0.0076 + (p-2)/4*(-0.0022))))

Had we wanted the Gauss Backward Formula, X's = 0, -1, 1, -2, 2:

y(p) = 17.107 + (p-0)/1*(-0.7074 + (p+1)/2*(-0.0564 + (p-1)/3*(-0.0054 + (p+2)/4*(-0.0022))))
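The "any path" idea can be verified directly: since the interpolating polynomial through a set of points is unique, any ordering of the X values must evaluate to the same result. A sketch (the helper name newton_eval is mine) comparing the Gauss forward and backward orderings:

```python
# Pick an ordering of the x's, compute the Newton divided-difference
# coefficients for that ordering, and evaluate at p. The forward
# ordering (0, 1, -1, 2, -2) and the backward one (0, -1, 1, -2, 2)
# must agree, since the interpolating polynomial is unique.
data = {-2: 18.4708, -1: 17.8144, 0: 17.107, 1: 16.3432, 2: 15.5154}

def newton_eval(order, p):
    xs = [float(x) for x in order]
    coef = [data[x] for x in order]
    for k in range(1, len(xs)):
        for j in range(len(xs) - 1, k - 1, -1):
            coef[j] = (coef[j] - coef[j - 1]) / (xs[j] - xs[j - k])
    y = coef[-1]
    for k in range(len(xs) - 2, -1, -1):
        y = y * (p - xs[k]) + coef[k]
    return y

forward  = newton_eval([0, 1, -1, 2, -2], 0.25)
backward = newton_eval([0, -1, 1, -2, 2], 0.25)
print(round(forward, 4), round(backward, 4))   # both ≈ 16.9216
```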

