On Convergence Rates of Root-Seeking Methods
ttw
03-11-2018, 09:50 PM
Post: #34
RE: On Convergence Rates of Root-Seeking Methods
I'd go with Brent's method (or any variant that looks good to you). The problem with high-order methods is that their radius of convergence may be too small to be useful. Brent's method alternates inverse quadratic interpolation, the secant method, and bisection, guaranteeing convergence with an order of roughly 1.84 when the interpolation steps succeed. There are probably ways to incorporate Newton's method if suitable regions can be found, which kicks the convergence rate up a little.

A problem with derivatives is that they get rather complicated in several dimensions: the slope becomes a gradient (vector-valued) and the second derivative becomes the Hessian matrix. One can look at something like the Levenberg-Marquardt method, which can be useful but requires user-set parameters. I think Brent's method needs little but the function and two points where the function values have opposite signs.
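
As a concrete illustration of that last point (a minimal sketch, not from the original post; the function f and the bracket [1, 2] are made-up examples), SciPy's brentq implements Brent's method and needs only the function plus two points where the function values have opposite signs:

Code:
# Minimal sketch (assumed example): Brent's method via SciPy needs only
# the function and a bracket [a, b] with f(a) and f(b) of opposite signs.
from scipy.optimize import brentq

def f(x):
    return x**3 - 2.0          # example function; root at 2**(1/3)

root = brentq(f, 1.0, 2.0)     # f(1) = -1 < 0 and f(2) = 6 > 0, so [1, 2] brackets the root
print(root)                    # approximately 1.259921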