How Accurate is Accurate Enough?
04-11-2021, 03:12 PM
Post: #5
RE: How Accurate is Accurate Enough?
From a numerical analyst's POV, accuracy depends on the problem; there's no single mathematical definition. However, one can usually analyze how much precision a particular problem needs. There are stable algorithms that can get by with single precision (about 7 digits), but some unstable algorithms can never get close to the answer.

Take the quadratic formula for Ax^2+Bx+C=0 as an example: using x1=(-B+Sqrt(B^2-4AC))/(2A) and x2=(-B-Sqrt(B^2-4AC))/(2A) directly fails badly. If B is positive, the formula for x1 can be in error (and if B is negative, x2 will be wrong). The point is that if A and C are small (or their product is) compared to B, one essentially tries to compute B-B with poor values for B. Instead, one uses the x1 formula when B is negative and the x2 formula when B is positive, then computes the other root as x1=C/(A*x2) or x2=C/(A*x1), since the product of the roots is C/A. If A=1, then -B is the sum of the roots and C is their product. This comes up in solubility product computations, where one gets things like x^2-1.0000001x+0.0000001=0, whose roots are 1 and 10^-7.
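
Here's a rough sketch of that in Python (my own illustration, not from any particular machine; numpy's float32 stands in for single precision and the variable names are just for the example):

Code:
# numpy's float32 emulates "single precision (about 7 digits)".
import numpy as np

a = np.float32(1.0)
b = np.float32(-1.0000001)     # x^2 - 1.0000001x + 0.0000001 = 0; exact roots 1 and 1e-7
c = np.float32(1e-7)

sq = np.sqrt(b*b - 4*a*c)

# Naive: both roots straight from the formula.
x1_naive = (-b + sq) / (2*a)   # fine here, since B is negative (-b and sq add)
x2_naive = (-b - sq) / (2*a)   # -b and sq nearly cancel: about 1.19e-7, roughly 19% too large

# Stable: take only the root that avoids cancellation, get the other from C/A = x1*x2.
x1 = (-b + sq) / (2*a) if b < 0 else (-b - sq) / (2*a)
x2 = c / (a * x1)              # about 1e-7, good to single precision

print(x1_naive, x2_naive)
print(x1, x2)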

Rounding is tricky. With any even base other than 2 (like 10 or 16), computing the average of two numbers as (A+B)/2 can fail even when both numbers are exactly representable. Example: 4-digit arithmetic (base 10), A=6.007, B=6.008; (A+B)/2 fails with any rounding method. More precision helps, but the error still arises.
6.007+6.008=12.015, which rounds to either 12.01 or 12.02, and dividing by 2 then gives 6.005 or 6.010. Either result is wrong, since the average of two numbers can be no larger than the larger and no smaller than the smaller.
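
A quick way to see it is with Python's decimal module set to 4 digits (again my own sketch of the example above):

Code:
from decimal import Decimal, getcontext, ROUND_HALF_UP, ROUND_HALF_DOWN

getcontext().prec = 4                        # 4 significant decimal digits

a, b = Decimal("6.007"), Decimal("6.008")    # both exactly representable

for mode in (ROUND_HALF_UP, ROUND_HALF_DOWN):
    getcontext().rounding = mode
    s = a + b                  # exact sum 12.015 must round, to 12.02 or 12.01
    print(mode, s / 2)         # gives 6.01 or 6.005; both lie outside [6.007, 6.008]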