Is super-accuracy matters?
10-20-2023, 12:28 AM
(This post was last modified: 10-20-2023 12:29 AM by Peter Klein.)
Post: #24
RE: Is super-accuracy matters?
I’m curious: How many digits are enough, in various real-world fields?
I notice that if I convert the distance halfway around the earth from km to miles, the difference between using 1.61 and 1.609344 as the conversion factor is a whopping 5.25 km. Should I care?

How many digits did we need to go to the moon? To design a car engine, or the robotics in the car factory? To design a smartphone? The lenses in my glasses? The tomography machine that can measure the curvature of my retina? To navigate a ship or airplane? To measure spectroscopically the light from a distant star or planet and determine useful things about its composition? To determine that yes, you really did detect a Higgs boson or a gravitational wave?

True story: My father was a theoretical physicist (classical physics, not nuclear). He used a slide rule from the 1930s through the early 70s. Then he got an early TI calculator that was, IIRC, accurate to only 4 or 5 decimal places for certain trig and log calculations. He was quite happy with it for years. If he needed more accuracy, he punched Fortran programs onto cards and sent them to the big computer at work.

My mother, OTOH, could not understand why, when she divided something by three on a four-function calculator, she got results that ended in .3333333 or .6666667. “That’s not good enough! I want the exact answer!” she would say.
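For scale, here is a minimal Python sketch of that conversion error. The assumed distance (half of Earth's equatorial circumference, about 20,037 km) is not from the post, so the exact difference depends on what "halfway around the earth" is taken to be.

Code:
# How much does rounding the km-per-mile factor matter over half the Earth?
# Assumption (not from the post): the distance is half the equatorial
# circumference, roughly 40,075 km / 2 = 20,037.5 km.

KM_PER_MILE_EXACT = 1.609344    # international mile, exact by definition
KM_PER_MILE_ROUNDED = 1.61      # the two-decimal shortcut

half_earth_km = 40075.0 / 2     # assumed distance in km

miles_exact = half_earth_km / KM_PER_MILE_EXACT
miles_rounded = half_earth_km / KM_PER_MILE_ROUNDED

diff_miles = miles_exact - miles_rounded
rel_error = abs(KM_PER_MILE_ROUNDED - KM_PER_MILE_EXACT) / KM_PER_MILE_EXACT

print(f"with 1.609344: {miles_exact:10.2f} miles")
print(f"with 1.61:     {miles_rounded:10.2f} miles")
print(f"difference:    {diff_miles:10.2f} miles "
      f"({diff_miles * KM_PER_MILE_EXACT:.2f} km)")
print(f"relative error of the rounded factor: {rel_error:.3%}")

Under that assumption the two-decimal factor comes out off by roughly 5 miles (about 8 km) over half the globe, which is simply the factor's own relative error of about 0.04% scaled up by the distance.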