Rounding Error Worse in Binary or Decimal Representation?
01-28-2015, 01:51 PM
Post: #1
This thread

http://www.hpmuseum.org/forum/thread-2583.html

raises the question of whether rounding errors are reduced by using a binary rather than a decimal representation, or vice versa.

My view is that as the number of distinct primes in the factorization of the number base increases, rounding errors will be reduced or avoided.

Accordingly, base 2*5 = 10 should perform better than base 2.
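
A quick illustration of that point (a Python sketch of my own, not anything from the calculators in the linked thread): 1/10 has no finite binary expansion, so a binary float must round it, whereas a decimal type holds it exactly.

Code:
from decimal import Decimal
from fractions import Fraction

# 1/10 has no finite binary expansion, so the nearest binary double is stored:
print(Fraction(0.1))             # 3602879701896397/36028797018963968, not 1/10
# A decimal representation stores 0.1 exactly:
print(Fraction(Decimal("0.1")))  # 1/10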

Both base 2 and base 10 would be outperformed by base 2*3*5 = 30, and so on.
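
The underlying fact, if I have it right, is that 1/n has a terminating expansion in base b exactly when every prime factor of n divides b. Here is a small Python sketch (the function name terminates is mine) that counts which denominators up to 30 terminate in bases 2, 10 and 30.

Code:
from math import gcd

def terminates(n, base):
    """True if 1/n has a finite expansion in the given base,
    i.e. every prime factor of n also divides the base."""
    while (g := gcd(n, base)) > 1:
        n //= g      # strip factors shared with the base
    return n == 1

for base in (2, 10, 30):
    exact = [n for n in range(2, 31) if terminates(n, base)]
    print(base, len(exact), exact)

Base 2 gives only the powers of two, base 10 adds the denominators built from 2 and 5, and base 30 adds those built from 2, 3 and 5 as well. More primes in the base means more denominators represented exactly, which is the sense in which base 30 covers the other two.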

Or am I way off course?


Messages In This Thread
Rounding Error Worse in Binary or Decimal Representation? - Gerald H - 01-28-2015 01:51 PM


