A strange precision behavior?
- To: mathgroup at smc.vnet.net
- Subject: [mg90360] A strange precision behavior?
- From: Aaron Fude <aaronfude at gmail.com>
- Date: Mon, 7 Jul 2008 05:07:02 -0400 (EDT)
The following two commands result in different values:

    SetPrecision[0.01, 50]
    0.01`50

Is that because in the first case, 0.01 is first represented as a machine double and then passed to SetPrecision, while in the second case, Mathematica gets a chance to interpret 0.01 as 1/100?

Thanks!

Aaron
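For what it's worth, a minimal sketch of the comparison (the comments reflect Mathematica's documented behavior: SetPrecision extends a machine number by padding its binary representation with zero bits, while a `` ` `` precision mark makes the parser read the decimal literal directly at the requested precision):

    (* Starts from the machine double nearest to 1/100 and pads its
       binary digits with zeros, so the result is not exactly 1/100. *)
    SetPrecision[0.01, 50]

    (* The literal 0.01 is parsed directly as a 50-digit decimal,
       so it does represent 1/100 to 50 digits. *)
    0.01`50

    (* One way to see the discrepancy explicitly: *)
    SetPrecision[0.01, 50] - 0.01`50

Subtracting the two expressions should give a small nonzero residual on the order of the double-precision representation error of 0.01.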