Re: A strange precision behavior?
- To: mathgroup at smc.vnet.net
- Subject: [mg90395] Re: A strange precision behavior?
- From: Bill Rowe <readnews at sbcglobal.net>
- Date: Tue, 8 Jul 2008 02:26:48 -0400 (EDT)
On 7/7/08 at 5:07 AM, aaronfude at gmail.com (Aaron Fude) wrote:

>The following two commands result in different values:
>
>SetPrecision[0.01, 50]
>0.01`50

That is because these are different numbers.

>Is that because in the first case, 0.01 is first represented as a
>double and then passed to SetPrecision, while in the second case,
>Mathematica gets a chance to interpret 0.01 as 1/100?

Mathematica never interprets 0.01 as 1/100. The value 0.01 is a
machine precision number. If you want the equivalent of 0.01`50
using SetPrecision, you should use SetPrecision[1/100, 50], as

In[1]:= SetPrecision[1/100, 50] === 0.01`50

Out[1]= True

shows.

Doing SetPrecision[0.01, 50] tells Mathematica to start with a
machine precision number and increase its precision to 50. This
cannot be done in any consistent manner: in general, the closest
machine number to a value entered with a decimal point differs from
the exact rational value by more than a number with 50 digits of
precision does. It is never a good idea to attempt to increase the
precision of a lower precision number.
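For what it's worth, you can see the residual binary error directly
in a fresh session (the In/Out numbering below is just illustrative):

In[2]:= SetPrecision[0.01, 50] == 0.01`50

Out[2]= False

(* the nonzero difference is the representation error of the
   machine double nearest to 0.01, carried into the 50-digit
   number by SetPrecision *)
In[3]:= SetPrecision[0.01, 50] - 0.01`50 // N

Out[3]= 2.08167*10^-19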