Help understanding Accuracy[]
- To: mathgroup at smc.vnet.net
- Subject: [mg44094] Help understanding Accuracy[]
- From: terryisnow at yahoo.com (Terry)
- Date: Wed, 22 Oct 2003 03:24:56 -0400 (EDT)
- Sender: owner-wri-mathgroup at wolfram.com
Hi, I'm hoping someone can shed some light on why the accuracy of the
following large approximate real number actually increases as the
number decreases in size, even though the number of digits to the
right of the decimal point stays the same:

In[1]:= x = 1234567890123456.
Out[1]= 1.234567890123456 x 10^15

In[2]:= Accuracy[x]
Out[2]= 1

In[3]:= x = 123456789012345.
Out[3]= 1.23456789012345 x 10^14

In[4]:= Accuracy[x]
Out[4]= 2

Also, does anyone know why the accuracy drops to 0 when the first
digit of the number changes from 1 to 3?

In[5]:= x = 3234567890123456.
Out[5]= 3.234567890123456 x 10^15

In[6]:= Accuracy[x]
Out[6]= 0

Once again the number of digits to the right of the decimal point
stays the same, so I don't see why the accuracy should change; after
all, isn't "digits to the right of the decimal point" the definition
of accuracy in Mathematica?

Terry
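P.S. Here is a minimal sketch of the rule I suspect is at work,
assuming Accuracy for a machine real is roughly
Precision[x] - Log[10, Abs[x]], rounded to an integer in this version
(approxAcc is just an illustrative name, not a built-in):

    (* Hypothetical rule: accuracy = significant digits to the right *)
    (* of the decimal point, i.e. total significant digits minus the *)
    (* digits taken up by the integer part of the number.            *)
    approxAcc[x_Real] := Round[Precision[x] - Log[10, Abs[x]]]

    approxAcc /@ {1234567890123456., 123456789012345., 3234567890123456.}
    (* should give {1, 2, 0}, matching the Accuracy[] results above *)

If that is the right reading, then Accuracy counts digits to the right
of the decimal point that are actually significant at machine
precision, not digits that happen to be printed there, which would
explain both the increase and the drop to 0. Is that correct?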