Help understanding Accuracy
- To: mathgroup at smc.vnet.net
- Subject: [mg44094] Help understanding Accuracy
- From: terryisnow at yahoo.com (Terry)
- Date: Wed, 22 Oct 2003 03:24:56 -0400 (EDT)
- Sender: owner-wri-mathgroup at wolfram.com
I'm hoping someone will be able to shed some light
on why the accuracy of the following large
approximate real number actually increases as the
number decreases in size, even though the number
of digits to the right of the decimal point remains
the same:
In= x = 1234567890123456.
Out= 1.234567890123456 x 10^15
In= x = 123456789012345.
Out= 1.23456789012345 x 10^14
Also, does anyone know why the accuracy
drops to 0 when the first digit of the
above number changes from 1 to 3?
In= x = 3234567890123456.
Out= 3.234567890123456 x 10^15
Seeing as once again the number of digits
to the right of the decimal remains the same,
I don't see why the accuracy should change;
after all, isn't that the definition of
accuracy in Mathematica?
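For what it's worth, Accuracy for a machine number effectively works out to Precision[x] - Log10[Abs[x]], where machine precision is about 15.95 decimal digits (53-bit doubles) -- so accuracy depends on the magnitude of the number, not on the count of digits printed after the decimal point. A rough Python sketch of that arithmetic (the rounding to an integer at the end is an assumption, matching versions of Mathematica that report Accuracy as a whole number):

```python
import math

# Machine precision for 53-bit IEEE doubles, in decimal digits (~15.9546)
MACHINE_PRECISION = 53 * math.log10(2)

def accuracy(x):
    """Digits known to the right of the decimal point:
    precision (total significant digits) minus the number of
    digits to the left of the decimal point, i.e.
    Precision - Log10[Abs[x]]."""
    return MACHINE_PRECISION - math.log10(abs(x))

# The three numbers from the post:
print(round(accuracy(1234567890123456.0)))  # ~0.86, reported as 1
print(round(accuracy(123456789012345.0)))   # ~1.86, reported as 2
print(round(accuracy(3234567890123456.0)))  # ~0.44, reported as 0
```

This reproduces both observations: dropping the number by a factor of 10 moves one digit to the right of the decimal point (accuracy up by 1), and changing the leading 1 to a 3 raises Log10[Abs[x]] by Log10[3] ~ 0.48, enough to push the rounded accuracy from 1 down to 0.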