       N[ ]

• To: mathgroup at yoda.ncsa.uiuc.edu
• Subject: N[ ]
• From: uunet!yoda.ncsa.uiuc.edu!keiper%wri (Jerry Keiper)
• Date: Fri, 23 Mar 90 17:33:09 CST

```
I stand by my statement that N[ ] never raises the precision of a number,
but perhaps I need to clarify what I mean by that.  There are two kinds of
numbers in Mathematica: machine numbers and bignums.  If x is a machine
number, Precision[x] will always return whatever the precision of a double
is; on a Sun that happens to be 16.  Mathematica never keeps track of the
precision of any machine number, so Precision[x] tells you very little if
x is a machine number; you won't even be able to verify that it is a machine
number rather than a bignum with a certain precision.  Furthermore, if
the second argument to N[ ] is less than or equal to the precision of a
machine number, N[ ] produces a machine number and displays it with the
number of digits requested (trailing zeros are suppressed).

This leads us to the second question: How does it know how many digits to
display in the case of a machine number?  Most of the time it doesn't, so
it just shows 6 as a default.  However, if the number was either entered
from the keyboard or resulted from N[ ] with a second argument less than
or equal to the precision of a machine number, then the number of digits to
be displayed (NOT the precision) is stored in an int field in the number.
As soon as you do anything other than add 0 or multiply by 1, the number
of digits to be displayed is lost -- since it is a machine number,
Mathematica doesn't bother to keep track of its precision -- and it will
display with the 6-digit default.  In my opinion this convention is pretty
much useless, and it certainly leads to a great deal of confusion.

Jerry B. Keiper
Wolfram Research Inc.
keiper at wri.com

```
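The behavior described above can be sketched in a short Mathematica session (a sketch only; the comments state what the text implies on a machine with 16-digit doubles, and exact output formatting varies by version):

```mathematica
(* a number entered from the keyboard is a machine number *)
x = 1.5
Precision[x]        (* machine precision of a double, e.g. 16 on a Sun *)

(* N[ ] with a second argument at or below machine precision still
   yields a machine number, displayed with the requested digits *)
y = N[Pi, 10]       (* machine number shown with 10 digits *)
Precision[y]        (* still machine precision, not 10 *)

(* anything beyond adding 0 or multiplying by 1 discards the stored
   display-digit count; the result falls back to the 6-digit default *)
2 y

(* only a second argument above machine precision produces a bignum
   whose precision is actually tracked *)
z = N[Pi, 30]
Precision[z]        (* a bignum with 30 digits of tracked precision *)
```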

• Prev by Date: evaluation
• Next by Date: precision and accuracy