MathGroup Archive 2008

Re: memory question

  • To: mathgroup at smc.vnet.net
  • Subject: [mg91234] Re: memory question
  • From: Bill Rowe <readnews at sbcglobal.net>
  • Date: Sun, 10 Aug 2008 01:54:25 -0400 (EDT)

On 8/9/08 at 7:45 AM, jasch at crustal.ucsb.edu (Jan Schmedes) wrote:

>I am running Mathematica 6.0.3 on a Mac Pro with 4 GB of memory. I
>have Fortran binary files that I can read in correctly (numbers of
>order 10^6) using Import[...,"Real32"]. But for larger problems,
>where the binary file is about 80 MB, all memory is used (if I run
>top, VSIZE is 17 GB) and only 2% of the CPU is used, so it
>basically cannot handle the file. Is there a way to decrease the
>precision at which the file is read in? For my purposes the
>precision does not have to be very high.

Even if there were a simple way to decrease the precision as you
read the data in, it would not solve your problem.

The minimum storage for a floating-point number in Mathematica is
achieved at machine precision, which on your machine I believe is
64 bits. Values imported as "Real32" are converted to machine
precision anyway, so there is no more compact way to hold them.
Using Mathematica's arbitrary-precision numbers would actually
increase the storage requirements, since additional information
(such as the precision itself) has to be stored with each number.
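
A quick way to see this is ByteCount; the exact byte counts below
are what I would expect on a 64-bit build and may differ on yours:

    ByteCount[1.0]                       (* a single machine real *)
    ByteCount[N[Pi, 50]]                 (* an arbitrary-precision real: larger *)
    ByteCount[ConstantArray[1.0, 10^6]]  (* packed machine reals: ~8 bytes each *)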

The only real solution to very large problems is to read part of
the data in, process that part, and repeat until you've processed
all of the data. In this way the limit becomes hard-drive storage
rather than RAM. Of course, the total time to process the data
will increase due to the overhead of managing portions of the data
and the I/O speed of the hard drive. But this might not be all
that bad, as a custom routine to read in the data may perform more
efficiently than Import.
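
Here is a minimal sketch of that approach using a binary stream
and BinaryReadList; the file name "data.bin", the chunk size, and
the processChunk function are placeholders for your own file and
analysis:

    processChunk[chunk_] := Total[chunk]  (* stand-in for real work *)

    stream = OpenRead["data.bin", BinaryFormat -> True];
    results = {};
    While[(chunk = BinaryReadList[stream, "Real32", 10^5]) =!= {},
      AppendTo[results, processChunk[chunk]]];
    Close[stream];

Since chunk is overwritten on each pass, the previous chunk can be
garbage collected and the working memory stays near the chunk size
rather than the full file size.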

