MathGroup Archive 2004


Re: Excessive Mathematica memory use, revisited.

  • To: mathgroup at
  • Subject: [mg46883] Re: Excessive Mathematica memory use, revisited.
  • From: "Mariusz Jankowski" <mjankowski at>
  • Date: Fri, 12 Mar 2004 23:39:35 -0500 (EST)
  • Organization: University of Southern Maine
  • References: <c2rnjr$p13$>
  • Sender: owner-wri-mathgroup at

Vincent, here are a couple of observations:

1) BinaryImport returns data in packed form, so there is no need to use
ToPackedArray.
2) Mathematica will need ~1 GB to hold your 500 MB image (multiply the
number of samples by 4).
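To spell out that arithmetic (a sketch; the factor of 4 assumes packed arrays store each sample as a 4-byte machine integer, which holds on a 32-bit kernel):

```mathematica
fileBytes = 500*10^6;      (* ~500 MB raw file on disk *)
samples   = fileBytes/2;   (* 2 bytes per unsigned 16-bit sample *)
inMemory  = 4*samples      (* 4 bytes per machine integer -> ~1 GB in the kernel *)
```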

That was the good news, because you seem to have enough RAM. The bad news is
that after running a few Import tests using BinaryImport, I found that
Mathematica needs approximately 4 times the final ByteCount of your raw data
while Import is running. So you may indeed be running out of virtual memory
on your computer. Note also that half of the peak memory is returned once
Import completes.

Use the functions ByteCount, MaxMemoryUsed and MemoryInUse to verify these
numbers.
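For example, one might bracket a test like this (a hypothetical session; the exact numbers will vary by machine and kernel version):

```mathematica
data = Table[Random[Integer, {0, 65535}], {10^6}];
ByteCount[data]   (* bytes used by this expression *)
MemoryInUse[]     (* bytes the kernel currently holds *)
MaxMemoryUsed[]   (* peak kernel memory since startup *)
```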

Hope this helps, Mariusz

>>> Virgilio, Vincent<Vincent.Virgilio at> 3/12/2004 2:07:39 AM >>>


First, thanks to Jens-Peer Kuska for responding to this issue back in

I am now working with a much more powerful machine: P4 2.6 GHz, 2GB RAM,
Windows XP, Mathematica 5.0.0.

My goal is still to manipulate very large images. Currently I am trying
to use Experimental`BinaryImport to import an ~ 500MB image into
Mathematica. Each sample is an unsigned 16-bit integer.

I start the Windows Task Manager, display the process list, and sort by
memory usage.

Then, after issuing the following commands to a cold kernel, I watch the
Mathematica kernel rise to the top of the above list, consume chunks of
memory in steps of maybe 20MB, and exhaust physical memory.
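The commands themselves did not survive in the archive. A minimal sketch of the kind of import being described, assuming Experimental`BinaryImport accepts a type specifier such as "UnsignedInteger16" (the file name and type string here are placeholders, not the original code):

```mathematica
(* hypothetical reconstruction -- the original commands were lost *)
data = Experimental`BinaryImport["image.raw", "UnsignedInteger16"];
Dimensions[data]
ByteCount[data]
```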

There is memory exhaustion with or without ToPackedArray[].

I could understand this behaviour if Mathematica were storing each sample
in a node of some sort of tree; the data volume would then be multiplied
by a factor determined by the tree node size. If that is the case, is
there a way to avoid it, during import, with packed arrays?
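That intuition matches how unpacked lists are stored: each element of an ordinary list is a full Mathematica expression with per-node overhead, while a packed array holds raw machine numbers contiguously. A quick comparison (an illustrative session, not from the original post; the per-element size assumes a 32-bit kernel):

```mathematica
packed   = Developer`ToPackedArray[Table[i, {i, 10^6}]];
unpacked = Developer`FromPackedArray[packed];
Developer`PackedArrayQ[packed]  (* True *)
ByteCount[packed]    (* ~4 bytes per element *)
ByteCount[unpacked]  (* several times larger: per-element expression overhead *)
```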

Is there anything obviously wrong or inefficient in the above code? I've
tried to incorporate Jens' suggestions. One of them was to use Hold[]
wherever possible; I can't see where it would be productive to do that.

Note that I see similar behaviour on a smaller scale with much smaller
images (14 MB, per earlier mail), which is not a problem since the memory
ceiling has gone up considerably.

Perhaps 2GB is simply inadequate for this task, since I don't expect
Mathematica to use a data tile cache. Other products do, but then they
target a much more specific application domain.

Regards and thanks,

Vince Virgilio

This email and any files transmitted with it are proprietary and intended
solely for the use of the individual or entity to whom they are addressed.
If you have received this email in error, please notify the sender. Please
note that any views or opinions presented in this email are solely those of
the author and do not necessarily represent those of ITT Industries, Inc.
The recipient should check this email and any attachments for the presence
of viruses. ITT Industries accepts no liability for any damage caused by
any virus transmitted by this email.
