MathGroup Archive 2012


Re: Memory Blowup Issues

  • To: mathgroup at
  • Subject: [mg126725] Re: Memory Blowup Issues
  • From: Szabolcs <szhorvat at>
  • Date: Sat, 2 Jun 2012 05:45:37 -0400 (EDT)
  • Delivered-to:
  • References: <jqa1h7$bj7$>

On Friday, 1 June 2012 11:20:39 UTC+2, Kevin J. McCann  wrote:
> I am dealing with some good sized data sets from satellite data. These
> data are stored in hdf format. The problem is that I process a sequence
> of these data sets by Importing and then processing; however, it seems
> that the memory usage grows with each Import, even if I Remove the data.
> Here is a somewhat simplified example:
> MemoryInUse[] (* some data already read in *)
> 2781549920
> Data=Import[fileName,"Data"]; (* import a new data set *)
> MemoryInUse[]
> 3302168936
> Data=Import[fileName,"Data"]; (* import it again *)
> MemoryInUse[]
> 3822781920
> Remove[Data] (* this has no effect on the memory usage *)
> MemoryInUse[]
> 3822787840
> As you can see, I will shortly run out of memory. Any ideas?

Hello Kevin,

The most likely cause of this behaviour is that Mathematica remembers intermediate results through the Out symbol. You probably know that you can refer to previous results using %, %%, %%% or Out[10], Out[11], etc., but you may not have realized that Out is assigned a new value even when the input ends in a semicolon:


In[1]:= data = RandomReal[1, 10000]; (* output suppressed with a semicolon *)

In[2]:= Length[%] (* % still refers to the same list as data *)
Out[2]= 10000

The solution is to set $HistoryLength = 0 before you start working, to prevent the system from remembering previous results.
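For a session where large results have already accumulated, a minimal sketch of the fix (Out is Protected, so it must be unprotected before clearing; clearing Out explicitly releases already-stored results right away rather than waiting for the history to roll over):

In[3]:= $HistoryLength = 0; (* stop storing new results in Out *)

In[4]:= Unprotect[Out]; Clear[Out]; Protect[Out]; (* discard results already stored this session *)

In[5]:= MemoryInUse[] (* should now report a much smaller figure *)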

Please see these articles as well:

