Re: Memory Blowup Issues

*To*: mathgroup at smc.vnet.net
*Subject*: [mg126725] Re: Memory Blowup Issues
*From*: Szabolcs <szhorvat at gmail.com>
*Date*: Sat, 2 Jun 2012 05:45:37 -0400 (EDT)
*Delivered-to*: l-mathgroup@mail-archive0.wolfram.com
*References*: <jqa1h7$bj7$1@smc.vnet.net>

On Friday, 1 June 2012 11:20:39 UTC+2, Kevin J. McCann wrote:
> I am dealing with some good sized data sets from satellite data. These
> data are stored in hdf format. The problem is that I process a sequence
> of these data sets by Importing and then processing; however, it seems
> that the memory usage grows with each Import, even if I Remove the data.
>
> Here is a somewhat simplified example:
>
> MemoryInUse[] (* some data already read in *)
> 2781549920
>
> Data = Import[fileName, "Data"]; (* import a new data set *)
> MemoryInUse[]
> 3302168936
>
> Data = Import[fileName, "Data"]; (* import it again *)
> MemoryInUse[]
> 3822781920
>
> Remove[Data] (* this has no effect on the memory usage *)
> MemoryInUse[]
> 3822787840
>
> As you can see, I will shortly run out of memory. Any ideas?

Hello Kevin,

The most likely cause of this behaviour is that the intermediate results are being remembered by Mathematica through the Out symbol. You probably know that you can refer to previous results using %, %%, %%% or Out[10], Out[11], etc., but you may not have realized that Out gets assigned a new value even if the last input ended in a semicolon. Example:

In[1]:= data = RandomReal[1, 10000]; (* output suppressed using semicolon *)

In[2]:= Length[%] (* note that % has the same value as data *)

The solution is to set $HistoryLength = 0 before you start working, to prevent the system from remembering previous results.

Please see these articles as well:

http://mathematica.stackexchange.com/questions/3376/old-values-are-not-freed-garbage-collected-when-you-re-evaluate-an-assignment
http://mathematica.stackexchange.com/questions/4322/memory-not-freed-after-running-clear-when-using-a-table

Szabolcs
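A minimal sketch of the cleanup this suggests, put together from the advice above (the Unprotect/Clear step is the commonly used idiom for discarding values already stored in the protected Out symbol, not something from the original post; adapt to your session):

```mathematica
(* Stop the kernel from remembering further results in Out[] *)
$HistoryLength = 0;

(* Discard results that Out[] has already stored this session.
   Out is Protected, so it must be unprotected before clearing. *)
Unprotect[Out];
Clear[Out];
Protect[Out];

(* Optionally discard internal caches as well, then check usage *)
ClearSystemCache[];
MemoryInUse[]
```

After this, re-evaluating MemoryInUse[] should show the memory from the earlier Imports being reclaimed, since nothing in the session references those expressions any more.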