MathGroup Archive 2005


Reducing memory for big data sets


I have a big data set of 128*128*1000 points to evaluate (a series of 
images), and I plan to use 256*256*1000 as well.
When I read the data into Mathematica from text files, MemoryInUse[] 
reports about 300 MB in use. However, my images consist of only about 
16,000,000 data points of 1 byte each (or 2 bytes for other data sets).

Is it possible to make Mathematica use less memory, perhaps by storing 
the data in another format?
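One direction worth trying (a sketch, not from the original post) is Mathematica's packed arrays: a uniform numeric list can be stored as a contiguous block of machine numbers instead of a tree of general expressions, which cuts per-element overhead dramatically. Developer`ToPackedArray performs the conversion and ByteCount shows the effect; the file name "image.dat" below is hypothetical.

    (* Read raw 1-byte pixel values; "image.dat" is a hypothetical file *)
    data = ReadList["image.dat", Byte];

    (* Pack into a machine-integer array: one machine word per element
       instead of a full Mathematica expression per element *)
    packed = Developer`ToPackedArray[data];

    (* Compare storage before and after packing *)
    {ByteCount[data], ByteCount[packed]}

Note that a packed integer still occupies a machine word (4 or 8 bytes), so 16 million points would land on the order of 64-128 MB rather than 16 MB, but that should still be well below the 300 MB reported by MemoryInUse[].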

