MathGroup Archive 2005

Reducing memory for big data sets

  • To: mathgroup at smc.vnet.net
  • Subject: [mg57891] Reducing memory for big data sets
  • From: Maximilian Ulbrich <mulbrich at berkeley.edu>
  • Date: Sat, 11 Jun 2005 03:35:59 -0400 (EDT)
  • Organization: University of California, Berkeley
  • Sender: owner-wri-mathgroup at wolfram.com

Hi,

I have a large data set of 128*128*1000 points to evaluate (a series of 
images; I plan to use 256*256*1000 as well...).
When I read the data into Mathematica from text files, MemoryInUse[] 
reports about 300 MB in use. However, my images consist of only about 
16,000,000 data points of 1 byte each (or 2 bytes for other data sets).

Is it possible to make Mathematica use less memory, perhaps by storing 
the data in another format?
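One possible approach (a sketch, not from the original poster): keep the data as a packed array, so each machine number occupies 8 bytes rather than a full Mathematica expression. The filename below is hypothetical, and this assumes the file is a rectangular table of machine-sized numbers:

```mathematica
(* read the text file as rows of numbers; "images.dat" is a
   hypothetical filename standing in for the poster's data files *)
data = ReadList["images.dat", Number, RecordLists -> True];

(* pack into a machine-number array: 8 bytes per element instead of a
   separate expression per number *)
packed = Developer`ToPackedArray[data];

Developer`PackedArrayQ[packed]  (* True if packing succeeded *)
ByteCount[packed]               (* compare against MemoryInUse[] *)
```

Packing only succeeds when every element is the same machine type (all integers or all reals) and the array is rectangular, so mixed or ragged data would stay unpacked.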

Thanks,
Max
