
MathGroup Archive 2005


Re: Reducing memory for big data sets

  • To: mathgroup at smc.vnet.net
  • Subject: [mg57899] Re: Reducing memory for big data sets
  • From: "Jens-Peer Kuska" <kuska at informatik.uni-leipzig.de>
  • Date: Sun, 12 Jun 2005 04:34:21 -0400 (EDT)
  • Organization: Uni Leipzig
  • References: <d8e5di$gr8$1@smc.vnet.net>
  • Sender: owner-wri-mathgroup at wolfram.com

Hi,

no, Mathematica uses (at least) 32-bit integers, and 
there is no
data type for bytes, short ints, or the like.
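
This point can be illustrated with packed arrays, which were already available in Mathematica of this era. A minimal sketch: packing removes the per-element expression overhead, but the elements are still stored at machine-integer width, never one byte each. `Developer`ToPackedArray` and `ByteCount` are standard; the exact byte counts depend on the build.

    (* Packed arrays drop per-element expression overhead,
       but still store machine-width integers, not single bytes. *)
    data = Table[Mod[i + j, 256], {i, 128}, {j, 128}];
    packed = Developer`ToPackedArray[data];
    unpacked = Developer`FromPackedArray[packed];
    ByteCount[packed]    (* machine-integer width per element *)
    ByteCount[unpacked]  (* several times larger: full expression overhead *)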

Regards
  Jens

"Maximilian Ulbrich" <mulbrich at berkeley.edu> 
wrote in message 
news:d8e5di$gr8$1 at smc.vnet.net...
> Hi,
>
> I have a big data set of 128*128*1000 points to 
> evaluate (a series of
> images, and I plan to use 256*256*1000 as 
> well...).
> When I read the data into Mathematica from text 
> files, MemoryInUse[]
> says about 300MB are used. However, my images 
> consist of only about 16,000,000 data points of 
> 1 byte each (2 bytes for other data sets).
>
> Is it possible to make Mathematica reduce the 
> memory used, maybe by storing the data in 
> another format?
>
> Thanks,
> Max
> 
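
For scale, the quoted figures are consistent with per-element expression overhead rather than raw data size. The ~20 bytes per element used below is an assumption for illustration, not a documented figure:

    points = 128*128*1000    (* 16384000 elements *)
    N[points*20/2^20]        (* ~312 MB at ~20 bytes/element,
                                close to the reported 300MB *)
    N[points*4/2^20]         (* ~62.5 MB if held as packed 32-bit integers *)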


