-----

MathGroup Archive 2011


Re: Error on importing previously saved data file: Get::bigfile

  • To: mathgroup at smc.vnet.net
  • Subject: [mg122400] Re: Error on importing previously saved data file: Get::bigfile
  • From: Leonid Shifrin <lshifr at gmail.com>
  • Date: Thu, 27 Oct 2011 06:32:34 -0400 (EDT)
  • Delivered-to: l-mathgroup@mail-archive0.wolfram.com
  • References: <201110262136.RAA29918@smc.vnet.net>

Matthias,

On the page I linked, I provided two implementations: the first for general
tables and the second for sparse tables. The first one should be as general as
Import[..., "Table"] itself, although I have not tested it extensively.

Cheers,
Leonid


On Thu, Oct 27, 2011 at 10:03 AM, Matthias Seise <mseise at gmx.de> wrote:

> Dear Leonid,
>
> thanks for the link. I did see it before, but it seems to work only with
> "normal" tables [numbers only]. I have image data in the list as well and
> would like to save and load it. Of course I could convert the images to lists
> on save and back on load, but that seems like a lot of trouble for a simple
> read/write task.
>
> BR
> Matthias
>
>
> On 27 October 2011 04:20, Leonid Shifrin <lshifr at gmail.com> wrote:
>
>> Matthias,
>>
>> Check out my implementation of an alternative to Import for the "Table"
>> format; it lives here:
>>
>>
>> http://stackoverflow.com/questions/7525782/import-big-files-arrays-with-mathematica/7527064#7527064
>>
>> It may be able to solve your problem.
>>
>> Regards,
>> Leonid
>>
>>
>> On Thu, Oct 27, 2011 at 1:36 AM, Matthias Seise <mseise at gmail.com> wrote:
>>
>>> I'm working in Mathematica 8, 64 bit.
>>>
>>> I have saved a data set (Save["filename", data]) which is about 3.5
>>> MB on the hard disk (a table with numbers and images,
>>> ByteCount = 610117226).
>>> When I try to read it back into memory I get: Get::bigfile: "The file
>>> "filename" is too large to be read into memory."
>>>
>>> I can work around it by splitting the data into smaller chunks, but
>>> there should be a better way, since the data set should not be too large.
>>>
>>> Any ideas?
>>>
>>>
>>> Best regards,
>>> Matthias Seise
>>>
>>>
>>
>
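
The round trip that triggers Get::bigfile can be sketched as follows. The
file names and sample data here are illustrative; DumpSave with the binary
.mx format is a standard alternative to the plain-text Save/Get pair for
large in-memory data, though it was not discussed in the thread above:

```mathematica
(* Sample data standing in for the mixed numbers-and-images table;
   the real data in the thread had ByteCount = 610117226. *)
data = Table[RandomReal[], {100}, {100}];

(* Plain-text round trip: Save writes the definition of the symbol
   as InputForm text, and Get re-reads it. For very large expressions
   this read can fail with Get::bigfile. *)
Save["data.m", data];
Clear[data];
Get["data.m"];

(* Binary alternative sketch: DumpSave writes a compact .mx file,
   which Get loads without parsing a huge text expression. *)
DumpSave["data.mx", data];
Clear[data];
Get["data.mx"];
```

The .mx files written by DumpSave are version- and platform-dependent, so
they suit local caching of large intermediate data rather than long-term
or cross-machine interchange.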

