Re: Loading portion of large HDF5 array?
- To: mathgroup at smc.vnet.net
- Subject: [mg114113] Re: Loading portion of large HDF5 array?
- From: Oliver Ruebenkoenig <ruebenko at wolfram.com>
- Date: Wed, 24 Nov 2010 06:57:55 -0500 (EST)
Paul,
On Tue, 23 Nov 2010, Paul wrote:
> On Nov 22, 4:40 am, Bill Rowe <readn... at sbcglobal.net> wrote:
>> On 11/20/10 at 6:27 PM, pnort... at gmail.com (Paul) wrote:
>>
>>> I have a large matrix (>10gb) in an HDF5 file.
>>> Is there a way to read only a portion of this matrix using Import[]
>>> and the HDF5 import format?
>>
>> Yes. You can read various portions of the file. See
>>
>> ref/format/HDF5
>>
>> in the DocumentCenter for details
>
> A specific example is below (snipped output from h5ls -vlr); the
> matrix '/data' has dimensions ~ {10^9, 51}. How would I read in the
> first 1000 rows, then the next 1000? Thanks for the documentation
> pointer, but I didn't find a way to do this. I understand you can
> load datasets separately, but maybe not a portion of a single dataset.
>
> Import["file.h5', {"Datasets", "/data"}] attempts to load the full
> matrix.
>
> /data Dataset {110945492/Inf, 51/51}
> Location: 1:800
> Links: 1
> Chunks: {1000, 51} 204000 bytes
> Storage: 1158043888 logical bytes, 3840201966 allocated bytes,
> 107.67% utilization
> Filter-0: deflate-1 OPT {4}
> Type: IEEE 32-bit little-endian float
>
>
Unfortunately this does not work out of the box right now. There is a
suggestion to add it in a future version. Until then, you'd have to read
the data manually.
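For what it's worth, here is a minimal sketch of the manual route, using the
HDF5 C library's hyperslab selection to pull out a block of rows. The file
name, dataset path, column count, and row counts come from the thread; the
rest (buffer handling, lack of error checking) is purely illustrative and
untested, not a recommended implementation.

    /* Read rows [row0, row0 + nrows) of the 2-D dataset "/data"
       from file.h5 via a hyperslab selection. */
    #include <stdio.h>
    #include <stdlib.h>
    #include "hdf5.h"

    #define NCOLS 51   /* second dimension of /data, per the h5ls listing */

    int main(void)
    {
        hsize_t row0 = 0, nrows = 1000;        /* first 1000 rows */
        hsize_t start[2] = { row0, 0 };
        hsize_t count[2] = { nrows, NCOLS };

        hid_t file = H5Fopen("file.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
        hid_t dset = H5Dopen2(file, "/data", H5P_DEFAULT);

        /* Select the wanted rows in the file's dataspace ... */
        hid_t filespace = H5Dget_space(dset);
        H5Sselect_hyperslab(filespace, H5S_SELECT_SET, start, NULL, count, NULL);

        /* ... and describe a matching in-memory buffer. */
        hid_t memspace = H5Screate_simple(2, count, NULL);
        float *buf = malloc(nrows * NCOLS * sizeof *buf);

        /* The library decompresses the deflate-filtered chunks transparently. */
        H5Dread(dset, H5T_NATIVE_FLOAT, memspace, filespace, H5P_DEFAULT, buf);

        printf("first value: %g\n", (double)buf[0]);

        free(buf);
        H5Sclose(memspace);
        H5Sclose(filespace);
        H5Dclose(dset);
        H5Fclose(file);
        return 0;
    }

Compile with the wrapper that ships with HDF5 (e.g. h5cc read_rows.c -o
read_rows); the next 1000 rows are just row0 = 1000 with the same count. The
values could then be handed back to Mathematica through a temporary file or a
MathLink program built around the same read.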
Oliver