Closed
Description
I have an xarray Dataset created using dask, which I would like to write to disk.
The dataset is 12 GB and my computer has 64 GB of memory, but when I run `to_netcdf`, the process runs out of memory and crashes.
```python
type(nc)
Out[5]: xray.core.dataset.Dataset

nc.chunks
Out[6]: Frozen(SortedKeysDict({'Denree': (19,), u'NIsoSource': (10, 10, 10, 10, 10, 6), 'Pop': (6,),
        u'DimJ0': (20, 17), u'DimK0': (1,), u'time': (50, 50, 50, 50, 3), u'DimI0': (15,), 'TypeDose': (2,)}))

nc.nbytes * (2 ** -30)
Out[7]: 12.4569926
```

I don't understand what I'm doing wrong, so thanks for your help.
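For reference, here is a minimal sketch of the pattern being attempted, scaled down to a toy Dataset (the variable name, dimensions, and chunk sizes are hypothetical, not taken from the dataset above, and this assumes xarray, dask, and a netCDF backend are installed). With a dask-backed Dataset, `to_netcdf(..., compute=False)` returns a delayed object, so the write can be triggered explicitly and the data streamed chunk by chunk rather than materialized all at once:

```python
import os
import tempfile

import numpy as np
import xarray as xr

# Hypothetical small Dataset standing in for the 12 GB one above.
ds = xr.Dataset(
    {"temp": (("time", "x"), np.random.rand(100, 4))}
).chunk({"time": 25})  # dask-backed: four chunks along 'time'

path = os.path.join(tempfile.mkdtemp(), "out.nc")

# compute=False returns a dask.delayed object instead of writing
# immediately; calling .compute() performs the actual chunked write.
delayed_write = ds.to_netcdf(path, compute=False)
delayed_write.compute()

# Re-open the file to confirm the data round-trips.
roundtrip = xr.open_dataset(path)
print(roundtrip["temp"].shape)
```

This is only a sketch of the lazy-write pattern, not a claim about why the original write crashes.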