Releases: constantinpape/z5
Critical bug-fixes
This release fixes two critical bugs:
- Artifacts for empty blocks (thanks for reporting, @aschampion)
- Gzip compression for zarr
In addition, several issues and inconsistencies in `[]` indexing of `Dataset` were fixed (thanks to @clbarnes, who implemented most of this), and the behavior of `require_dataset` and `visititems` was improved.
This will be the last release supporting Python 2.7.
Fix multi-threading bug
This release fixes a critical bug in multi-threaded chunk I/O, which could lead to occasional segfaults due to a thread-safety issue.
You can use multi-threaded I/O from Python by setting the `n_threads` attribute of a dataset:

```python
ds.n_threads = 8
```
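For context, a minimal sketch of multi-threaded reading; the file name `data.n5` and dataset name `data` are placeholders:

```python
import z5py

f = z5py.File('data.n5')   # open an existing N5 container
ds = f['data']             # get a dataset handle
ds.n_threads = 8           # chunk I/O now uses 8 threads
data = ds[:]               # this read is executed multi-threaded
```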
This release also adds support for vector instructions via xsimd.
Support ROI in converters
Supports regions of interest (ROI) in the HDF5 converters and utility functions.
All Changes:
- Change the positional order of `z5py.File` constructor arguments so that `mode` can be passed as the second positional argument (this makes it closer to the `h5py` API).
- Rename `z5py.util.rechunk` to `z5py.util.copy_dataset`.
- Rename the `z5py.util.copy_dataset`, `z5py.converter.convert_to_n5` and `z5py.converter.convert_from_h5` arguments `out_chunks` to `chunks` and `out_blocks` to `block_shape`; `chunks` now has the default value `None`, in which case the chunk shape of the input dataset is used.
- `z5py.util.copy_dataset`, `z5py.converter.convert_to_n5` and `z5py.converter.convert_from_h5` now support the arguments `roi` and `fit_to_roi` (see the sketch after this list). `roi` is an optional slice or tuple of slices that restricts the operation to the region of interest defined by the slice(s). If `fit_to_roi` is `True`, the shape of the output dataset is set to the `roi` shape if `roi` is given (default is `False`).
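A minimal sketch of an ROI-restricted conversion; the file paths, dataset keys and the `in_path_in_file` / `out_path_in_file` argument names are assumptions for illustration, while `chunks`, `roi` and `fit_to_roi` come from the notes above:

```python
import numpy as np
from z5py.converter import convert_from_h5

# Restrict the conversion to a sub-region of the input dataset.
roi = np.s_[0:100, 0:512, 0:512]  # tuple of slices defining the ROI

convert_from_h5(
    'input.h5', 'output.n5',  # hypothetical paths
    in_path_in_file='data',   # assumed names for the dataset
    out_path_in_file='data',  # keys inside the files
    chunks=(64, 64, 64),
    roi=roi,
    fit_to_roi=True,  # output shape becomes the ROI shape, not the full shape
)
```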
More h5py compatibility
Introduces more compatibility with `h5py`:
- Implement `read_direct` / `write_direct` (see the sketch after this list); thanks to @paulhfu
- Add a `compression` attribute to `Dataset` and rename `compression_options` to `compression_opts`
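A minimal sketch of `read_direct`, assuming it mirrors the `h5py` semantics of filling a pre-allocated array; the file and dataset names are placeholders:

```python
import numpy as np
import z5py

f = z5py.File('data.n5')
ds = f['data']

# Allocate the output buffer up front and read into it directly,
# instead of having the read allocate a new array.
out = np.empty(ds.shape, dtype=ds.dtype)
ds.read_direct(out)
```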
Additional changes:
- Add unix environment files for Python 2.7, 3.5, 3.6 and 3.7 and use them for the Travis build, thanks to @clbarnes
- Fix an issue in the C++ implementation of blocking
- Remove json submodule
Support N5 varlength
Varlength chunks store chunk data with an arbitrary number of elements.
In z5py, varlength chunks cannot be read or written via `[]` indexing. Instead, this release also exposes access to chunks via `Dataset.write_chunk` / `Dataset.read_chunk`.
Chunks with variable length can be written via:

```python
ds.write_chunk(chunk_id, var_len_data, varlen=True)
```

and chunks written in varlength mode will be read correctly by `read_chunk`.
Note that the zarr format does not support varlength chunks.
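A minimal end-to-end sketch; the file name, dataset name and shapes are placeholders, and the tuple form of the chunk id (chunk-grid coordinates) is an assumption:

```python
import numpy as np
import z5py

f = z5py.File('data.n5')  # N5, since zarr does not support varlength
ds = f.create_dataset('varlen', shape=(100,), chunks=(10,), dtype='float64')

chunk_id = (0,)                    # first chunk in the (1d) chunk grid
var_len_data = np.random.rand(7)   # 7 elements, although the chunk shape is (10,)
ds.write_chunk(chunk_id, var_len_data, varlen=True)

data = ds.read_chunk(chunk_id)     # returns the 7 elements written above
```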
Fix Conda build
Custom json, visititems, tif converter
This release includes:
- support for custom json serializers via `z5py.set_json_encoder` / `z5py.set_json_decoder`
- converters for tif images using `imageio`: `z5py.converter.convert_from_tif`
- `visititems` functionality in `z5py.Group` (see the sketch below); thanks to @paulhfu for this contribution.
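A minimal sketch of `visititems`, assuming it mirrors the `h5py` method of the same name, where the callback receives each item's relative path and object; `data.n5` is a placeholder:

```python
import z5py

f = z5py.File('data.n5')

def visitor(name, obj):
    # 'name' is the path relative to the group, 'obj' the group or dataset
    print(name, obj)

f.visititems(visitor)
```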
First Major Release
First major release:
- The N5 specification on filesystem is supported, except for var-size chunks.
- The Zarr specification on filesystem is supported, except for F-order and variable endianness.
- The Python API is mostly compatible with `h5py` and should be stable.
- Conda-forge packages are available for the relevant systems and Python versions.
Update build
0.5.5 Update build
Fix build issues
0.5.4 Update dataToFormat implementation