Open Zarr File on M1 #548

Open
jburel opened this issue Oct 5, 2022 · 10 comments
@jburel

jburel commented Oct 5, 2022

  • Create a conda environment with the required dependencies
  • Run the example below
from zarr.storage import FSStore
from itkwidgets import view
import zarr

# Open the remote Zarr store on S3 and view it
fsstore = FSStore('https://dandiarchive.s3.amazonaws.com/zarr/7723d02f-1f71-4553-a7b0-47bda1ae8b42')
brainstem = zarr.open_group(fsstore, mode='r')

view(brainstem)
  • The following message pops up:
    The git command requires the command line developer tools.
    The tools are not installed.
@thewtex
Member

thewtex commented Oct 10, 2022

Hi @jburel, thanks for the report!

Currently, itkwidgets 1.X pre-releases provide M1 support. The 1.X refactor is architected on top of NGFF-Zarr.

The example notebook needed a slight update in #553 to install fsspec[http] with zsh.

The 1.X pre-releases are currently only available on PyPI. However, please try this environment.yml, which worked on my M1:

name: itkwidgets-zarr-demo
channels:
  - conda-forge
dependencies:
  - python
  - jupyterlab
  - pip
  - fsspec[http]
  - pip:
    - itkwidgets[lab]>=1.0a15
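
To try it, the standard conda workflow applies: run conda env create -f environment.yml, then conda activate itkwidgets-zarr-demo, and launch jupyter lab.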

@will-moore

Hi @thewtex - thanks for that.
I tried with that environment.yml and the image at https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.4/idr0062A/6001240.zarr, using your demo at ome/ngff#139 (comment) as an example, but nothing showed up.
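
The attempt was roughly along these lines (a sketch adapting the snippet from the original report to the EBI URL; the exact code was not posted in this thread):

from zarr.storage import FSStore
from itkwidgets import view
import zarr

# Open the IDR OME-NGFF image over HTTP (URL from the comment above)
fsstore = FSStore('https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.4/idr0062A/6001240.zarr')
image = zarr.open_group(fsstore, mode='r')
view(image)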

[Screenshot attached: 2022-10-13 at 13:46]

Maybe OME-NGFF images aren't yet supported directly by itkwidgets?

@thewtex
Member

thewtex commented Oct 13, 2022

Hi @will-moore , thanks for testing!

There is a tweak required in your case: change itkwidgets[lab] to itkwidgets[notebook] (the [lab] extra targets JupyterLab; [notebook] targets Jupyter Notebook).
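That is, pip install "itkwidgets[notebook]>=1.0a15", or swap the extra in the pip section of the environment.yml above.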

Here is the result (note: the data does take a while to load, from the USA at least, and we are working on better in-viewer feedback while data is loading).

[Screen recording attached: Untitled_.Oct.13.2022.10_08.AM.webm]

> Maybe OME-NGFF images aren't yet supported directly by itkwidgets?

itkwidgets is built around OME-NGFF and will load it directly! 🎇 If the data is not OME-NGFF, an OME-NGFF representation is generated on the fly. Details:

# NGFF Zarr
# Already an OME-NGFF multiscale group: pass its store through as-is.
if isinstance(image, zarr.Group) and 'multiscales' in image.attrs:
    return image.store

# Otherwise, build a multiscale OME-NGFF store from the input on the fly.
min_length = 64
if label:
    method = Methods.DASK_IMAGE_NEAREST
else:
    method = Methods.DASK_IMAGE_GAUSSIAN
store, chunk_store = _make_multiscale_store()
if HAVE_MULTISCALE_SPATIAL_IMAGE:
    from multiscale_spatial_image import MultiscaleSpatialImage
    if isinstance(image, MultiscaleSpatialImage):
        image.to_zarr(store, compute=True)
        return store
if isinstance(image, itkwasm.Image):
    ngff_image = itk_image_to_ngff_image(image)
    multiscales = to_multiscales(ngff_image, method=method)
    to_ngff_zarr(store, multiscales, chunk_store=chunk_store)
    return store
if HAVE_ITK:
    import itk
    if isinstance(image, itk.Image):
        ngff_image = itk_image_to_ngff_image(image)
        multiscales = to_multiscales(ngff_image, method=method)
        to_ngff_zarr(store, multiscales, chunk_store=chunk_store)
        return store
if HAVE_VTK:
    import vtk
    if isinstance(image, vtk.vtkImageData):
        ngff_image = vtk_image_to_ngff_image(image)
        multiscales = to_multiscales(ngff_image, method=method)
        to_ngff_zarr(store, multiscales, chunk_store=chunk_store)
        return store
if isinstance(image, dask.array.core.Array):
    ngff_image = to_ngff_image(image)
    multiscales = to_multiscales(ngff_image, method=method)
    to_ngff_zarr(store, multiscales, chunk_store=chunk_store)
    return store
if isinstance(image, zarr.Array):
    ngff_image = to_ngff_image(image)
    multiscales = to_multiscales(ngff_image, method=method)
    to_ngff_zarr(store, multiscales, chunk_store=chunk_store)
    return store
if HAVE_TORCH:
    import torch
    if isinstance(image, torch.Tensor):
        ngff_image = to_ngff_image(image.numpy())
        multiscales = to_multiscales(ngff_image, method=method)
        to_ngff_zarr(store, multiscales, chunk_store=chunk_store)
        return store
# Todo: preserve dask Array, if present, check if dims are NGFF -> use dims, coords
# Check if coords are uniform, if not, resample
if HAVE_XARRAY:
    import xarray as xr
    if isinstance(image, xr.DataArray):
        # if HAVE_MULTISCALE_SPATIAL_IMAGE:
        #     from spatial_image import is_spatial_image
        #     if is_spatial_image(image):
        #         from multiscale_spatial_image import to_multiscale
        #         scale_factors = _spatial_image_scale_factors(image, min_length)
        #         multiscale = to_multiscale(image, scale_factors, method=method)
        #         return _make_multiscale_store(multiscale)
        return xarray_data_array_to_numpy(image)
    if isinstance(image, xr.Dataset):
        # da = image[next(iter(image.variables.keys()))]
        # if is_spatial_image(da):
        #     scale_factors = _spatial_image_scale_factors(da, min_length)
        #     multiscale = to_multiscale(da, scale_factors, method=method)
        #     return _make_multiscale_store(multiscale)
        return xarray_data_set_to_numpy(image)
if isinstance(image, np.ndarray):
    ngff_image = to_ngff_image(image)
    multiscales = to_multiscales(ngff_image, method=method)
    to_ngff_zarr(store, multiscales, chunk_store=chunk_store)
    return store
raise RuntimeError("Could not process the viewer image")
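
For example, a plain NumPy array with no OME-NGFF metadata falls through to the np.ndarray branch above and gets a multiscale OME-NGFF store generated for it. A minimal sketch (the array here is made up for illustration):

import numpy as np
from itkwidgets import view

# Hypothetical volume; view() converts non-NGFF inputs on the fly
volume = np.random.rand(64, 64, 64).astype(np.float32)
view(volume)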

@will-moore

Thanks - that works great!

I also tried a v0.3 OME-NGFF image as part of filling out the growing table at https://github.com/will-moore/ngff/blob/ngff_tools_table/ngff-tools.md

https://hms-dbmi.github.io/vizarr/?source=https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.3/idr0079A/9836998.zarr

I got some strange behaviour (I don't know whether it is v0.3-specific or not)...

To start with, I got a low-resolution view of the whole image, but as I was viewing it (just rotating and panning; no other settings changed) the higher-resolution levels loaded but showed only a cropped part of the image:

[Four screenshots attached: 2022-10-13, 16:09 to 16:12, showing progressively cropped higher-resolution views]

Just another thought... I wonder if it would be possible to configure a page, like vizarr above, so that I can open an image directly in itkwidgets without needing to run a notebook, etc.? It would make testing easier, and we could then link from all our samples at https://idr.github.io/ome-ngff-samples/ directly to a view in itkwidgets!

@PaulHax
Collaborator

PaulHax commented Oct 13, 2022

Hi @will-moore

I'm helping with this project. You're right: it's not a great experience for datasets where the viewer does not find scale metadata. Higher-resolution levels grow in size because each voxel is assumed to be 1 unit, and the camera does not reset to encompass the new resolution. It then takes a while to build the portion of the image in the camera's view. We'd appreciate any advice you have.
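
If the source data lacks scale metadata, one workaround is to attach spacing explicitly before viewing. A sketch using the ngff-zarr helpers that appear in the snippet above (the dims/scale keywords and the spacing values here are assumptions for illustration, not taken from this thread):

import numpy as np
import zarr
from ngff_zarr import to_ngff_image, to_multiscales, to_ngff_zarr
from itkwidgets import view

# Hypothetical volume with anisotropic spacing (e.g. microns per voxel)
volume = np.zeros((128, 256, 256), dtype=np.uint16)
ngff_image = to_ngff_image(volume, dims=["z", "y", "x"],
                           scale={"z": 2.0, "y": 0.5, "x": 0.5})
multiscales = to_multiscales(ngff_image)
store = zarr.storage.MemoryStore()
to_ngff_zarr(store, multiscales)
view(zarr.open_group(store, mode="r"))  # group now carries 'multiscales' metadata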

I like your link idea. Here it is:
https://kitware.github.io/itk-vtk-viewer/app/?rotate=false&fileToLoad=https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.4/idr0062A/6001240.zarr

@will-moore

Thanks @PaulHax, that's great. I opened a PR to add itk-vtk-viewer links to all the v0.4 images in https://idr.github.io/ome-ngff-samples/

@jburel
Author

jburel commented Oct 20, 2022

Sorry for not responding earlier; I was on annual leave.
I will adjust some of my notebooks. cc @will-moore

@jburel
Author

jburel commented Nov 1, 2022

@PaulHax @thewtex we are holding our community meeting over the next few weeks [1].
We have sessions on the 10th of November going over the viewers that currently open Zarr files. We encourage people developing viewers to present their tools (~10 mins) and discuss what's needed from the specification point of view.
Would you be interested in joining us? The meeting is free.

[1] https://www.openmicroscopy.org/events/ome-community-meeting-2022/

@thewtex
Member

thewtex commented Nov 1, 2022

@jburel that's fantastic! We are excited to participate!

@jburel
Author

jburel commented Nov 2, 2022

Great. Registration closes on Friday 04/11. We will send the Zoom link and info to the participants.
