Concatenate two nii.gz and correlate with shape.gii? #94

Closed
1 task done
rtang2100 opened this issue Feb 28, 2023 · 4 comments

@rtang2100

Description of issue

This is probably a very simple question but I can't seem to find the appropriate commands.
I have two nii.gz files, both in fsaverage space (left hemi and right hemi). I also have a shape.gii file created from an array that's in the same fsaverage space.

I used the following commands:

from neuromaps.stats import compare_images

LH = "<left-hemisphere file path>"
RH = "<right-hemisphere file path>"
gaba = (LH, RH)

corr = compare_images(gaba, test, metric='pearsonr')

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/rtang/.local/lib/python3.8/site-packages/neuromaps/stats.py", line 68, in compare_images
    mask = np.logical_or(np.isclose(srcdata, 0), np.isclose(trgdata, 0))
ValueError: operands could not be broadcast together with shapes (163842,2) (327684,)

Based on the error, my concatenation probably failed. I was wondering if there's a way to fix this!

Thanks!
Catherine

Code of Conduct

  • I agree to follow the neuromaps Code of Conduct
@VinceBaz
Member

VinceBaz commented Mar 1, 2023

Hi Catherine,

It does seem like the concatenation of gaba failed. Can you tell me a little bit more about your LH and RH images? If they are nii.gz files (i.e. nifti images), then this might be your problem: images in fsaverage space should be saved as .gii images.

Best,
Vincent

@rtang2100
Author

Hi Vincent,

Thanks for getting back! Yes, the LH and RH images are nii.gz images in fsaverage space, one for each hemisphere. Do you happen to know of any functions, within or outside of neuromaps, that could convert nii.gz to gii? I grabbed these surface-based atlases from https://xtra.nru.dk/FS5ht-atlas/, since I want to avoid transforming PET images from volumetric space to surface space given the PVE (based on a prior thread - #81). I assume that if these nii.gz files are already in fsaverage surface space, they should work with neuromaps?

Thanks!
Catherine

@VinceBaz
Member

VinceBaz commented Mar 3, 2023

When your source/target maps are .nii.gz files, stats.compare_images assumes that the data are volumetric, and therefore treats your two hemisphere files as two separate bilateral (whole-brain) images rather than as the two hemispheres of a single surface map. This is why the function does not work: it loads the data as a (163842, 2) array rather than as a (327684,) array.
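
To see why the shapes clash, here is a minimal numpy sketch (just an illustration of the broadcast failure, not the actual loading code in neuromaps):

import numpy as np

# the pair of nii.gz hemisphere files ends up read as two columns of
# 163,842 vertices each...
srcdata = np.zeros((163842, 2))
# ...while the shape.gii target is read as a single concatenated vector
# of 2 x 163,842 = 327,684 vertices
trgdata = np.zeros(327684)

# this is the line that fails inside compare_images: shapes (163842, 2)
# and (327684,) cannot be broadcast together
np.logical_or(np.isclose(srcdata, 0), np.isclose(trgdata, 0))  # raises ValueError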

By converting the nii.gz images to .gii images, you can avoid this ambiguity. Since it is quite uncommon for surface data to be stored in nii.gz files, I am not aware of a function that would do the conversion automatically for you. What you can do, however, is load each NIfTI image and retrieve its data with nibabel, then generate a .shape.gii image with the neuromaps function images.construct_shape_gii. It would basically look like this:

import nibabel as nib
from neuromaps import images

# load each hemisphere's NIfTI file, flatten the vertex values to 1-D,
# and wrap them as .shape.gii images
LH_data = nib.load(LH).get_fdata().flatten()
LH_gii = images.construct_shape_gii(LH_data)

RH_data = nib.load(RH).get_fdata().flatten()
RH_gii = images.construct_shape_gii(RH_data)
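
From there, you should be able to run the comparison by passing the two GIFTI images together as the source map. This is a rough sketch: it assumes compare_images accepts a (left, right) tuple of in-memory GiftiImage objects (the same form as the tuples returned by fetch_annotation), and that test is your existing shape.gii target:

from neuromaps.stats import compare_images

# assumption: compare_images accepts a (left, right) tuple of GiftiImage
# objects; test is your target shape.gii covering both hemispheres
gaba = (LH_gii, RH_gii)
corr = compare_images(gaba, test, metric='pearsonr')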

For this specific case, however, it is worth noting that the receptor maps in fsaverage space that you downloaded from https://xtra.nru.dk/FS5ht-atlas/ are already available, as GIFTI images, in neuromaps, so you could simply fetch your maps from there. You can look at the annotation_info sheet, available in the repo's wiki, to see which tracer is associated with which receptor. For instance, to get the 5HT1a density map in fsaverage space, you could do:

from neuromaps.datasets import fetch_annotation

LH_gii, RH_gii = fetch_annotation(source='beliveau2017', desc='cumi101', space='fsaverage', den='164k')
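
As a quick sanity check (a sketch assuming the images.load_data helper behaves as in the current release), you can confirm that the fetched pair now loads as a single concatenated vector rather than the (163842, 2) array from your traceback:

from neuromaps import images

# load the (left, right) pair as one array; for fsaverage 164k this
# should be 2 x 163,842 = 327,684 vertices
data = images.load_data((LH_gii, RH_gii))
print(data.shape)  # expected: (327684,)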

It's important to note that for these maps, the left- and right-hemisphere files were originally inverted (the first one was the right hemisphere and the second one was the left hemisphere). We fixed this bug in a recent pull request, so you'll want to use the current version of neuromaps for this, which you'll have to install directly from GitHub.

Let me know if you have any questions!

Best,
Vincent

@rtang2100
Author

Hi Vincent,

Thanks a lot for all this great info! I will re-download neuromaps for this.

Best,
Catherine
