
Collecting/loading/processing batch from data with different resolution #8204

Open

Masaaki-75 opened this issue Nov 13, 2024 · 3 comments

@Masaaki-75

Is your feature request related to a problem? Please describe.

I have collected a set of CT volumes, each stored in .nii.gz format with a different resolution (in depth, height, and width). Because of the differing resolutions, when I tried to load them with PyTorch's default DataLoader I had to set the batch size to 1, which is inefficient and cannot fully utilize the GPU memory.

Describe the solution you'd like

For efficient network training, I am looking for a Dataset/DataLoader class that can load several volumes (with different resolutions) as a batch and (randomly) crop them into same-sized sub-volumes to feed to the network.

Are there any designs in MONAI that can address this?

@KumoLiu
Contributor

KumoLiu commented Nov 14, 2024

Hi @Masaaki-75, thanks for your interest here.
To load and preprocess data of any size, you can use LoadImage. After loading and before batching, you can apply ResizeWithPadOrCrop to make sure all images have the same dimensions.
Alternatively, you can pass pad_list_data_collate to DataLoader to automatically pad each item to match the shape of the largest tensor in each dimension. Here's an example of how to use it:
https://github.com/Project-MONAI/tutorials/blob/2db8c620ae7e9e8f28cb2c7abb9e6cc75e3bc5c1/modules/interpretability/cats_and_dogs.ipynb#L98

Hope it helps, thanks.

@Masaaki-75
Author

I see. I just tried it in my training pipeline and it worked! Better yet, it seems to be compatible with other popular PyTorch libraries like accelerate. Thank you!

@Masaaki-75
Author

Hi @KumoLiu, I am considering a more advanced feature for data loading and processing: resizing the batch (by cropping, padding, interpolation, ...) to a different resolution in each iteration.

For example, given two preset resolutions such as [128, 256, 256] and [256, 512, 512], each batch would be randomly resized to one of them. The main idea is data augmentation and forcing the model to adapt to inputs of varying resolution.

Are there any similar cases in MONAI tutorials or existing solutions to this problem? Thanks!
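The thread does not point to a built-in MONAI transform for this, but one hedged sketch of the idea (hypothetical function name, plain PyTorch) is a custom collate_fn that draws one target size per batch and interpolates every volume to it:

```python
import random
import torch
import torch.nn.functional as F

# Preset resolutions (D, H, W) from the question above.
PRESET_SIZES = [(128, 256, 256), (256, 512, 512)]

def random_resolution_collate(samples, preset_sizes=PRESET_SIZES):
    """Pick one preset size for the whole batch, resize each (C, D, H, W)
    volume to it with trilinear interpolation, then stack into a batch."""
    size = random.choice(preset_sizes)
    resized = [
        F.interpolate(v.unsqueeze(0), size=size, mode="trilinear",
                      align_corners=False).squeeze(0)
        for v in samples
    ]
    return torch.stack(resized)

# Demo with tiny volumes and a single tiny preset so it runs anywhere:
vols = [torch.zeros(1, 8, 8, 8), torch.zeros(1, 6, 10, 10)]
batch = random_resolution_collate(vols, preset_sizes=[(4, 4, 4)])
print(batch.shape)  # torch.Size([2, 1, 4, 4, 4])
```

In training this would be wired in as `DataLoader(dataset, batch_size=4, collate_fn=random_resolution_collate)`; a cropping-based variant could instead apply a random spatial crop per item before stacking.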

Masaaki-75 reopened this Nov 15, 2024