
ScanningWindowInferer accumulates gradient data and inference fails with out of memory #1420

Closed
aihsani opened this issue Jan 8, 2021 · 1 comment · Fixed by #1425
Labels
enhancement New feature or request

Comments


aihsani commented Jan 8, 2021

Bug Description
The documentation does not explain that the MONAI inferers accumulate backpropagation information with each additional batch. The name "inferer" inherently implies that no gradient information should be stored.

Finding the root cause took a few hours of detailed debugging, which makes for a poor user experience.

To Reproduce
Write a simple inference program using ScanningWindowInferer that performs inference on the GPU; the accumulated gradient data eventually causes an out-of-memory failure. On CPU, the gradient bookkeeping instead drastically slows down inference.
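A minimal sketch of the failure mode (not the actual MONAI code; a stand-in `nn.Conv2d` plays the role of the segmentation network): running the model over successive patches without disabling autograd leaves every output attached to a computation graph, so memory grows with each patch.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a segmentation network.
model = nn.Conv2d(1, 1, kernel_size=3, padding=1)

# Simulate sliding-window inference: run the model over several
# patches and collect the outputs, as an inferer does internally.
patches = [torch.rand(1, 1, 64, 64) for _ in range(4)]
outputs = [model(p) for p in patches]

# Without torch.no_grad(), every output carries an autograd graph
# (requires_grad is True, grad_fn is set), so memory accumulates
# with each additional patch/batch.
print(all(o.requires_grad for o in outputs))  # True
```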

Expected behavior
The documentation should state that the inferers must be wrapped in torch.no_grad(), or the current inferer classes should be renamed (e.g. ScanningWindowForwardPropagator) and wrapped by a ScanningWindowInferer that internally applies torch.no_grad().
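The suggested fix can be sketched as follows (again with a hypothetical stand-in model, not the MONAI API): wrapping the per-patch forward passes in `torch.no_grad()` disables graph construction, so no gradient state accumulates across patches.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a segmentation network.
model = nn.Conv2d(1, 1, kernel_size=3, padding=1)
patches = [torch.rand(1, 1, 64, 64) for _ in range(4)]

# Inside torch.no_grad(), autograd does not record operations,
# so the outputs carry no computation graph and memory stays flat.
with torch.no_grad():
    outputs = [model(p) for p in patches]

print(any(o.requires_grad for o in outputs))  # False
```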

@wyli wyli added the enhancement New feature or request label Jan 8, 2021

Nic-Ma commented Jan 10, 2021

Hi @aihsani ,

Thanks for your feedback; I will enhance the documentation to make this clearer.
