
Fix for OOM during inference #37

Merged

jonasteuwen merged 1 commit into main from fix-oom-error on Dec 8, 2023

Conversation

@AjeyPaiK (Member) commented on Dec 8, 2023

By default, PyTorch Lightning's predict loop accumulates the outputs of every batch over the epoch, which is why we observed the OOM error during inference. With the `return_predictions` flag disabled, the OOM errors no longer occur. For a detailed discussion, refer to the comments on #19.
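For reference, a minimal sketch of the usage change (not this PR's actual diff; `TinyPredictor` and the random dataloader are illustrative stand-ins):

```python
import torch
import pytorch_lightning as pl


class TinyPredictor(pl.LightningModule):
    # Hypothetical toy module, just to make the sketch self-contained.
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 2)

    def predict_step(self, batch, batch_idx):
        preds = self.layer(batch)
        # Persist predictions incrementally here (e.g. write to disk)
        # instead of relying on the Trainer to collect and return them.
        return preds


loader = torch.utils.data.DataLoader(torch.randn(1024, 8), batch_size=32)
trainer = pl.Trainer(accelerator="auto", devices=1, logger=False)

# The fix: with return_predictions=False, the Trainer no longer keeps
# every batch's output in memory for the whole predict loop, so memory
# use stays bounded regardless of dataset size.
trainer.predict(TinyPredictor(), dataloaders=loader, return_predictions=False)
```

Since the Trainer then returns nothing, any predictions needed downstream have to be written out inside `predict_step` or via a `pl.callbacks.BasePredictionWriter` callback.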

@jonasteuwen changed the title from "Fixes {#19}" to "Fix for OOM during inference" on Dec 8, 2023
@jonasteuwen self-requested a review on December 8, 2023 at 16:43
@jonasteuwen (Contributor) left a comment:

LGTM


@jonasteuwen merged commit b6315f5 into main on Dec 8, 2023
2 checks passed
@jonasteuwen deleted the fix-oom-error branch on December 8, 2023 at 16:45