Hello,

I am using TotalSegmentator for multiple inferences and noticed that the weights are reloaded for each inference, which significantly increases runtime and memory usage.
Would there be an option to keep the weights in memory between inferences to optimize performance?
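For illustration, here is a rough sketch of the kind of reuse I have in mind. It is plain PyTorch with a hypothetical `MODEL_PATH` checkpoint and `get_model` helper, not TotalSegmentator's actual API:

```python
# Rough sketch of keeping the weights in memory between inferences.
# `MODEL_PATH` and the full-model checkpoint are hypothetical placeholders,
# not part of TotalSegmentator's real interface.
from functools import lru_cache

import torch

MODEL_PATH = "weights.pt"  # hypothetical path to a saved model


@lru_cache(maxsize=1)
def get_model(device: str = "cpu") -> torch.nn.Module:
    """Load the network once; later calls reuse the cached instance."""
    model = torch.load(MODEL_PATH, map_location=device)
    model.eval()
    return model


def run_inference(volume: torch.Tensor, device: str = "cpu") -> torch.Tensor:
    """Run a forward pass without reloading the weights each time."""
    model = get_model(device)
    with torch.no_grad():
        return model(volume.to(device))
```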
Thank you for your attention to this request.
Best regards
Thanks for the update. I understand the challenge with refactoring.

On another note, regarding dropout: would it be easy to introduce it into the project to obtain a distribution of segmentations? If so, I'd be happy to work on it and propose a PR.
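For reference, what I have in mind is standard Monte Carlo dropout: keep the dropout layers active at inference time and run several stochastic forward passes. A rough sketch in generic PyTorch (not TotalSegmentator's or nnU-Net's actual code):

```python
# Rough sketch of Monte Carlo dropout at inference time (generic PyTorch,
# not TotalSegmentator's actual implementation).
import torch


def enable_mc_dropout(model: torch.nn.Module) -> None:
    """Put the model in eval mode but keep dropout layers stochastic."""
    model.eval()
    for module in model.modules():
        if isinstance(module, (torch.nn.Dropout, torch.nn.Dropout2d, torch.nn.Dropout3d)):
            module.train()


def mc_dropout_predict(model: torch.nn.Module, volume: torch.Tensor,
                       n_samples: int = 20) -> torch.Tensor:
    """Stack n_samples stochastic softmax predictions along dim 0.

    Their mean gives the prediction and their variance a per-voxel
    uncertainty estimate.
    """
    enable_mc_dropout(model)
    samples = []
    with torch.no_grad():
        for _ in range(n_samples):
            samples.append(torch.softmax(model(volume), dim=1))
    return torch.stack(samples)
```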