Hi!
If I understand correctly, if the dataset contains a lot of time series and inference runs on the GPU, the model can fail with an out-of-memory (OOM) error. Maybe it would be better to pass the time series to the model in batches.
-
Replies: 1 comment
-
@egoriyaa of course, you can chunk your inference dataset into batches and pass them to the model one batch at a time. We don't do that in the package, since it's something users can easily do around it. AutoGluon (see tutorial) does this out of the box, if you're interested: you pass in a dataset, and it will take care of loading it in batches; see these lines.
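A minimal sketch of the chunking approach, assuming a generic `model.predict(batch)` single-batch inference call that returns a tensor of forecasts (the method name and signature are placeholders; substitute the actual prediction API of the package you're using):

```python
import torch

def predict_in_batches(model, series, batch_size=32):
    """Run inference over a large collection of time series in fixed-size
    chunks, so that only one batch occupies GPU memory at a time.

    `model.predict` is a hypothetical single-batch call; replace it with
    the package's actual inference method.
    """
    forecasts = []
    for start in range(0, len(series), batch_size):
        batch = series[start:start + batch_size]
        with torch.no_grad():               # inference only: skip gradient buffers
            forecasts.append(model.predict(batch))
    return torch.cat(forecasts, dim=0)      # reassemble in the original order
```

Shrinking `batch_size` trades throughput for peak GPU memory, so it can be tuned down until the OOM error disappears.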