Feature / Bug (?)
I have an abstract loader that chains multiple PyTorch loaders when training on video sequences. Each small loader contains one sequence, and the chained loader simply uses `itertools.chain` to chain them together.
I cannot put all data in a single loader because it does not make sense to read images from two different sequences.
I cannot use an IterableDataset because that would still have the same problem.
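For context, a minimal sketch of what such a chained loader looks like (the class name and structure here are illustrative, not my exact implementation; the elements would normally be torch `DataLoader`s, one per sequence):

```python
import itertools


class ChainedLoader:
    """Chain several per-sequence loaders (e.g. torch DataLoaders) so
    that no batch ever mixes frames from two different sequences."""

    def __init__(self, loaders):
        self.loaders = list(loaders)

    def __iter__(self):
        # Exhaust each sequence's loader in turn, in order.
        return itertools.chain(*self.loaders)

    def __len__(self):
        # Total number of batches across all sequences.
        return sum(len(loader) for loader in self.loaders)
```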
https://github.com/PyTorchLightning/pytorch-lightning/blob/06242c200a318a37d1f882c786e60354ec04533f/pytorch_lightning/trainer/data_loading.py#L58
This assumes that a loader must have a `dataset` field, which I feel is too restrictive. I suggest checking whether the loader actually has a `dataset` member before performing this check. My workaround, for now, is to declare a `dataset` member in my ChainedLoader and set it to None.
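The workaround amounts to nothing more than a dummy class attribute (a sketch; the rest of the class is elided):

```python
class ChainedLoader:
    """Chained loader with a dummy `dataset` attribute as a workaround."""

    # Lightning dereferences `loader.dataset`; declaring it as None
    # keeps that attribute access from raising AttributeError.
    dataset = None

    def __init__(self, loaders):
        self.loaders = list(loaders)
```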
Fix
Since this kind of abstract loader has no explicit handle to a dataset, it should fall under the concept of an IterableDataset, where the user specifies a number of batches rather than a percentage. So a simple fix would be to add a `hasattr` check on that same line.