I just started playing with the PyTorch Lightning API to decide whether to switch my own speech processing project over to it.
I have an infinite `DataLoader` that wraps an `IterableDataset`, which extracts random audio chunks from a large speech corpus, forever. As such, it has no `__len__` and never raises a `StopIteration` exception.
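For context, here is a minimal sketch of the kind of dataset I mean (names, chunk length, and the corpus layout are illustrative, not my actual code):

```python
import random
import torch
from torch.utils.data import IterableDataset, DataLoader

class RandomChunkDataset(IterableDataset):
    """Yields random fixed-length chunks from a corpus, forever.

    Illustrative sketch: the real pipeline reads audio files, but a
    list of 1-D waveform tensors is enough to show the behavior.
    """
    def __init__(self, corpus, chunk_size=16000):
        self.corpus = corpus          # list of 1-D waveform tensors
        self.chunk_size = chunk_size

    def __iter__(self):
        while True:                   # infinite: never raises StopIteration
            wav = random.choice(self.corpus)
            start = random.randint(0, wav.shape[0] - self.chunk_size)
            yield wav[start:start + self.chunk_size]

# No __len__, so the DataLoader has no notion of an epoch boundary.
corpus = [torch.randn(40000) for _ in range(3)]
loader = DataLoader(RandomChunkDataset(corpus), batch_size=32)
```

Iterating this loader produces batches of shape `(32, 16000)` indefinitely.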
Consequently, calling `trainer.fit(...)` stays at epoch 1 forever while the batch count keeps increasing.
I am looking for a way to tell `trainer.fit(...)` that it should consider an epoch complete every N batches. Is there one?