
How does one define an "epoch" when the data loader is infinite? #2150

@hbredin

Description


I just started playing with the pytorch-lightning API to decide whether to switch my own speech processing project over to it.

I have an infinite `DataLoader` wrapping an `IterableDataset` that extracts random audio chunks from a large speech corpus, forever. As such, it has no `__len__` and never raises a `StopIteration` exception.
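For concreteness, here is a minimal sketch of the kind of dataset I mean (the actual chunk extraction is replaced by a random-tensor placeholder; `RandomChunks` is a hypothetical name):

```python
import torch
from torch.utils.data import DataLoader, IterableDataset


class RandomChunks(IterableDataset):
    """Hypothetical stand-in for the real corpus: yields "audio chunks" forever."""

    def __init__(self, num_samples: int = 16000):
        self.num_samples = num_samples

    def __iter__(self):
        while True:  # infinite stream, never raises StopIteration
            yield torch.randn(self.num_samples)  # placeholder for a real audio chunk


loader = DataLoader(RandomChunks(), batch_size=32)  # has no __len__
```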

Therefore, calling `trainer.fit(...)` stays at Epoch 1 forever while the batch number keeps increasing.

I am looking for a way to tell `trainer.fit(...)` that it should consider an epoch completed every N batches. Is there one?
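To make the desired behaviour concrete, here is a sketch of the kind of setup I mean, using the Trainer's `limit_train_batches` argument, which (as far as I can tell) accepts an int and caps each epoch at that many batches even when the loader never stops. `ChunkModel` and its dummy loss are hypothetical placeholders, and this assumes a pytorch-lightning version where `Trainer` accepts `limit_train_batches`:

```python
import pytorch_lightning as pl
import torch
from torch.utils.data import DataLoader, IterableDataset


class RandomChunks(IterableDataset):  # same infinite dataset as sketched above
    def __iter__(self):
        while True:
            yield torch.randn(16000)


class ChunkModel(pl.LightningModule):
    """Minimal hypothetical module; the loss is a dummy just to make fit() run."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(16000, 1)

    def training_step(self, batch, batch_idx):
        return self.layer(batch).pow(2).mean()  # dummy loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

    def train_dataloader(self):
        return DataLoader(RandomChunks(), batch_size=32)


# An int limit_train_batches would treat every 1000 batches as one epoch,
# even though the underlying iterator never terminates.
trainer = pl.Trainer(limit_train_batches=1000, max_epochs=5)
trainer.fit(ChunkModel())
```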
