Custom Batch Sampler #132
-
Hello, does that mean you don't sample each datapoint once per epoch? You can look into the …
-
Thank you for your response. For metric learning, sampling is done in sets (e.g. pairs or triplets), so the length of an epoch would be the number of all possible sets, which is absurdly large. For this reason it's typical to use an infinite-stream approach while training and to structure the code in terms of train steps instead of epochs. Another reason you might not want to sample each datapoint once per epoch is a very uneven class distribution. However, it seems like the concept of an epoch is tightly coupled with … Thanks again to you and the team for open-sourcing this great project!
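For concreteness, here is a minimal sketch of the infinite-stream idea in plain `torch.utils.data` terms (not FFCV-specific). The class name, `labels`, `p`, and `k` are placeholders for the usual "P classes × K samples" metric-learning batch parameters:

```python
import random
from collections import defaultdict


class InfinitePKBatchSampler:
    """Yields batches of P classes x K samples indefinitely;
    training is then measured in steps rather than epochs."""

    def __init__(self, labels, p=8, k=4):
        # Group dataset indices by class label.
        self.by_class = defaultdict(list)
        for idx, label in enumerate(labels):
            self.by_class[label].append(idx)
        self.classes = list(self.by_class)
        self.p, self.k = p, k  # requires p <= number of classes

    def __iter__(self):
        while True:  # infinite stream: no epoch boundary
            chosen = random.sample(self.classes, self.p)
            batch = []
            for c in chosen:
                # choices() samples with replacement, so classes with
                # fewer than k examples still work.
                batch.extend(random.choices(self.by_class[c], k=self.k))
            yield batch
```

Since `DataLoader` accepts any iterable as `batch_sampler`, a loop like `for step, batch in zip(range(num_steps), loader):` then runs for a fixed number of steps with no epoch in sight.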
-
My workflow involves creating batches that are sampled fairly with respect to some metadata values (x, y, z), such that each batch is guaranteed to sample evenly from x, then evenly from y, then evenly from z. I currently achieve this using the `batch_sampler` argument of `torch.utils.data.DataLoader`, and I'm wondering how I might replicate this using FFCV.
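For reference, a rough sketch of the kind of batch sampler described above, on the `torch.utils.data` side (the FFCV half of the question is still open). The names `metadata`, `batch_size`, and `num_batches` are placeholders; x is kept strictly even per batch via round-robin, while y and z are only uniform in expectation:

```python
import random
from collections import defaultdict


class FairBatchSampler:
    """Builds each batch by cycling over x values, then drawing a y within
    that x, a z within that (x, y), and finally a dataset index."""

    def __init__(self, metadata, batch_size, num_batches):
        # Nested index: x -> y -> z -> [dataset indices].
        self.tree = defaultdict(lambda: defaultdict(lambda: defaultdict(list)))
        for idx, (x, y, z) in enumerate(metadata):
            self.tree[x][y][z].append(idx)
        self.batch_size = batch_size
        self.num_batches = num_batches

    def __iter__(self):
        xs = list(self.tree)
        for _ in range(self.num_batches):
            batch = []
            for i in range(self.batch_size):
                ys = self.tree[xs[i % len(xs)]]        # even over x
                zs = ys[random.choice(list(ys))]       # uniform over y
                idxs = zs[random.choice(list(zs))]     # uniform over z
                batch.append(random.choice(idxs))
            yield batch

    def __len__(self):
        return self.num_batches
```

This plugs straight into `DataLoader(dataset, batch_sampler=FairBatchSampler(metadata, 32, 1000))`; how to express the same index-selection logic through FFCV's loader is the part I'm asking about.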