How to fix the order of data in iterator during training step? #828
Comments
@bencwallace mentioned this temporary fix: #522 (comment). In the end, we will switch to …

Thanks! I'm using … where …

Strange, I think the work-around should work for … By the way, I'm pretty sure you're right that seeding and then calling …

That's somewhat disconcerting. Please let me know if you find out the problem! I'm trying to maintain reproducibility in a project of my own.
❓ Questions and Help
Description
Currently, I'm running experiments with several datasets in torchtext, and I just found that I can't reproduce my experiments, even though I excluded all the sources of randomness I could as follows:
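The snippet itself was not preserved in this page. A typical set of seed-fixing calls for this kind of setup is along these lines (a sketch, assuming the standard Python `random` module plus PyTorch; the seed value is arbitrary):

```python
import random

SEED = 1234  # arbitrary example seed

# Seed Python's global RNG, which torchtext's RandomShuffler draws from.
random.seed(SEED)

# In a full PyTorch setup one would typically also call:
#   torch.manual_seed(SEED)
#   torch.cuda.manual_seed_all(SEED)
#   torch.backends.cudnn.deterministic = True

# Sanity check: the same seed reproduces the same shuffle order.
order_a = random.sample(range(10), 10)
random.seed(SEED)
order_b = random.sample(range(10), 10)
assert order_a == order_b
```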
I found that, when the `Iterator` class is initialized, a `RandomShuffler()` defined in `torchtext.data.utils` is set as `self.random_shuffler`, and this is used to shuffle the data in the training dataset. However, although one can set the random state of `RandomShuffler` by passing it as an argument, the line `self.random_shuffler = RandomShuffler()` doesn't let us set that random state manually. Am I right? Is there a way to fix the order of the data for the training step?
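The workaround referenced in the comments amounts to constructing the shuffler with an explicit random state instead of relying on the one captured at `Iterator` init time. Here is a minimal, self-contained sketch of the idea; the `RandomShuffler` below is a stand-in modeled on `torchtext.data.utils.RandomShuffler`, not the library class itself, and the torchtext line in the comment is an assumption about its API:

```python
import random
from contextlib import contextmanager

class RandomShuffler:
    """Stand-in modeled on torchtext.data.utils.RandomShuffler."""

    def __init__(self, random_state=None):
        # Capture either the caller-supplied state or the current global state.
        self.random_state = random_state if random_state is not None else random.getstate()

    @contextmanager
    def use_internal_state(self):
        # Temporarily swap the global RNG state for this shuffler's own state.
        old_state = random.getstate()
        random.setstate(self.random_state)
        try:
            yield
        finally:
            self.random_state = random.getstate()
            random.setstate(old_state)

    def __call__(self, data):
        with self.use_internal_state():
            return random.sample(data, len(data))

# Workaround: seed, capture the state, and hand it to the shuffler explicitly.
random.seed(0)
state = random.getstate()

run1 = RandomShuffler(state)(list(range(10)))
run2 = RandomShuffler(state)(list(range(10)))
assert run1 == run2  # identical shuffle order across runs

# With torchtext, the analogous move would be to overwrite the iterator's
# shuffler after construction, e.g. (hypothetical usage):
#   train_iter.random_shuffler = RandomShuffler(state)
```

Note that a single shuffler instance advances its internal state on each call, so reproducibility comes from re-seeding (or re-supplying the same captured state) at the start of each run, not from reusing one instance.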