Keep the settings of a user-created DataLoader when replacing its sampler with DistributedSampler #4650
Comments
Hi! Thanks for your contribution — great first issue!
I think this is the ideal way to handle this; otherwise we will make things more complicated.
But what concerns me more is the side effect of the default setting.
I totally agree with you. If it is the user's responsibility to handle this, I suggest adding a brief description about it.
🚀 Feature
Motivation
As mentioned in #2789, the default behavior of `replace_sampler_ddp` is to create a new `DistributedSampler`. The `shuffle` setting depends on the kind of dataloader (train vs. val/test). However, this behavior overrides the settings of a user-defined dataloader, such as `shuffle` or `drop_last`. A more reasonable solution is to read these settings directly from the user-created dataloader and apply the same settings to the `DistributedSampler`.

Pitch
For example, we can infer the `shuffle` setting from `dataloader.sampler`: if it is an instance of `SequentialSampler`, then `shuffle=False`.

Alternatives
Set `replace_sampler_ddp=False` and handle the sampler by hand.

Additional context
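A minimal sketch of the proposed behavior (this is not Lightning's actual implementation; the helper name and the explicit `num_replicas`/`rank` arguments are illustrative): infer `shuffle` from the existing sampler's type and rebuild the dataloader with a `DistributedSampler` that preserves the user's settings instead of resetting them.

```python
from torch.utils.data import DataLoader, RandomSampler
from torch.utils.data.distributed import DistributedSampler

def replace_with_distributed_sampler(dataloader, num_replicas, rank):
    """Rebuild `dataloader` with a DistributedSampler, keeping user settings.

    Hypothetical helper for illustration only.
    """
    # A RandomSampler means the user asked for shuffling; a SequentialSampler
    # (the default when shuffle=False) means they did not.
    shuffle = isinstance(dataloader.sampler, RandomSampler)
    sampler = DistributedSampler(
        dataloader.dataset, num_replicas=num_replicas, rank=rank, shuffle=shuffle
    )
    # Carry over the user's other settings (batch_size, drop_last, ...)
    # rather than silently replacing them with defaults.
    return DataLoader(
        dataloader.dataset,
        batch_size=dataloader.batch_size,
        sampler=sampler,
        num_workers=dataloader.num_workers,
        drop_last=dataloader.drop_last,
    )
```

With this approach, a user who built `DataLoader(ds, shuffle=False, drop_last=True)` would get a `DistributedSampler` with `shuffle=False` and a dataloader that still drops the last incomplete batch.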