
Fails when batch is an odd number #1

Closed
golunovas opened this issue Nov 4, 2020 · 3 comments

Comments

@golunovas

It seems like the issue is coming from here.

As far as I understand, it requires an even batch size; otherwise it fails here:

interpolated = alpha * real + (1 - alpha) * fake
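
For reference, a minimal sketch of how the failure shows up with an odd batch (the split into "real" and "fake" halves of one batch is an assumption about the internals, not the actual AutoAlbument code):

```python
import torch

# An odd batch cannot be split into two equal halves, so the element-wise
# interpolation below mixes tensors of different lengths.
batch = torch.randn(17, 3, 32, 32)                # odd batch of 17 images
real, fake = torch.chunk(batch, 2, dim=0)         # shapes: (9, 3, 32, 32) and (8, 3, 32, 32)

alpha = torch.rand(real.size(0), 1, 1, 1)         # one alpha per "real" sample
interpolated = alpha * real + (1 - alpha) * fake  # RuntimeError: size mismatch in dim 0 (9 vs 8)
# With an even batch the two halves have the same length and the line works.
```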


creafz commented Nov 4, 2020

Hey, @golunovas

Could you please provide a use case for an odd batch_size? It seems that the best way to handle this problem is to add an explicit check to AutoAlbument that ensures batch_size is an even number before running the search phase. Otherwise, I think some problems related to a different number of augmented and not-augmented images may arise.
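
A hypothetical shape of such a check (illustrative only, not AutoAlbument's actual API):

```python
def validate_batch_size(batch_size: int) -> None:
    # Each batch gets split into augmented and not-augmented halves,
    # so the batch size has to be divisible by two.
    if batch_size % 2 != 0:
        raise ValueError(f"batch_size must be an even number, got {batch_size}")
```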

@golunovas (Author)

Well, I ran into the issue on the last batch of an epoch: I ran a search with the generated search.yaml config, drop_last wasn't set to true for the dataloader, and my dataset had an odd number of samples. But an odd batch_size would lead to exactly the same issue. IMO, the easiest solution is to set drop_last to true by default and require an even batch_size to be set in the config.
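
To illustrate the proposal in plain PyTorch (the dataset here is just a dummy stand-in):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy dataset with an odd number of samples (33). With drop_last=True the
# trailing batch of 1 is discarded, so every batch keeps the full, even size.
dataset = TensorDataset(torch.randn(33, 3, 32, 32))
loader = DataLoader(dataset, batch_size=16, shuffle=True, drop_last=True)
print([images.shape[0] for (images,) in loader])  # [16, 16]
```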


creafz commented Nov 4, 2020

I have added the drop_last: True parameter to the config files created by autoalbument-create. Example configs now also contain this parameter. The fixed version, 0.0.4, has also been uploaded to PyPI.

creafz closed this as completed on Nov 4, 2020