No convergence in own dataset #73

Open
ghost opened this issue Jun 25, 2019 · 0 comments

Hello
I'm trying to train a WGAN to increase the number of images in my own dataset (3211 images). My images are 1242×375, but just for testing this network I am resizing them to 256×256.
To add my own dataset, I just added these lines to main.py:

```python
elif opt.dataset == 'own':
    normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                     std=[0.229, 0.224, 0.225])
    dataset = dset.ImageFolder(root=opt.dataroot,
                               transform=transforms.Compose([
                                   transformaciones.Transformaciones(),
                                   transforms.RandomHorizontalFlip(0.1),
                                   transforms.Resize((opt.imageSize, opt.imageSize)),
                                   transforms.ToTensor(),
                                   normalize,
                               ]))
```
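To make sure the loader itself isn't the problem, I also ran a quick sanity check on one batch (a rough sketch with a placeholder dataset path, and with my Transformaciones step left out):

```python
import torch
import torchvision.datasets as dset
import torchvision.transforms as transforms

# Same transform chain as in main.py, minus the custom Transformaciones step
normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225])
dataset = dset.ImageFolder(root='path/to/own/dataset',        # placeholder path
                           transform=transforms.Compose([
                               transforms.RandomHorizontalFlip(0.1),
                               transforms.Resize((256, 256)),  # imageSize = 256
                               transforms.ToTensor(),
                               normalize,
                           ]))
loader = torch.utils.data.DataLoader(dataset, batch_size=16, shuffle=True)

images, _ = next(iter(loader))
print(images.shape)                              # expect torch.Size([16, 3, 256, 256])
print(images.min().item(), images.max().item())  # roughly [-2.1, 2.6] after normalize
```

The shapes and value ranges look as expected, so the pipeline seems to feed the network correctly.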

Transformaciones is a data-augmentation module that I have built.
I launch the training with 6 extra layers and 800 epochs.
The problem is that the generator only produces noise. I tried sampling from the generator when the results were around lossG: 0.08, and the outputs were still just noise.
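For reference, this is roughly how I dump images from the generator to inspect them (just a sketch: the helper name is mine, and nz = 100 is an assumption that has to match the value used during training):

```python
import torch
import torchvision.utils as vutils

def dump_samples(netG, nz=100, n=64, out_path='samples.png'):
    # netG is the already-constructed generator with the trained weights loaded.
    netG.eval()
    with torch.no_grad():
        noise = torch.randn(n, nz, 1, 1)   # latent vectors; nz must match training
        fake = netG(noise)
    vutils.save_image(fake, out_path, normalize=True)
```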
Am I doing something wrong? Is it because I only have a few images to train on?
Has anyone used their own dataset and managed to make it work?

Thanks!
