checkpoint question #103
Comments
Only the train_step-X.model checkpoints contain both the generator and discriminator weights. The other checkpoints contain only the EMA weights of the generator.
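For reference, a minimal sketch of telling the two checkpoint types apart once a file has been loaded. The `describe_checkpoint` helper and any key names other than `generator`/`discriminator` (which appear in this thread) are assumptions for illustration, not the repository's actual API:

```python
def describe_checkpoint(ckpt):
    # Hypothetical helper. `ckpt` is a checkpoint dict already loaded with
    # torch.load(path, map_location="cpu").
    # Full train_step-X.model files are assumed to be dicts of sub-state-dicts;
    # numbered snapshots like 010000.model are assumed to be a bare EMA
    # state dict (parameter-name -> tensor).
    keys = sorted(ckpt.keys())
    has_full_state = "generator" in ckpt and "discriminator" in ckpt
    return keys, has_full_state
```

Inspecting a checkpoint this way before calling `load_state_dict` avoids the `KeyError` discussed later in this thread.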
Thank you, and thank you for sharing this code.
@rosinality I don't understand the purpose of saving both the 010000.model checkpoints and train_step-X.model. When I resumed training after a break, the model started training from the beginning (from 8x8 resolution) instead of from the saved state dict. Thank you very much for your code.
@quyet0nguyen You can use
@rosinality Thanks for your answer.
Dear all,

Can someone tell me how I can load a checkpoint into train.py? I was trying this:

```
python train.py --mixing lmdb --ckpt checkpoint/010000.model
```

Result:

```
File "train.py", line 321, in <module>
    generator.module.load_state_dict(ckpt['generator'])
KeyError: 'generator'
```

If I try without the .model file I get:

```
PermissionError: [Errno 13] Permission denied: 'checkpoint'
```

Thank you
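The `KeyError` above is consistent with the first comment in this thread: 010000.model holds only the EMA generator weights, so `ckpt['generator']` does not exist in it. Below is a hedged sketch of a loader that copes with both file types; the `load_checkpoint` function, the `g_running` key, and the Module-like arguments are assumptions for illustration, not the repository's actual code:

```python
def load_checkpoint(ckpt, generator, g_running, discriminator=None):
    # `ckpt` is the object returned by torch.load(path, map_location="cpu").
    if "generator" in ckpt:
        # Assumed full train_step-X.model checkpoint: restore each part.
        generator.load_state_dict(ckpt["generator"])
        if discriminator is not None and "discriminator" in ckpt:
            discriminator.load_state_dict(ckpt["discriminator"])
        if "g_running" in ckpt:  # assumed key name for the EMA copy
            g_running.load_state_dict(ckpt["g_running"])
    else:
        # Assumed EMA-only snapshot (e.g. 010000.model): a bare state dict,
        # usable for inference but not for resuming training.
        g_running.load_state_dict(ckpt)
```

To resume training, passing a train_step-X.model file to `--ckpt` (rather than a numbered snapshot) should match what train.py expects. The `PermissionError` likely comes from pointing `--ckpt` at the checkpoint directory itself instead of a file inside it.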