
checkpoint question #103

Open
RihardsVitols opened this issue Sep 11, 2020 · 5 comments

@RihardsVitols

Dear all,

Can someone tell me how to load a checkpoint into train.py?

I was trying this:
python train.py --mixing lmdb --ckpt checkpoint/010000.model

Result:
File "train.py", line 321, in <module>
generator.module.load_state_dict(ckpt['generator'])
KeyError: 'generator'

If I try without the .model file, I get:
PermissionError: [Errno 13] Permission denied: 'checkpoint'

Thank you.

@rosinality (Owner)

Only the train_step-X.model checkpoints contain both the generator and discriminator weights. The other checkpoints contain only the EMA (exponential moving average) weights of the generator.
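A minimal sketch of the distinction described above, using plain dicts to stand in for the PyTorch state dicts (the exact key names beyond `generator`/`discriminator` are an assumption, and `describe` is a hypothetical helper, not part of train.py):

```python
# train_step-X.model: full training state (generator + discriminator, etc.).
# XXXXXX.model (e.g. 010000.model): only the EMA copy of the generator,
# so indexing it with ckpt['generator'] raises the KeyError from the report.

full_ckpt = {
    "generator": {"conv.weight": "..."},
    "discriminator": {"conv.weight": "..."},
    "g_running": {"conv.weight": "..."},  # EMA copy kept alongside
}
ema_only_ckpt = {"conv.weight": "..."}  # bare EMA generator state dict


def describe(ckpt):
    """Classify a checkpoint by which weights it actually provides."""
    if "generator" in ckpt and "discriminator" in ckpt:
        return "resumable"  # safe to pass to train.py via --ckpt
    return "ema-only"       # usable for sampling/inference only


print(describe(full_ckpt))      # resumable
print(describe(ema_only_ckpt))  # ema-only
```

In a real script you would `torch.load(path)` the file first and inspect its keys the same way before deciding whether it can resume training.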

@RihardsVitols (Author)

Thank you.

And thank you for sharing this code.

@quyet0nguyen

@rosinality I don't understand the purpose of saving both the 010000.model checkpoint and the train_step-X.model one. When I resumed training after a break, the model started training from the beginning (from 8×8 resolution) instead of from the saved state dict.

Thank you very much for your code.

@rosinality (Owner)

@quyet0nguyen You can use the --init_size argument to start training at a larger resolution.
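A hypothetical resume command along those lines (the checkpoint filename, resolution, and dataset path here are placeholders, not values from this thread):

```shell
# Resume from a full training checkpoint and start at the resolution
# the run had reached, rather than falling back to 8x8.
python train.py --mixing --ckpt checkpoint/train_step-5.model --init_size 256 lmdb
```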

@quyet0nguyen commented Apr 6, 2021

@rosinality Thanks for your answer.
