This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →

What is hparams exactly? #651

Closed
keunwoochoi opened this issue Dec 26, 2019 · 6 comments
Labels
question Further information is requested

Comments

@keunwoochoi

Hi, thanks for the nice product again.

From #525 and #599, I could guess that hparams is required to load a saved model (which, by the way, I think should be mentioned somewhere in the docs). And from the examples, it seems hparams may be an argparse.Namespace. Unfortunately, though, the concept was not easy to understand.

What is hparams exactly? What kind of information should/can/should it not include to work properly? Is it recommended to use a hyperparameter argument parser? Say, if I'm not into hyperparameter search at the moment and just want to be able to load the checkpoint model, what are the requirements on hparams?

@keunwoochoi keunwoochoi added the question Further information is requested label Dec 26, 2019
@s-rog
Contributor

s-rog commented Jan 2, 2020

hparams is just a dict that contains your hyperparameters, for easy logging and experiments. It is optional, but skipping it makes loading models more tedious.

@neggert
Contributor

neggert commented Jan 2, 2020

It should be an argparse.Namespace. You can get this from argparse, testtube's HyperOptArgumentParser, or create it manually from a dict like so: argparse.Namespace(**my_dict).
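The conversion neggert describes can be sketched with nothing but the standard library (the hyperparameter names below are made up for illustration):

```python
import argparse

# Hypothetical hyperparameters -- any keys work. Namespace just stores them
# as attributes, so downstream code can read hparams.learning_rate instead
# of doing dict lookups.
my_dict = {"learning_rate": 1e-3, "batch_size": 32, "hidden_dim": 128}

hparams = argparse.Namespace(**my_dict)

print(hparams.learning_rate)      # 0.001
print(vars(hparams) == my_dict)   # True -- vars() recovers the original dict
```

The same `Namespace` object is what `argparse.ArgumentParser.parse_args()` returns, which is why a plain argparse setup and a manually built dict are interchangeable here.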

@keunwoochoi
Author

@s-rog Thanks, but it's not a dict :-)
@neggert Thanks. Yes, that's the information I could infer by reading the code. But I think those details, and probably more, should be documented somewhere. I'd like to contribute, but I can't document something I don't understand clearly.
Also, in my opinion, using it should be very strongly recommended so that users can load the model later, assuming the use cases of pytorch-lightning are more serious than just training something on MNIST. Because for those toy examples, why bother :)

@s-rog
Contributor

s-rog commented Jan 3, 2020

oh yeah I keep forgetting, you're right!

@neggert
Contributor

neggert commented Jan 3, 2020

Yes, there's a complete overhaul of the docs in progress. We'll make sure this gets in there.

@williamFalcon
Contributor

williamFalcon commented Jan 15, 2020

Thanks for the input!
Once we have the new docs, please let us know of any problems! :)

@Lightning-AI Lightning-AI locked and limited conversation to collaborators Feb 4, 2021
