Falcon models saved with `save_pretrained` no longer get saved with python files (#24737)
Comments
Hi @sgugger, I checked the code snippet and indeed only the config and model bin files are saved (tested on the main branch as of July 10th).
This is expected, as the config keeps references to where the code lives; you can see it has:
Saving then reloading with

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("tiiuae/falcon-7b-instruct", trust_remote_code=True)
model.save_pretrained("/path/to/save")
new_model = AutoModelForCausalLM.from_pretrained("/path/to/save", trust_remote_code=True)
```

works.
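The reference mentioned above is very likely the `auto_map` entry in the model's `config.json` (an assumption based on the hub config linked later in this thread; the exact value was not preserved in this scrape). It points the Auto classes at the custom module files, roughly like:

```json
{
  "auto_map": {
    "AutoConfig": "configuration_RW.RWConfig",
    "AutoModelForCausalLM": "modelling_RW.RWForCausalLM"
  }
}
```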
Hey @sgugger, apologies for the misunderstanding; you're right, I was mistaken and oversimplified the code snippet causing the issue. After taking another look, I've realized that the issue is how I downloaded the model. Rather than using
I first download the model locally with
If I inspect
which matches what is in the hub here: https://huggingface.co/tiiuae/falcon-7b-instruct/blob/main/config.json,
I get the error above. It may be that this is the expected behavior, but it works fine with version
My assumption is that this issue is due to
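The inspection step referenced above (the exact commands were elided from this thread) presumably amounts to reading the downloaded `config.json` and checking its custom-code references. A minimal stdlib-only sketch, assuming a local model directory path (the function name and demo values are hypothetical, not from the issue):

```python
import json
import tempfile
from pathlib import Path

def get_auto_map(model_dir: str) -> dict:
    """Return the auto_map entry (custom-code references) from a local config.json."""
    config = json.loads((Path(model_dir) / "config.json").read_text())
    return config.get("auto_map", {})

# Demo against a stand-in directory containing a minimal config
with tempfile.TemporaryDirectory() as tmp:
    (Path(tmp) / "config.json").write_text(
        json.dumps({"auto_map": {"AutoConfig": "configuration_RW.RWConfig"}}))
    print(get_auto_map(tmp))  # -> {'AutoConfig': 'configuration_RW.RWConfig'}
```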
That I can reproduce. This should be fixed by the PR mentioned above.
That's awesome, thanks! Just a question or two, if that's alright, so I can check that I understand what's going on here:
in case we are loading from a local trust-remote-code repo, then in the call to
gets executed, which results in the modelling files being saved along with the weights and config files. Is that correct?
Edit: one other question: is there a reason why this
That's completely correct! As for the second question, I haven't deep-dived to make sure the two do exactly the same thing, but it's probably the same, yes. This line is only there so that
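The mechanism confirmed above can be sketched with a toy, stdlib-only example (this is an illustration of the idea, not the actual transformers implementation): a save routine that reads `auto_map` from the config and copies the referenced custom-code modules next to the saved config. The function name and directory layout are assumptions for the demo.

```python
import json
import shutil
import tempfile
from pathlib import Path

def save_with_custom_code(src_repo: Path, dst: Path) -> None:
    """Copy config.json plus any custom-code modules referenced in its auto_map."""
    dst.mkdir(parents=True, exist_ok=True)
    config = json.loads((src_repo / "config.json").read_text())
    shutil.copy(src_repo / "config.json", dst / "config.json")
    for ref in config.get("auto_map", {}).values():
        # e.g. "modelling_RW.RWForCausalLM" -> module file "modelling_RW.py"
        module = ref.split(".")[0]
        py_file = src_repo / f"{module}.py"
        if py_file.exists():
            shutil.copy(py_file, dst / py_file.name)

# Demo with a fake local repo
with tempfile.TemporaryDirectory() as tmp:
    repo = Path(tmp) / "repo"
    repo.mkdir()
    (repo / "config.json").write_text(json.dumps(
        {"auto_map": {"AutoModelForCausalLM": "modelling_RW.RWForCausalLM"}}))
    (repo / "modelling_RW.py").write_text("# custom model code\n")
    out = Path(tmp) / "saved"
    save_with_custom_code(repo, out)
    print(sorted(p.name for p in out.iterdir()))  # -> ['config.json', 'modelling_RW.py']
```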
Thanks again for all your help, I really appreciate it! I tested this with your PR and it works on my end for local Falcon models! Also
Switching that line out to
I think it would be fine if we add an
Yeah, would love to; I just want to make sure I understand the rationale behind adding
Oh, I completely misread your comment; thanks for asking for clarification. The test should be left as is: it is a consistency check, not an exist-ok check. We can do the switch without adding anything.
Ok, makes sense; I'm more than happy to still make a PR for that switch if it would be helpful.
Please go ahead!
PR is linked above! One of us will have to rebase/fix conflicts, as I've made these changes on top of main, which hasn't incorporated your PR yet.
System Info

`transformers` version: 4.30.2

Who can help?

@ArthurZucker @younesbelkada

Information

Tasks

`examples` folder (such as GLUE/SQuAD, ...)

Reproduction

When saving `tiiuae/falcon` models using
the python files `configuration_RW.py` and `modelling_RW.py` are no longer saved. Loading the model with `from_pretrained(...)` results in the following error:

Expected behavior

To be able to load the model with `from_pretrained` after saving it with `save_pretrained`, either by having the python files saved or by pulling them from the hub. With transformers version = `4.27.4`, using `save_pretrained()` does actually save the python files and the saved model can be loaded right away.