
[Test] Fix redirected links issue #18542

Closed

Conversation

@younesbelkada (Contributor) commented Aug 9, 2022

What does this PR do?

This PR tries to address the issue of loading a model when the original link is redirected. This happened for BLOOM models, where the repo IDs were changed but the code did not take redirected links into account.
I am not sure how to properly test that this does not break anything, so I am opening this as a test PR; feel free to ignore it.
Now loading BLOOM models with the old naming works:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom-350m"  # old repo id that now redirects on the Hub
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
```

This could probably also be done at the huggingface_hub level, but I am not sure.

Related to #18531

- get redirected link and replace it with correct repo id
- use BaseException to catch all errors
- make style
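
The core idea is roughly the following (a minimal sketch of the approach, not the actual diff; the `resolve_repo_id` helper and the `config.json` probe are just for illustration):

```python
import requests

def resolve_repo_id(repo_id: str, revision: str = "main") -> str:
    # Probe a file every model repo exposes and follow any redirect.
    url = f"https://huggingface.co/{repo_id}/resolve/{revision}/config.json"
    try:
        response = requests.head(url, allow_redirects=True, timeout=10)
    except BaseException:
        # Catch-all, as in the commit above: on any failure, keep the
        # repo id the user passed in.
        return repo_id
    # The final URL after redirects contains the up-to-date repo id, e.g.
    # "https://huggingface.co/org/new-name/resolve/main/config.json".
    parts = response.url.split("/")
    return "/".join(parts[3:5])  # -> "org/new-name"
```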
@HuggingFaceDocBuilderDev commented Aug 9, 2022

The documentation is not available anymore as the PR was closed or merged.

- use `BaseException`
- use `.replace` instead of complicated `.split()`
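
In code, the simplification looks roughly like this (sketch only; the exact URL pieces stripped in the PR may differ):

```python
# Extract the repo id from the redirected URL with .replace instead of
# indexing into .split() pieces (illustrative strings, not the exact diff):
redirected = "https://huggingface.co/org/new-name/resolve/main/config.json"
repo_id = redirected.replace("https://huggingface.co/", "").replace(
    "/resolve/main/config.json", ""
)
print(repo_id)  # -> org/new-name
```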
@julien-c (Member) commented Aug 9, 2022

This should be fixed upstream (there's an open issue IIRC)

@younesbelkada (Contributor, Author) commented Aug 9, 2022

Okay, I see! I think you are referring to this issue: #17582; posting it here for visibility.
I can't see any PR related to this issue for now; maybe it is hidden in another PR?

EDIT: it will be fixed once transformers uses huggingface_hub behind the scenes for loading models.
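
For reference, once loading goes through huggingface_hub, something like this should be enough (a sketch; it assumes the Hub serves a redirect for renamed repos and that the client follows it, as discussed in the linked issues):

```python
from huggingface_hub import hf_hub_download

# Download a file using the old repo id; the rename should be handled by
# the Hub's redirect rather than by extra logic in transformers.
config_path = hf_hub_download(repo_id="bigscience/bloom-350m", filename="config.json")
print(config_path)
```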

@sgugger (Collaborator) commented Aug 9, 2022

It should be fixed on the Hugging Face Hub side at this stage (the issue incorrectly reported that it works for huggingface_hub tools, but it does not); there is nothing left to do in Transformers.

@julien-c (Member) commented Aug 9, 2022

Note that in the meantime you can always opt to re-rename your repos if it's a big issue.
