Switch from cached_download to hf_hub_download in remaining occurrences #31284
(somehow related to huggingface/huggingface_hub#2317)
This PR gets rid of the remaining `cached_download` calls in favor of the more robust `hf_hub_download`. `cached_download` is quite legacy (replaced since Aug 2022) and I'd like to completely drop it at some point. The core part of `transformers`'s logic made the switch quite some time ago already.

The main change is basically to go from `cached_download` to `hf_hub_download`. I used `Path(...).read_text()` to avoid opening the file without closing it.
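For illustration, a minimal sketch of the kind of substitution involved (the exact call sites vary across the touched scripts, and the `repo_id` and `filename` values below are hypothetical):

```python
import json
from pathlib import Path

from huggingface_hub import hf_hub_download

# Before (legacy pattern, deprecated since Aug 2022):
#   from huggingface_hub import cached_download, hf_hub_url
#   config = json.loads(open(cached_download(hf_hub_url(repo_id, filename)), "r").read())

# After: hf_hub_download resolves and caches the file in a single call,
# and Path.read_text() reads it without leaving a file handle open.
resolved_file = hf_hub_download(repo_id="username/some-model", filename="config.json")
config = json.loads(Path(resolved_file).read_text())
```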
Note: there is still one occurrence of `cached_download` in `./src/transformers/models/deprecated/van/convert_van_to_pytorch.py`, but since it is in a `deprecated/` folder, I did not bother maintaining it.