
fix the import error of cached_path, hf_bucket_url #27

Merged
1 commit merged into huggingface:main on Aug 31, 2022

Conversation

sywangyi (Collaborator)

from transformers.file_utils import cached_path, hf_bucket_url

to align with the following PR merged in transformers:
huggingface/transformers#18497

Signed-off-by: Wang, Yi A yi.a.wang@intel.com

What does this PR do?

Fixes # (issue)
Fix the import error of cached_path and hf_bucket_url
to align with the following PR merged in transformers:
huggingface/transformers#18497
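
For context, a rough before/after sketch of the download pattern this PR has to update (the model id and filename below are illustrative only; the real call sites are in optimum/intel/neural_compressor/quantization.py):

# Before huggingface/transformers#18497: build a hub URL, then resolve it through the cache.
from transformers.file_utils import cached_path, hf_bucket_url

url = hf_bucket_url("distilbert-base-uncased", filename="config.json", revision=None)
resolved_config = cached_path(url, cache_dir=None, force_download=False)

# After #18497 these helpers are no longer importable; cached_file resolves a repo file in one step.
from transformers.utils import cached_file

resolved_config = cached_file("distilbert-base-uncased", "config.json", revision=None, cache_dir=None, force_download=False)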

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

@sywangyi (Collaborator, Author)

@jiqing-feng @echarlaix @PenghuiCheng please be aware of this issue.

@HuggingFaceDocBuilderDev commented Aug 10, 2022

The documentation is not available anymore as the PR was closed or merged.

@echarlaix (Collaborator) left a comment

Thanks for the fix @sywangyi !

Comment on lines 21 to 28:

try:
    from transformers.utils import cached_file

    _use_cached_file = True
except ImportError:
    from transformers.file_utils import cached_path, hf_bucket_url

    _use_cached_file = False

Suggested change
-try:
-    from transformers.utils import cached_file
-    _use_cached_file = True
-except ImportError:
-    from transformers.file_utils import cached_path, hf_bucket_url
-    _use_cached_file = False
+from huggingface_hub import hf_hub_download

Comment on lines 108 to 114:

if _use_cached_file == False:
    if os.path.isdir(config_name_or_path):
        config_file = os.path.join(config_name_or_path, config_file_name)
    elif os.path.isfile(config_name_or_path):
        config_file = config_name_or_path
    else:
        config_file = hf_bucket_url(config_name_or_path, filename=config_file_name, revision=revision)
I suggest using hf_hub_download as follows:

config_file_name = config_file_name if config_file_name is not None else CONFIG_NAME
if os.path.isdir(config_name_or_path):
    config_file = os.path.join(config_name_or_path, config_file_name)
elif os.path.isfile(config_name_or_path):
    config_file = config_name_or_path
else:
    try:
        config_file = hf_hub_download(
            repo_id=config_name_or_path,
            filename=config_file_name,
            revision=revision,
            cache_dir=cache_dir,
            force_download=force_download,
        )
    except EnvironmentError as err:
        logger.error(err)
        msg = (
            f"Can't load config for '{config_name_or_path}'. Make sure that:\n\n"
            f"-'{config_name_or_path}' is a correct model identifier listed on 'https://huggingface.co/models'\n\n"
            f"-or '{config_name_or_path}' is a correct path to a directory containing a {config_file_name} file\n\n"
        )

        if revision is not None:
            msg += (
                f"- or '{revision}' is a valid git identifier (branch name, a tag name, or a commit id) that "
                f"exists for this model name as listed on its model page on 'https://huggingface.co/models'\n\n"
            )

        raise EnvironmentError(msg)

config = cls(config_file)
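
For reference, a minimal standalone use of hf_hub_download along the lines suggested above (the repo id and filename here are only illustrative, not taken from this PR):

from huggingface_hub import hf_hub_download

# Downloads the requested file (or reuses it from the local Hugging Face cache)
# and returns the local filesystem path to it.
local_config = hf_hub_download(repo_id="distilbert-base-uncased", filename="config.json")
print(local_config)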

optimum/intel/neural_compressor/quantization.py (outdated review comment, resolved)
(Commit message) from transformers.file_utils import cached_path, hf_bucket_url

to align with the following PR merged in transformers:
huggingface/transformers#18497

Signed-off-by: Wang, Yi A <yi.a.wang@intel.com>
@echarlaix (Collaborator) left a comment

Thanks for the fix @sywangyi ! 🔥

@echarlaix merged commit 1a2e42f into huggingface:main on Aug 31, 2022
@sywangyi deleted the align_hub_change branch on December 19, 2022 01:42