
[PEFT] Pass token when calling find_adapter_config #26488

Merged · 3 commits · Oct 2, 2023

Conversation

younesbelkada
Contributor

@younesbelkada younesbelkada commented Sep 29, 2023

What does this PR do?

This PR fixes an issue that was reported on Spaces. I was not able to reproduce the issue locally though.
When loading a model with token=True (i.e. on a gated or private repository), find_adapter_config will try to look for an adapter file inside a private repository without the token, leading to an authentication error.

The fix is to pass the token to adapter_kwargs and remove the duplication here: https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_utils.py#L2529
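The fix described above can be sketched roughly as follows. This is a simplified illustration, not the actual transformers code: `find_adapter_config` and `load_model` here are stand-ins for the real functions in `modeling_utils.py`, and the private-repo check is mocked.

```python
# Illustrative sketch of the fix (simplified names; the real logic lives in
# transformers' modeling_utils). A private/gated repo lookup fails without
# an auth token, so the user's token must be forwarded into adapter_kwargs.

def find_adapter_config(model_id, token=None, **kwargs):
    # Stand-in for the real adapter-config lookup on the Hub:
    # accessing a private repo without a token raises an auth error.
    if model_id.startswith("private/") and token is None:
        raise PermissionError("401 Client Error: authentication required")
    return {"adapter_config": f"{model_id}/adapter_config.json"}

def load_model(model_id, token=None, adapter_kwargs=None):
    adapter_kwargs = dict(adapter_kwargs or {})
    # The fix: forward the token into adapter_kwargs (without clobbering
    # a token the caller already set), so the adapter lookup can
    # authenticate against gated/private repositories.
    if token is not None and "token" not in adapter_kwargs:
        adapter_kwargs["token"] = token
    return find_adapter_config(model_id, **adapter_kwargs)
```

With this, `load_model("private/repo", token="hf_...")` succeeds where the un-forwarded call would have hit a 401.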

cc @LysandreJik

@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Sep 29, 2023

The documentation is not available anymore as the PR was closed or merged.

@abhishekkrthakur
Member

This seems to fix the issue for me. Shall we merge it? Or figure out the actual cause?

@younesbelkada younesbelkada marked this pull request as ready for review September 29, 2023 10:48
@younesbelkada younesbelkada changed the title [DO NOT MERGE] Pass token when calling find_adapter_config [PEFT] Pass token when calling find_adapter_config Sep 29, 2023
Copy link
Member

@LysandreJik LysandreJik left a comment


LGTM

@LysandreJik LysandreJik merged commit 24178c2 into huggingface:main Oct 2, 2023
@denys-fridman

denys-fridman commented Oct 2, 2023

@younesbelkada adapter_kwargs can be None, in which case the code breaks. Here it can be set to None, and then it is passed along via this line.
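The breakage can be sketched with a small helper. This is a hypothetical illustration of the follow-up fix, not the actual transformers code: `attach_token` is an invented name standing in for the token-forwarding step.

```python
def attach_token(adapter_kwargs, token):
    # adapter_kwargs defaults to None, so normalize it before mutating;
    # writing adapter_kwargs["token"] on None is exactly what broke here.
    if adapter_kwargs is None:
        adapter_kwargs = {}
    if token is not None:
        # setdefault keeps a token the caller already provided.
        adapter_kwargs.setdefault("token", token)
    return adapter_kwargs
```

Without the None guard, `attach_token(None, "tok")` would raise `TypeError: 'NoneType' object does not support item assignment`.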

For reference, I was running

python main.py \
  --model path_to_mistral_7B \
  --tasks mbpp \
  --temperature 0.1 \
  --n_samples 15 \
  --batch_size 4 \
  --precision fp16 \
  --allow_code_execution \
  --save_generations \
  --max_length_generation 512 \
  --save_generations_path predictions.json \
  --metric_output_path metrics.json

@younesbelkada
Copy link
Contributor Author

Nice catch! Will open a PR soon to fix that.

blbadger pushed a commit to blbadger/transformers that referenced this pull request Nov 8, 2023
EduardoPach pushed a commit to EduardoPach/transformers that referenced this pull request Nov 18, 2023