
Enable PeftConfig & PeftModel to load from revision #433

Merged — 3 commits merged on Jun 1, 2023

Conversation

lewtun (Member) commented May 11, 2023

This PR enables the from_pretrained methods of PeftConfig and PeftModel to load configs/models from a specific revision arg. This is useful when storing many adapter weights as branches in a single model repo.

Sample usage

```python
from peft import PeftConfig, PeftModel
from transformers import AutoModelForCausalLM

model_id = "lewtun/tiny-random-OPTForCausalLM-delta"
revision = "v1"

# Load the adapter config from a specific branch of the repo
config = PeftConfig.from_pretrained(model_id, revision=revision)

# Load the base model, then attach the adapter weights from the same revision
model = AutoModelForCausalLM.from_pretrained(config.base_model_name_or_path)
model = PeftModel.from_pretrained(model, model_id, revision=revision)
assert isinstance(model, PeftModel)
```

@lewtun lewtun requested a review from pacman100 May 11, 2023 08:20
@@ -19,6 +19,9 @@
from peft import AdaptionPromptConfig, LoraConfig, PrefixTuningConfig, PromptEncoderConfig, PromptTuningConfig


PEFT_MODELS_TO_TEST = [("lewtun/tiny-random-OPTForCausalLM-delta", "3cedab206cbe8e22fc764a96481e994e6868fe7c")]
lewtun (Member, Author) commented:
If you're happy with this approach, I'll move the model to hf-internal-testing

HuggingFaceDocBuilderDev commented May 11, 2023

The documentation is not available anymore as the PR was closed or merged.

@lewtun lewtun changed the title Enable PeftConfig to load from revision Enable PeftConfig & PeftModel to load from revision May 11, 2023
@@ -19,6 +19,9 @@
from peft import AdaptionPromptConfig, LoraConfig, PrefixTuningConfig, PromptEncoderConfig, PromptTuningConfig


PEFT_MODELS_TO_TEST = [("lewtun/tiny-random-OPTForCausalLM-delta", "v1")]
lewtun (Member, Author) commented May 11, 2023:
Note that this repo only has weights in the v1 branch, to properly test that the revision arg works as expected.

pacman100 (Contributor) left a comment:

Thank you @lewtun for improving the load and save utils! 🤗

@pacman100 pacman100 merged commit 76d4ecd into main Jun 1, 2023
winglian (Contributor) commented Jun 1, 2023

@pacman100 This causes a regression when device_map is used with PeftModel.from_pretrained. It's probably not a great idea to blindly forward all **kwargs from PeftModel to PeftConfig.
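The failure mode described above can be sketched in plain Python, using hypothetical stand-in functions rather than PEFT's actual internals: forwarding every keyword argument to a callee that accepts only some of them raises a TypeError, and filtering against the callee's signature is one way to avoid it.

```python
import inspect

# Hypothetical stand-in for PeftConfig.from_pretrained: it only
# understands `revision`, not model-only options like `device_map`.
def load_config(path, revision=None):
    return {"path": path, "revision": revision}

# Hypothetical stand-in for PeftModel.from_pretrained. Forwarding *all*
# kwargs straight to load_config would fail, e.g.:
#   load_config(path, **kwargs)  # TypeError: unexpected keyword 'device_map'
# Filtering kwargs down to the parameters load_config actually accepts
# avoids that regression while still honoring `revision`.
def load_model(path, **kwargs):
    config_params = inspect.signature(load_config).parameters
    config_kwargs = {k: v for k, v in kwargs.items() if k in config_params}
    config = load_config(path, **config_kwargs)
    return {"config": config, "model_kwargs": kwargs}

result = load_model("some/adapter", revision="v1", device_map="auto")
```

Here `revision` reaches the config loader while `device_map` stays with the model-loading kwargs, which is the kind of separation the comment above is asking for.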
