
Fix issue with unloading double wrapped modules #1490

Merged

Conversation

BenjaminBossan (Member)

Resolves #1485, but note that some additional solutions are mentioned in that issue.

This checks that when unloading a PEFT model, if the ModulesToSaveWrapper contains a tuner module, it is correctly unloaded. After unloading, the model should no longer contain any PEFT layers.
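
As an illustration of the scenario under test, here is a minimal sketch (the model checkpoint and the final assertion are assumptions for this example, not code from the PR):

```python
from transformers import AutoModelForSequenceClassification

from peft import LoraConfig, get_peft_model
from peft.tuners.tuners_utils import BaseTunerLayer

# "all-linear" targets every linear layer, including the classification head,
# which PEFT also wraps in a ModulesToSaveWrapper for SEQ_CLS tasks; this is
# the double wrapping reported in #1485.
base = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)
config = LoraConfig(target_modules="all-linear", task_type="SEQ_CLS")
peft_model = get_peft_model(base, config)

# Unloading should also strip tuner layers that are nested inside a
# ModulesToSaveWrapper, so no PEFT layers remain afterwards.
unloaded = peft_model.unload()
assert not any(isinstance(m, BaseTunerLayer) for m in unloaded.modules())
```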

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@younesbelkada (Contributor) left a comment


Nice fix, thank you!

@BenjaminBossan BenjaminBossan merged commit f811472 into huggingface:main Feb 20, 2024
14 checks passed
@BenjaminBossan BenjaminBossan deleted the fix-unloading-double-wrapped branch February 20, 2024 14:12
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request Mar 14, 2024
Successfully merging this pull request may close these issues:

- all-linear + classification models have double-wrapped linear layers (#1485)