
fix add_weighted_adapter method #1169

Merged 2 commits into main from smangrul/fix-weighted-adapter-fn on Nov 22, 2023
Conversation

pacman100 (Contributor)

Co-Authored-By: Benjamin Bossan <BenjaminBossan@users.noreply.github.com>
Co-Authored-By: jihuishan <151612440+jihuishan@users.noreply.github.com>
HuggingFaceDocBuilderDev commented Nov 22, 2023

The documentation is not available anymore as the PR was closed or merged.

younesbelkada (Contributor) left a comment:


Makes sense, thanks!

BenjaminBossan (Member) left a comment:


Thanks for the quick fix.

I guess this could be considered backwards incompatible, because the same code will now produce different outcomes. I think this is fine in this case, as "linear" is not the default and the previous behavior could be considered a "bug". But we should be sure to highlight this change in the next release notes.
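
For context, a minimal sketch of why the linear combination needed fixing, assuming the usual LoRA factorization in which the weight update is delta_W = B @ A: scaling *both* factors by a merge weight w scales delta_W by w**2, so weighting delta_W linearly by w requires scaling each factor by sqrt(w). The tensor names and shapes below are illustrative, not the PR's actual diff:

```python
import torch

# Toy LoRA factors: the adapter's weight update is delta_W = B @ A
# (the alpha/r scaling is omitted for brevity).
A = torch.randn(8, 64)   # lora_A: (r, in_features)
B = torch.randn(64, 8)   # lora_B: (out_features, r)
w = 0.5                  # desired linear merge weight

# Scaling *both* factors by w scales delta_W by w**2, not w:
assert torch.allclose((w * B) @ (w * A), w**2 * (B @ A), atol=1e-5)

# Scaling each factor by sqrt(w) weights delta_W linearly, as intended:
assert torch.allclose((w**0.5 * B) @ (w**0.5 * A), w * (B @ A), atol=1e-5)
```

On the calling side nothing changes; a call along the lines of `model.add_weighted_adapter(adapters=["a1", "a2"], weights=[0.7, 0.3], adapter_name="merged", combination_type="linear")` should now produce the linearly weighted merge one would expect.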

pacman100 (Contributor, Author) commented Nov 22, 2023

> But we should be sure to highlight this change in the next release notes.

Yes, noted.

pacman100 merged commit 0432385 into main on Nov 22, 2023
14 checks passed
pacman100 deleted the smangrul/fix-weighted-adapter-fn branch on November 28, 2023 at 11:07
Successfully merging this pull request may close this issue:

A (possible) bug in lora merging method: add_weighted_adapter
4 participants