Adapter Dropout #414

Closed
JensVN98 opened this issue Sep 6, 2022 · 1 comment · Fixed by #667
JensVN98 commented Sep 6, 2022

Hi!

Is there a reason why the dropout rate cannot be defined in certain configurations? For example, the dropout rate can be defined in LoraConfig, but not in the more "standard" configurations (AutoConfig, Pfeiffer, Houlsby...). I realise that LoRA and PrefixTuning differ methodologically from the Houlsby and Pfeiffer configs, but it would still be interesting to be able to add some form of additional regularisation.

Thanks!
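A minimal sketch of the asymmetry described above, assuming the adapter-transformers API around the time of this issue; the class names and signatures here are from memory, not from this thread, and may differ:

```python
# Sketch, assuming the adapter-transformers API as of late 2022.
from transformers.adapters import LoRAConfig, PfeifferConfig

# LoRA exposes a dropout rate directly on its config:
lora_config = LoRAConfig(r=8, alpha=16, dropout=0.1)

# The bottleneck configs (Pfeiffer, Houlsby, ...) had no such field,
# so their adapter layers could not be regularised this way:
pfeiffer_config = PfeifferConfig(reduction_factor=16)
# PfeifferConfig(dropout=0.1)  # rejected at the time of this issue
```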

JensVN98 added the question (Further information is requested) label on Sep 6, 2022
calpt (Member) commented Sep 9, 2022

Hi @JensVN98, I don't believe there's a specific reason for this other than that the original implementations didn't propose a dropout rate. It shouldn't be difficult to add to the other configurations, though.
Leaving this open as a feature request so we can add it in an upcoming update.
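For reference, a sketch of how the requested option could look once added; it assumes the successor adapters library exposes a dropout field on its bottleneck configs, which this thread itself does not confirm:

```python
# Sketch of the requested feature after the fix (#667); the import path
# and the `dropout` parameter name are assumptions, not confirmed here.
from adapters import SeqBnConfig  # Pfeiffer-style bottleneck config

# Dropout inside the bottleneck layers, analogous to LoRA's `dropout`:
config = SeqBnConfig(reduction_factor=16, dropout=0.1)
```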

calpt added the enhancement (New feature or request) label and removed the question label on Sep 9, 2022
calpt self-assigned this on Apr 2, 2024
TimoImhof pushed a commit that referenced this issue Apr 8, 2024