
Fix DeBERTa prefix tuning w. enabled relative attention #451

Merged

Conversation


@calpt calpt commented Nov 9, 2022

The current implementation of prefix tuning is incompatible with the relative attention of DeBERTa / DeBERTa-v2 (relative_attention=True), i.e. it throws errors in the position-to-content and content-to-position attention computations. This PR removes the prefix tokens from the relative attention computation.
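To illustrate what "removing prefix tokens from the relative attention computation" can look like, here is a minimal, hypothetical sketch (not the actual diff of this PR): the content-to-content scores cover all keys, including the prepended prefix, while the content-to-position / position-to-content bias is computed only for the original tokens and zero-padded over the prefix key positions before being added. The helper name and tensor shapes are assumptions for illustration only.

```python
import torch


def add_rel_att_bias_skipping_prefix(
    c2c_scores: torch.Tensor,    # (batch, heads, q_len, prefix_len + q_len) content-to-content scores
    rel_att_bias: torch.Tensor,  # (batch, heads, q_len, q_len) c2p + p2c bias for the original tokens only
    prefix_len: int,
) -> torch.Tensor:
    """Hypothetical helper: add the disentangled relative-attention bias while
    leaving the prepended prefix key positions without any positional bias."""
    # Prefix keys have no relative position embedding, so their bias stays zero.
    pad = torch.zeros(
        *rel_att_bias.shape[:-1], prefix_len,
        dtype=rel_att_bias.dtype, device=rel_att_bias.device,
    )
    # Concatenate zero bias (prefix keys) with the real bias (original tokens).
    full_bias = torch.cat([pad, rel_att_bias], dim=-1)
    return c2c_scores + full_bias


if __name__ == "__main__":
    batch, heads, q_len, prefix_len = 2, 12, 16, 4
    c2c = torch.randn(batch, heads, q_len, prefix_len + q_len)
    bias = torch.randn(batch, heads, q_len, q_len)
    scores = add_rel_att_bias_skipping_prefix(c2c, bias, prefix_len)
    assert scores.shape == (batch, heads, q_len, prefix_len + q_len)
```

With this kind of padding, the prefix keys still participate in ordinary (content-to-content) attention but are simply excluded from the relative-position terms, which avoids shape mismatches between the relative position embeddings and the prefix-extended key sequence.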

@calpt calpt force-pushed the fix/deberta_disentangled_prefix_tuning branch from 9fc7cbc to 14dfdb0 on November 10, 2022 14:23
@calpt calpt marked this pull request as ready for review November 11, 2022 17:03
@calpt calpt requested a review from hSterz November 23, 2022 09:17

@hSterz hSterz left a comment


Looks good 👍

@calpt calpt merged commit d713de3 into adapter-hub:master Nov 24, 2022
@calpt calpt deleted the fix/deberta_disentangled_prefix_tuning branch November 24, 2022 14:04