Adapter Support For the Longformer family #459

Open
gabinguo opened this issue Dec 9, 2022 · 2 comments
Labels
enhancement New feature or request

Comments

gabinguo commented Dec 9, 2022

🚀 Feature request

Adapter Support For the Longformer models

Motivation

For question answering over long documents, especially when the answers are long (surpassing the 384- or 512-token maximum sequence length supported by other LMs, e.g. BERT, RoBERTa), it is impossible for users to get the correct answers.

So it seems necessary to use Longformer or BigBird models, which accept longer input sequences; however, longer sequences lead to an increase in fine-tuning computation.

I read the adapter-transformers paper and found the approach elegant: training only small adapter modules would nicely mitigate the fine-tuning cost of long-answer question answering.

Unfortunately, the adapter library does not yet support the Longformer architecture. Are you planning to add support for the Longformer models?
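For reference, a minimal sketch of the usage this request implies, following the pattern adapter-transformers already exposes for supported architectures such as BERT. The Longformer-specific part is the assumption here: loading a Longformer checkpoint through the adapter model classes is exactly what does not work today.

```python
# Sketch of the requested usage, assuming Longformer gained the same adapter
# API that supported architectures already expose in adapter-transformers.
from transformers import AutoTokenizer
from transformers.adapters import AutoAdapterModel

checkpoint = "allenai/longformer-base-4096"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# This line is the missing piece: with no Longformer adapter implementation,
# it currently fails instead of returning an adapter-enabled model.
model = AutoAdapterModel.from_pretrained(checkpoint)

# Add a bottleneck adapter and a QA head, then freeze the base model so only
# the small adapter (and head) weights are updated during fine-tuning.
model.add_adapter("long_qa")
model.add_qa_head("long_qa")
model.train_adapter("long_qa")
```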

Thanks :)

gabinguo added the enhancement (New feature or request) label Dec 9, 2022
@gabinguo (Author)

mentioned in #442

@logvinata

I agree, this would be quite useful!
