
Add EncoderDecoderModel support #203

Closed
joao-alves97 opened this issue Jul 8, 2021 · 7 comments · Fixed by #222
Labels: enhancement (New feature or request)

@joao-alves97

🚀 Feature request

Would it be possible to add adapters to the EncoderDecoderModel?
I am fine-tuning an EncoderDecoderModel with two mBERT models, and I would like to compare full fine-tuning with training only the adapter layers.
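For context, a minimal sketch of the two-mBERT setup being described, using the standard transformers API (the checkpoint names are the public mBERT ones; the rest is illustrative):

```python
from transformers import BertTokenizerFast, EncoderDecoderModel

# Build a seq2seq model from two mBERT checkpoints (one as encoder, one as decoder).
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-multilingual-cased", "bert-base-multilingual-cased"
)
tokenizer = BertTokenizerFast.from_pretrained("bert-base-multilingual-cased")

# The decoder side needs these token IDs set before training or generation.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id
```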

@joao-alves97 added the enhancement (New feature or request) label Jul 8, 2021
@joao-alves97
Author

@patil-suraj

@calpt
Copy link
Member

calpt commented Aug 16, 2021

Hey @joao-alves97, I started an adapter implementation for EncoderDecoderModel in https://github.com/calpt/adapter-transformers/tree/dev/encoder_decoder (currently work in progress). Expecting it to be available soon.

@calpt calpt self-assigned this Aug 16, 2021
@joao-alves97
Author

Awesome! Could you send me a message once it is available? Thanks!

@calpt
Member

calpt commented Sep 10, 2021

@joao-alves97 the EncoderDecoderModel implementation has been merged into master, so you should be able to use it when installing from there. We haven't done any extensive evaluation yet though, so happy to hear about results you get :)
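A minimal sketch of the intended workflow after installing from master, assuming the usual adapter-transformers methods (add_adapter / train_adapter / set_active_adapters); the adapter name is a placeholder:

```python
# pip install git+https://github.com/Adapter-Hub/adapter-transformers.git
from transformers import EncoderDecoderModel  # adapter-transformers ships as `transformers`

model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-multilingual-cased", "bert-base-multilingual-cased"
)

# Add a fresh adapter to both encoder and decoder, then switch to adapter
# training: the adapter weights stay trainable, the base model is frozen.
model.add_adapter("translation")
model.train_adapter("translation")
model.set_active_adapters("translation")
```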

@joao-alves97
Author

Awesome! Next week I'll try to run some experiments!

@joao-alves97
Author

@calpt sorry for replying after two months, but I'm trying to use adapters on top of an EncoderDecoderModel built from two XLM-R models for translation, and I keep running into this error:
The model is not freezed. For training adapters please call the train_adapters() method
I'm calling model.train_adapter(adapter_name).
Any idea how to solve this problem?
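One quick way to narrow this down is to check what train_adapter() actually left trainable; a sketch under the same assumptions as above (the adapter name is hypothetical, and the name-based check below is only a heuristic, since heads or invertible adapters may also be trainable):

```python
from transformers import EncoderDecoderModel

model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "xlm-roberta-base", "xlm-roberta-base"
)
model.add_adapter("xlmr_translation")
model.train_adapter("xlmr_translation")

# After train_adapter(), only adapter parameters should require gradients.
trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(len(trainable), "trainable parameter tensors")
print("non-adapter parameters still trainable:",
      [n for n in trainable if "adapter" not in n])
```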

@calpt
Member

calpt commented Nov 24, 2021

@joao-alves97 This doesn't look like expected behavior. Would you mind opening a new (bug report) issue for it, ideally with a short snippet so we can reproduce? Thanks!
