Add adapter support for Hubert #551
base: legacy
Conversation
hidden_states = attn_residual + hidden_states
hidden_states = hidden_states + self.feed_forward(self.final_layer_norm(hidden_states))
sa_output = self.dropout(sa_output)
sa_output = self.attention_adapters(sa_output, attn_residual)  # (bs, seq_length, dim)
I'll change it to `sa_output = self.attention_adapters(sa_output, attn_residual, None)` instead.
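For context, a minimal sketch of how the adapted encoder-layer forward could look with that change. This is not the exact PR diff: the `output_adapters` name and the `(hidden_states, residual, layer_norm)` call convention are assumptions based on the usual adapter-transformers mixin pattern.

```python
# Hedged sketch of a Hubert encoder layer forward with the adapter mixin calls;
# simplified, not the exact PR diff.
def forward(self, hidden_states, attention_mask=None, output_attentions=False):
    attn_residual = hidden_states
    hidden_states = self.layer_norm(hidden_states)
    hidden_states, attn_weights, _ = self.attention(
        hidden_states, attention_mask=attention_mask, output_attentions=output_attentions
    )
    hidden_states = self.dropout(hidden_states)
    # was: hidden_states = attn_residual + hidden_states
    # layer_norm=None because Hubert applies its layer norms explicitly above
    hidden_states = self.attention_adapters(hidden_states, attn_residual, None)

    # was: hidden_states = hidden_states + self.feed_forward(self.final_layer_norm(hidden_states))
    ffn_residual = hidden_states
    hidden_states = self.feed_forward(self.final_layer_norm(hidden_states))
    hidden_states = self.output_adapters(hidden_states, ffn_residual, None)

    outputs = (hidden_states,)
    if output_attentions:
        outputs += (attn_weights,)
    return outputs
```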
Hey, thanks for your efforts in contributing new model architectures to adapter-transformers and sorry for the silence on our side. In the last few weeks, we've been working on a large refactoring of our project, which will ultimately result in the release of Adapters, the next-generation adapters library. See #584. As a consequence, we plan to merge any new model integrations directly to the new codebase, which currently can be found on this branch. Unfortunately, this necessitates some changes in the model integration code (detailed here, see already integrated models such as BERT, BART etc. for reference). If you'd be willing to update your model integration to target the new library yourself, we'd be super happy to help you on this. Otherwise, we might look into upgrading and merging some of the open model integration PRs ourselves in the future. For more details, again see #584.
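For reference, once the Hubert integration lands in the new codebase, user-side usage would look roughly like the sketch below. The checkpoint and adapter names are illustrative, and the sketch assumes the standard `adapters.init` entry point of the new library.

```python
# Hedged sketch: attaching and training an adapter with the next-generation
# Adapters library referenced in #584; assumes Hubert support has been added.
import adapters
from transformers import AutoModel

model = AutoModel.from_pretrained("facebook/hubert-base-ls960")
adapters.init(model)  # adds adapter methods to the plain Transformers model

model.add_adapter("bottleneck_adapter")          # default bottleneck adapter config
model.set_active_adapters("bottleneck_adapter")  # activate it for the forward pass
model.train_adapter("bottleneck_adapter")        # freeze base weights, train only the adapter
```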
This PR adds adapter support for Hubert by Meta. I have a few questions:

1. Should I also add support for LoRA? It expects `PretrainedConfig` in `LoRALinear`, which is not imported in the `modeling_hubert.py` file. Should I add that and then support LoRA too? Update: added support for LoRA too, let me know if it looks fine (see the usage sketch after this list).
2. What is `adjust_tensors_for_parallel` used for? I see you have used it while calculating attention.
3. In `src/transformers/adapters/models/hubert/adapter_model.py` there are many heads, and for this ASR model I found only `ClassificationHead`, `ModelWithFlexibleHeadsAdaptersMixin`, `MultiLabelClassificationHead`, and `MultipleChoiceHead` useful. Let me know if that's fine, otherwise I can add heads for other tasks too.

I'm working on tests and will finalise them after the first review. Thanks a lot!
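As a hypothetical usage sketch of the LoRA and head support discussed above: the `HubertAdapterModel` class name, the checkpoint, and the adapter/head names are assumptions modelled on how other architectures are integrated in adapter-transformers, not taken from this PR's diff.

```python
# Hedged sketch: LoRA plus a classification head on Hubert, assuming this PR adds
# a flexible-heads `HubertAdapterModel` analogous to the existing *AdapterModel
# classes. Checkpoint and names are illustrative.
from transformers.adapters import LoRAConfig, HubertAdapterModel  # HubertAdapterModel: hypothetical

model = HubertAdapterModel.from_pretrained("facebook/hubert-base-ls960")

# LoRA on the self-attention projections; r and alpha are standard LoRAConfig knobs
model.add_adapter("hubert_lora", config=LoRAConfig(r=8, alpha=16))

# one of the flexible heads listed above, e.g. for keyword-spotting-style classification
model.add_classification_head("hubert_lora", num_labels=12)

# freeze the base model, train only the LoRA weights and the head
model.train_adapter("hubert_lora")
model.set_active_adapters("hubert_lora")
```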