Why is Parallel Composition blocked for XLM-Roberta #282
Parallel composition of adapters is only supported for a few model types, listed here:
https://github.com/Adapter-Hub/adapter-transformers/blob/9a6bf1757b684a4c627c5a35a56e61ea706dccee/src/transformers/adapters/composition.py#L101-L103
But I believe XLM-Roberta has the same architecture as Roberta, so parallel composition should also work with it. Is it possible to use parallel composition with XLM-Roberta models?

Comments

Hi @jinyongyoo, you're right, I think there's no reason why XLM-Roberta shouldn't be on that list. Adding in #305.
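For reference, here is a minimal sketch of what parallel composition looks like with the adapter-transformers API once XLM-Roberta is on the supported list (per #305). The adapter names ("task_a", "task_b") are illustrative placeholders, not names from this issue:

```python
# Minimal sketch: parallel adapter composition on XLM-Roberta,
# assuming the fix from #305 (XLM-Roberta added to the supported model
# types checked in composition.py). Adapter names are placeholders.
from transformers import XLMRobertaModel
from transformers.adapters.composition import Parallel

model = XLMRobertaModel.from_pretrained("xlm-roberta-base")

# Add two independent adapters to the model.
model.add_adapter("task_a")
model.add_adapter("task_b")

# Activate both adapters in parallel: one forward pass routes the
# (replicated) input through both adapter branches. Without the fix,
# this call fails because XLM-Roberta is not on the supported list.
model.set_active_adapters(Parallel("task_a", "task_b"))
```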