
Ensure reference from adapter layer to layer norm is always correct #228

Merged: 1 commit into adapter-hub:master from fix/data_parallel on Sep 7, 2021

Conversation

calpt (Member) commented on Sep 3, 2021

Fixes errors with torch DataParallel.
Closes #227.
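
The diff is not shown here, so the following is only a rough, hypothetical sketch of the failure mode and fix pattern (the names `Adapter`, `TransformerOutput`, and `bottleneck` are invented for illustration and are not the library's actual classes). Under `torch.nn.DataParallel`, the model is replicated onto each GPU, but a plain Python attribute that caches a reference to a sibling `LayerNorm` at construction time still points at the replica-0 module, so forward passes on other replicas mix devices. Resolving the layer norm through the owning sublayer on every forward call keeps the reference correct on each replica:

```python
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter that receives the layer norm at call time."""

    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, hidden_states, residual, layer_norm):
        # The owning sublayer hands in its own LayerNorm on every call,
        # so each DataParallel replica uses the copy living on its device.
        hidden_states = self.up(torch.relu(self.down(hidden_states)))
        return layer_norm(hidden_states + residual)


class TransformerOutput(nn.Module):
    """Simplified output sublayer owning both the LayerNorm and the adapter."""

    def __init__(self, hidden_size: int = 768):
        super().__init__()
        self.dense = nn.Linear(hidden_size, hidden_size)
        self.layer_norm = nn.LayerNorm(hidden_size)
        self.adapter = Adapter(hidden_size)

    def forward(self, hidden_states):
        residual = hidden_states
        hidden_states = self.dense(hidden_states)
        # Resolve self.layer_norm here instead of caching it inside Adapter
        # at construction time; a cached reference would still point at the
        # replica-0 module after nn.DataParallel replicates the model.
        return self.adapter(hidden_states, residual, self.layer_norm)


if __name__ == "__main__":
    model = TransformerOutput()
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = model.to(device)
    if torch.cuda.device_count() > 1:
        model = nn.DataParallel(model)
    x = torch.randn(4, 16, 768, device=device)
    print(model(x).shape)  # torch.Size([4, 16, 768])
```

The key design choice in this sketch is that the adapter never owns or stores the layer norm; it only ever operates on the instance passed in by the module that does own it, which is safe under module replication.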

calpt changed the title from "Ensure reference adapter layer to layer norm is always correct" to "Ensure reference from adapter layer to layer norm is always correct" on Sep 3, 2021
calpt requested a review from hSterz on September 3, 2021 14:43
calpt marked this pull request as ready for review on September 3, 2021 14:43
calpt merged commit 046c658 into adapter-hub:master on Sep 7, 2021
calpt deleted the fix/data_parallel branch on September 7, 2021 16:06
Development

Successfully merging this pull request may close these issues:

AdapterTransformer with Pytorch 1.9 fails for multi-GPU (#227)