This repository has been archived by the owner on Nov 3, 2023. It is now read-only.

Attention dropout in transformers. #1379

Merged
merged 1 commit on Jan 23, 2019

Conversation

stephenroller (Contributor)

No description provided.
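The PR itself gives no description, so the following is only a minimal sketch of what attention dropout typically means in a transformer: dropout applied to the post-softmax attention weights before they are used to combine the value vectors, which discourages the model from relying on any single attended position. This is an illustrative example, not the ParlAI implementation; the class and the `attention_dropout` argument name are assumptions.

```python
# Illustrative sketch of attention dropout (not the ParlAI code itself).
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class ScaledDotProductAttention(nn.Module):
    def __init__(self, attention_dropout: float = 0.1):
        super().__init__()
        # Dropout module applied to the attention weight matrix.
        self.attn_dropout = nn.Dropout(attention_dropout)

    def forward(self, query, key, value, mask=None):
        # query/key/value: (batch, heads, seq_len, head_dim)
        scores = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(query.size(-1))
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float('-inf'))
        weights = F.softmax(scores, dim=-1)
        # Attention dropout: randomly zero some attention weights during training.
        weights = self.attn_dropout(weights)
        return torch.matmul(weights, value)
```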

@klshuster (Contributor) left a comment

👍
