
Recompute: fix bug with transformer attention mask #34664

Commits on Aug 6, 2021

  1. Commit 6dcdbc5
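
The PR title indicates the fix concerns how a transformer attention mask is handled under recompute (activation checkpointing). For context only, below is a minimal sketch of passing an attention mask through a checkpointed attention call; it uses PyTorch's `torch.utils.checkpoint` as a stand-in, and the function name `masked_attention`, shapes, and mask construction are illustrative assumptions, not the code changed by this PR.

```python
import torch
import torch.nn.functional as F
from torch.utils.checkpoint import checkpoint

def masked_attention(q, k, v, attn_mask):
    # Scaled dot-product attention with an additive mask:
    # masked positions carry large negative values so softmax zeroes them out.
    scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
    scores = scores + attn_mask
    weights = F.softmax(scores, dim=-1)
    return weights @ v

q = torch.randn(2, 4, 8, requires_grad=True)
k = torch.randn(2, 4, 8, requires_grad=True)
v = torch.randn(2, 4, 8, requires_grad=True)
# Causal mask: 0 where attention is allowed, -inf above the diagonal.
mask = torch.triu(torch.full((4, 4), float("-inf")), diagonal=1)

# Under recompute, the forward pass runs again during backward, so the mask
# must be forwarded to the recomputed call exactly as in the original one.
out = checkpoint(masked_attention, q, k, v, mask, use_reentrant=False)
out.sum().backward()
```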