
PR #8402: [XLA:CPU] [oneDNN] Enable Dot op (MatMul) in BF16 Type #8507

Merged: 1 commit merged into main on Jan 17, 2024

Conversation

copybara-service[bot]

PR #8402: [XLA:CPU] [oneDNN] Enable Dot op (MatMul) in BF16 Type

Imported from GitHub PR #8402

This PR adds BF16 support to the oneDNN MatMul op by allowing the Dot op to retain the BF16 type until it is handled by the OneDnnMatMulRewriter pass.
Copybara import of the project:

--
4f7ddbc by Mahmoud Abuzaina mahmoud.abuzaina@intel.com:

Enable MatMul op in BF16

Merging this change closes #8402

FUTURE_COPYBARA_INTEGRATE_REVIEW=#8402 from Intel-tensorflow:mabuzain/enable-bf16-matmul 4f7ddbc
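For readers unfamiliar with the numerics involved, the sketch below illustrates (in plain Python, not XLA or oneDNN code) what a BF16 MatMul does: inputs are rounded to bfloat16 (the top 16 bits of a float32), while products are accumulated at higher precision. The function names are hypothetical and only for illustration; the actual kernel is supplied by oneDNN.

```python
import struct

def f32_to_bf16_bits(x: float) -> int:
    # Reinterpret the float32 bit pattern and keep the upper 16 bits,
    # with round-to-nearest-even -- this is how BF16 is derived from F32.
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    rounding_bias = 0x7FFF + ((bits >> 16) & 1)
    return ((bits + rounding_bias) >> 16) & 0xFFFF

def bf16_round(x: float) -> float:
    # Widen back to float32 by zero-filling the low 16 bits.
    return struct.unpack(">f", struct.pack(">I", f32_to_bf16_bits(x) << 16))[0]

def bf16_matmul(a, b):
    # Naive MatMul sketch: operands are rounded to BF16, while the
    # dot-product accumulation happens at full (float) precision,
    # mirroring the usual BF16-in / F32-accumulate scheme.
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(bf16_round(a[i][t]) * bf16_round(b[t][j]) for t in range(k))
             for j in range(m)] for i in range(n)]

a = [[1.0, 2.0], [3.0, 4.0]]
b = [[5.0, 6.0], [7.0, 8.0]]
print(bf16_matmul(a, b))  # [[19.0, 22.0], [43.0, 50.0]]
```

Small integers are exactly representable in BF16, so the toy result above is exact; with arbitrary inputs the 8-bit mantissa of BF16 introduces rounding error in the operands, which is the trade-off this PR makes available for CPU MatMuls.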

copybara-service bot force-pushed the test_598823232 branch 7 times, most recently from 178b7c9 to ce5a743, on January 17, 2024 12:19
COPYBARA_INTEGRATE_REVIEW=#8402 from Intel-tensorflow:mabuzain/enable-bf16-matmul 4f7ddbc
PiperOrigin-RevId: 599132673
copybara-service bot closed this pull request by merging all changes into main in 1ecfc4f on Jan 17, 2024
copybara-service bot deleted the test_598823232 branch on January 17, 2024 12:49