
flash_attention: support also cross attention. #10462


Re-run triggered: December 2, 2024 09:07
Status: Success
Total duration: 1h 55m 31s
Artifacts: 3
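The PR title refers to extending a flash-attention kernel so that queries can attend to a separate key/value sequence (cross attention), rather than only to their own sequence (self attention). As an illustration only, and not the PR's actual kernel, here is a minimal NumPy sketch of plain cross attention where the query length and key/value length differ:

```python
import numpy as np

def cross_attention(q, k, v):
    # q: (Lq, d); k, v: (Lkv, d). Lq and Lkv may differ, which is the
    # defining property of cross attention vs. self attention.
    scores = q @ k.T / np.sqrt(q.shape[-1])           # (Lq, Lkv)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ v                                # (Lq, d)

rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))    # 4 query positions, model dim 8
kv = rng.standard_normal((6, 8))   # 6 key/value positions
out = cross_attention(q, kv, kv)
print(out.shape)  # (4, 8): one output row per query position
```

A flash-attention implementation computes the same result but tiles the softmax so the full (Lq, Lkv) score matrix is never materialized; supporting cross attention means the kernel must accept differing query and key/value lengths.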

build_and_test.yml

on: pull_request
get-torch-commit (2s)
Build PyTorch/XLA / build (1h 9m)
Matrix: CPU tests / test

Artifacts

Produced during runtime
Name              Size
cpp-test-bin      661 MB
github-pages      5.67 MB
torch-xla-wheels  222 MB