Commit aef6560
skip cuda
CuriousPanCake committed Nov 13, 2024
1 parent aabe792 commit aef6560
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/job_pytorch_layer_tests.yml
@@ -122,8 +122,8 @@ jobs:
       - name: Install flash_attn module
         run: |
           # due to flash_attn issues, it needs to be installed separately from other packages
-          # pip install flash_attn --no-build-isolation
-          pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.0.post1/flash_attn-2.7.0.post1+cu12torch2.5cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
+          export FLASH_ATTENTION_SKIP_CUDA_BUILD=TRUE
+          pip install flash_attn --no-build-isolation
       - name: PyTorch Layer Tests
         if: ${{ fromJSON(inputs.affected-components).PyTorch_FE.test && runner.arch != 'ARM64' }} # Ticket: 126287, 142196
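The change swaps the pinned prebuilt CUDA wheel for a plain source install that skips the CUDA build entirely. A minimal sketch of reproducing the new install step in a local shell (assuming a Python environment that already has torch installed, which the flash-attention build otherwise requires):

    # Sketch: replicate the updated workflow step outside of CI.
    # FLASH_ATTENTION_SKIP_CUDA_BUILD=TRUE asks the flash-attention build to skip
    # compiling its CUDA extensions, so no CUDA toolkit is needed at install time.
    export FLASH_ATTENTION_SKIP_CUDA_BUILD=TRUE
    # --no-build-isolation builds against the torch already present in the
    # environment instead of a fresh isolated build venv.
    pip install flash_attn --no-build-isolation

This matches the commit title "skip cuda": presumably the PyTorch layer tests only need the flash_attn Python package to be importable, not its compiled CUDA kernels, so the build can skip them instead of downloading a version-pinned prebuilt wheel.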