Commit 32aaabb: remove flash_attn

CuriousPanCake committed Nov 14, 2024
1 parent aef6560

Showing 2 changed files with 1 addition and 7 deletions.
.github/workflows/job_pytorch_layer_tests.yml (6 changes: 0 additions & 6 deletions)

@@ -119,12 +119,6 @@ jobs:
           # pytorch test requirements
           python3 -m pip install -r ${{ env.INSTALL_TEST_DIR }}/requirements_pytorch
-      - name: Install flash_attn module
-        run: |
-          # due to flash_attn issues, it needs to be installed separately from other packages
-          export FLASH_ATTENTION_SKIP_CUDA_BUILD=TRUE
-          pip install flash_attn --no-build-isolation
       - name: PyTorch Layer Tests
         if: ${{ fromJSON(inputs.affected-components).PyTorch_FE.test && runner.arch != 'ARM64' }} # Ticket: 126287, 142196
         # due to CVS-152795, parallel run is not possible on Windows
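For reference, the step removed here worked around flash_attn's build quirks: its setup imports torch at build time, so it cannot be installed under pip's default build isolation, and the CUDA extension build is skipped on runners without a CUDA toolchain. A minimal standalone sketch of the same pattern, assuming a shell where pip and torch are already installed:

    # Sketch of the removed install step, run outside CI.
    # FLASH_ATTENTION_SKIP_CUDA_BUILD=TRUE asks flash_attn's setup to skip
    # compiling its CUDA extension, which runners without CUDA cannot build.
    export FLASH_ATTENTION_SKIP_CUDA_BUILD=TRUE
    # --no-build-isolation lets the build see the already-installed torch
    # instead of a fresh isolated build environment that lacks it.
    pip install flash_attn --no-build-isolation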
tests/requirements_pytorch (2 changes: 1 addition & 1 deletion)

@@ -54,4 +54,4 @@ rjieba==0.1.11
 # - katuni4ka/tiny-random-qwen
 # - katuni4ka/tiny-random-internlm2
 transformers_stream_generator==0.0.5
-einops==0.8.0
+einops==0.8.0

The einops pin itself is unchanged; the paired deletion and addition of the same line most likely reflects a newline-at-end-of-file fix.
