Merged
2 changes: 1 addition & 1 deletion cmake/external_projects/vllm_flash_attn.cmake
@@ -38,7 +38,7 @@ else()
FetchContent_Declare(
vllm-flash-attn
        GIT_REPOSITORY https://github.com/vllm-project/flash-attention.git
-       GIT_TAG ee4d25bd84e0cbc7e0b9b9685085fd5db2dcb62a
+       GIT_TAG 4695e6bed5366c41e28c06cd86170166e4f43d00
Contributor comment (severity: high):
For better maintainability, it's good practice to add a comment explaining what this Git commit hash corresponds to. That way, other developers can see why this specific commit is pinned without having to look it up. You could mention the PRs this commit includes, as you did in the pull request description.

          # Includes fixes for FA3 build on CUDA 12.3, 12.4, and 12.5.
          # See https://github.com/vllm-project/flash-attention/pull/94
          GIT_TAG 4695e6bed5366c41e28c06cd86170166e4f43d00

GIT_PROGRESS TRUE
# Don't share the vllm-flash-attn build between build types
BINARY_DIR ${CMAKE_BINARY_DIR}/vllm-flash-attn
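Applied in context, the reviewer's suggestion would look roughly like the sketch below. The repository URL, commit hash, `GIT_PROGRESS`, and `BINARY_DIR` lines come from this diff; the surrounding `include(FetchContent)` and `FetchContent_MakeAvailable` calls are assumptions about how the real `vllm_flash_attn.cmake` consumes the declaration.

```cmake
# Minimal sketch: pin an external dependency to an exact commit and
# document why that commit was chosen, per the review comment.
include(FetchContent)

FetchContent_Declare(
  vllm-flash-attn
  GIT_REPOSITORY https://github.com/vllm-project/flash-attention.git
  # Includes fixes for FA3 build on CUDA 12.3, 12.4, and 12.5.
  # See https://github.com/vllm-project/flash-attention/pull/94
  GIT_TAG 4695e6bed5366c41e28c06cd86170166e4f43d00
  GIT_PROGRESS TRUE
  # Don't share the vllm-flash-attn build between build types
  BINARY_DIR ${CMAKE_BINARY_DIR}/vllm-flash-attn
)

# Assumed call site: makes the fetched project's targets available.
FetchContent_MakeAvailable(vllm-flash-attn)
```

Pinning `GIT_TAG` to a full 40-character hash (rather than a branch or tag name) makes the fetch reproducible; the adjacent comment is what keeps that hash auditable later.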