Commit 4f7b00d (1 parent: 7453652)
vllm/attention/backends/utils.py
```diff
@@ -612,5 +612,5 @@ def flash_attn_version():
         return fa_version
 
     VLLM_FLASH_ATTN_VERSION = flash_attn_version()
-except ImportError:
+except (ImportError, AssertionError):
     VLLM_FLASH_ATTN_VERSION = None
```
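The change widens the exception handler so that a failed `assert` inside the version probe disables flash-attn gracefully instead of crashing at import time. A minimal sketch of the pattern, with `_detect_version` standing in for the real `flash_attn_version()` helper (the assertion message and version values here are illustrative, not vLLM's actual checks):

```python
# Sketch of the optional-dependency probe pattern from the diff:
# the probe may raise ImportError (library missing) OR AssertionError
# (library present but an internal version check fails), and both
# should fall back to "feature unavailable" rather than crash.

def _detect_version() -> int:
    """Stand-in for flash_attn_version(); may assert on unsupported builds."""
    fa_version = 2  # hypothetical detected version
    assert fa_version in (2, 3), "unsupported flash-attn version"
    return fa_version

try:
    FLASH_ATTN_VERSION = _detect_version()
except (ImportError, AssertionError):
    # Catching AssertionError too is the commit's fix: a failed version
    # assertion now degrades to None instead of aborting module import.
    FLASH_ATTN_VERSION = None
```

Catching only `ImportError` (the old behavior) would let the `AssertionError` from the version check propagate out of the module's top-level `try` block, breaking import of the whole backend.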