[usability] add hint for flash_attn
wheresmyhair committed Mar 6, 2025
1 parent 0faa453 commit 43ff307
Showing 1 changed file with 3 additions and 1 deletion.
4 changes: 3 additions & 1 deletion src/lmflow/args.py
```diff
@@ -363,7 +363,9 @@ def __post_init__(self):
         if self.use_flash_attention:
             if not is_flash_attn_available():
                 self.use_flash_attention = False
-                logger.warning("Flash attention is not available in the current environment. Disabling flash attention.")
+                logger.warning("Flash attention is not available in the current environment. Disabling flash attention. If you want to use flash attention, please install by `pip install -e '.[flash_attn]'`.")
+        else:
+            logger.warning("Flash attention is not enabled. We recommend enabling flash attention by `--use_flash_attention 1` for better performance.")

         if self.lora_target_modules is not None:
             self.lora_target_modules: List[str] = split_args(self.lora_target_modules)
```
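The pattern this patch refines, checking whether an optional dependency is importable and falling back with an actionable warning, can be sketched as a standalone snippet. This is a minimal illustration, not LMFlow's actual code: the real `is_flash_attn_available()` helper lives elsewhere in the repository, and the `resolve_flash_attention` function below is a hypothetical stand-in for the `__post_init__` logic shown in the diff.

```python
import importlib.util
import logging

logger = logging.getLogger(__name__)


def is_flash_attn_available() -> bool:
    # Hypothetical re-implementation: report whether the optional
    # `flash_attn` package could be imported, without importing it.
    return importlib.util.find_spec("flash_attn") is not None


def resolve_flash_attention(use_flash_attention: bool) -> bool:
    # Sketch of the patched __post_init__ logic: silently requested
    # but unavailable -> disable with an install hint; not requested
    # -> hint that enabling it may improve performance.
    if use_flash_attention:
        if not is_flash_attn_available():
            logger.warning(
                "Flash attention is not available in the current environment. "
                "Disabling flash attention. If you want to use flash attention, "
                "please install by `pip install -e '.[flash_attn]'`."
            )
            return False
    else:
        logger.warning(
            "Flash attention is not enabled. We recommend enabling flash "
            "attention by `--use_flash_attention 1` for better performance."
        )
    return use_flash_attention
```

The key design point of the commit is that both branches now warn: users who forgot to install the extra get the exact `pip` command, and users who never asked for flash attention learn the flag exists.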
