
Adjust flash-attn wheel selection for torch pin #16

Merged
cklxx merged 2 commits into codex/optimize-training-time-for-context-parallelism from modify-build_conda.sh-for-direct-installation-wkwuyr
Dec 22, 2025

Conversation


@cklxx cklxx commented Dec 22, 2025

Summary

  • Allow overriding the torch pin and derive the matching flash-attn release wheel for the chosen torch minor version
  • Clarify the default torch 2.4.1 selection for 4090 setups and simplify the slime install block
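The override-and-derive logic described above can be sketched as shell along these lines. This is a hypothetical sketch, not the actual build_conda.sh: the variable names (TORCH_VERSION, TORCH_MINOR, FLASH_ATTN_WHEEL_TAG) and the wheel-tag mapping are assumptions for illustration.

```shell
set -euo pipefail

# Allow overriding the torch pin via the environment;
# default to 2.4.1 for 4090 setups (names are assumptions).
TORCH_VERSION="${TORCH_VERSION:-2.4.1}"

# Derive the torch minor series (e.g. "2.4" from "2.4.1")
# to pick a matching flash-attn release wheel.
TORCH_MINOR="${TORCH_VERSION%.*}"

# Hypothetical mapping from torch minor series to a flash-attn wheel tag.
case "$TORCH_MINOR" in
  2.4) FLASH_ATTN_WHEEL_TAG="torch2.4" ;;
  2.5) FLASH_ATTN_WHEEL_TAG="torch2.5" ;;
  *)
    echo "No known flash-attn wheel for torch $TORCH_VERSION" >&2
    exit 1
    ;;
esac

echo "torch=$TORCH_VERSION flash-attn wheel tag=$FLASH_ATTN_WHEEL_TAG"
```

Deriving the wheel tag from the minor series (rather than the full pin) keeps patch-level overrides like `TORCH_VERSION=2.4.0` working without a new mapping entry.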

Testing

  • ruff check .
  • pytest (fails: ModuleNotFoundError for slime and test_qwen* modules in current environment during collection)

Codex Task

…nto modify-build_conda.sh-for-direct-installation-wkwuyr
cklxx merged commit afad254 into codex/optimize-training-time-for-context-parallelism on Dec 22, 2025
