Conversation

@danielvegamyhre (Contributor)

Fixes #1531

Test plan

  • NGPU=8 CONFIG_FILE="./torchtitan/models/deepseek_v3/train_configs/debug_model.toml" ./run_train.sh --model.name deepseek_v3 --parallelism.tensor_parallel_degree 2

cc @tianyu-l

The meta-cla bot added the CLA Signed label (managed by the Meta Open Source bot) on Aug 6, 2025.
@danielvegamyhre (Contributor, Author)

fyi @ruisizhang123 as well

danielvegamyhre changed the title from "[dsv3] store expert weights such that we can transpose before grouped…" to "[dsv3] store expert weights such that we can transpose before grouped gemm to have column-major memory layout" on Aug 6, 2025.
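For context on the retitled change: the layout trick is to store each expert's weight pre-transposed, so that the transpose taken just before the grouped GEMM is only a stride swap that yields a column-major matrix, rather than a materialized copy. Below is a minimal sketch of the idea; the shapes and variable names are illustrative and not taken from this PR's diff, and the column-major preference is a general property of many grouped-GEMM kernels rather than a claim about any specific torchtitan code path.

```python
import torch

num_experts, dim, hidden = 4, 256, 512

# Store each expert's weight with shape (hidden, dim), contiguous/row-major.
w = torch.randn(num_experts, hidden, dim)

# The grouped GEMM consumes the logical shape (dim, hidden). Since
# transpose() only swaps strides, each expert matrix in w_t is now
# column-major in memory -- no copy is materialized.
w_t = w.transpose(-2, -1)

assert w_t.shape == (num_experts, dim, hidden)
assert w_t.stride(-2) == 1  # unit stride down each column => column-major
```

The alternative (storing weights in the logical `(dim, hidden)` shape and calling `.contiguous()` after transposing) would pay for a full copy of every expert's weight on each forward pass.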
@tianyu-l (Contributor) left a comment

I think this should be covered by #1534

@danielvegamyhre (Contributor, Author)

> I think this should be covered by #1534

Ok, we can close this in favor of that more comprehensive PR.

Merging this pull request may close: bugs in DeepSeek v3 FSDP + TP