
Commit 7ddc21d

EddyLXJ authored and meta-codesync[bot] committed
Fix fused params for st publish umia hstu model (#3461)
Summary:
Pull Request resolved: #3461

As title: for st publish, we need to pass the TBE config into the sharding model.

Reviewed By: bobbyliujb, emlin

Differential Revision: D79698284

fbshipit-source-id: 8236ab2df8c36c9b735e4c239d48b08f677ebe95
1 parent 781603c commit 7ddc21d

File tree

1 file changed: 1 addition, 0 deletions


torchrec/distributed/batched_embedding_kernel.py

Lines changed: 1 addition & 0 deletions
@@ -157,6 +157,7 @@ def _populate_ssd_tbe_params(config: GroupedEmbeddingConfig) -> Dict[str, Any]:
     Construct SSD TBE params dict from config and fused params dict.
     """
     fused_params = config.fused_params or {}
+    logger.info(f"Populate_ssd_tbe_params with {fused_params=}")
 
     ssd_tbe_params: Dict[str, Any] = {}