
Commit c976c97

tianyu-l authored and githubsgi committed
[refactor] graduate custom_config_module and unify args/config naming (pytorch#1871)
In the past, the terms "args" and "config" have been used interchangeably. To make the distinction unambiguous, torchtitan now uses:

- "args", as in `ModelArgs`, to refer to parameters used to define a model in model code
- "config", as in `JobConfig`, to refer to configurable training job options used in the training script

This PR also moves `custom_args_module` (which should be `custom_config_module` under the naming rule above) from `Experimental` to `Job`, as it is extensively used by various models in torchtitan, especially those in the `experiments` folder.
1 parent ac71851 commit c976c97
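The rename described in the commit message can be illustrated with a before/after sketch of the TOML option (the `[experimental]` table placement of the old option follows the commit message; exact surrounding keys are assumptions):

```toml
# Before: named for "args" and housed under the experimental table
[experimental]
custom_args_module = "torchtitan.components.ft.config"

# After: renamed to match the JobConfig naming rule and graduated to [job]
[job]
custom_config_module = "torchtitan.components.ft.config"
```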

File tree

2 files changed (+0, −6 lines)


torchtitan/models/llama3_ft/__init__.py

Lines changed: 0 additions & 1 deletion
@@ -22,7 +22,6 @@
     "Transformer",
     "llama3_configs",
 ]
->>>>>>> cd16507b (move PP API to model agnostic file (#1868))
 
 
 def get_train_spec() -> TrainSpec:
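Both hunks in this commit delete stray Git merge conflict markers that survived earlier rebases. A repository hygiene check for such leftovers might look like the following sketch (a hypothetical helper, not part of torchtitan):

```python
import re
from pathlib import Path

# Matches the three Git conflict marker shapes at the start of a line:
# "<<<<<<< <label>", a bare "=======", and ">>>>>>> <label>".
CONFLICT_RE = re.compile(r"^(<{7} |={7}$|>{7} )")

def find_conflict_markers(root: str) -> list[tuple[str, int, str]]:
    """Return (path, line_number, line) for leftover conflict markers under root."""
    hits = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for i, line in enumerate(text.splitlines(), 1):
            if CONFLICT_RE.match(line):
                hits.append((str(path), i, line.rstrip()))
    return hits
```

Running such a check in CI would have caught both the `>>>>>>> cd16507b` line in `__init__.py` and the `<<<<<<< HEAD` block in `debug_model.toml` before they landed.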

torchtitan/models/llama3_ft/train_configs/debug_model.toml

Lines changed: 0 additions & 5 deletions
@@ -1,13 +1,8 @@
 [job]
 dump_folder = "./outputs"
-<<<<<<< HEAD
 description = "Llama 3 fault-tolerant debug training"
 print_config = false
 custom_config_module = "torchtitan.components.ft.config"
-=======
-description = "Llama 3 debug training"
-print_args = false
->>>>>>> f63037ff (add script to train with ft (#1812))
 
 [profiling]
 enable_profiling = true
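The retained `custom_config_module = "torchtitan.components.ft.config"` value is a dotted Python import path. A loader for such an option might look roughly like this minimal sketch using `importlib` (the function name is an assumption; torchtitan's actual loading logic may differ):

```python
import importlib
from types import ModuleType

def load_custom_config_module(dotted_path: str) -> ModuleType:
    """Import the module named by a [job].custom_config_module entry.

    `dotted_path` is a standard Python import path such as
    "torchtitan.components.ft.config".
    """
    return importlib.import_module(dotted_path)

# Demonstration with a stdlib module (torchtitan itself is not assumed installed):
json_mod = load_custom_config_module("json")
print(hasattr(json_mod, "loads"))  # → True
```

An invalid path raises `ModuleNotFoundError`, which gives the training script a natural place to report a bad config value.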

0 commit comments
