
Commit 6648707

tianyu-l authored and githubsgi committed
[refactor] graduate custom_config_module and unify args/config naming (pytorch#1871)
In the past, the terms "args" and "config" have been used interchangeably. To make the distinction unambiguous, in torchtitan we now use:
- "args", as in `ModelArgs`, to refer to parameters that define a model in model code
- "config", as in `JobConfig`, to refer to configurable training job settings used in the training script

This PR also moves `custom_args_module` (which should be named `custom_config_module` under the rule above) from `Experimental` to `Job`, as it is used extensively by various models in torchtitan, especially those in the `experiments` folder.
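Concretely, the rename amounts to moving the key between TOML sections. A minimal before/after sketch, assuming the old key previously lived under the `[experimental]` section (the `[job]` placement and the module path match the debug_model.toml diff below):

Before:
[experimental]
custom_args_module = "torchtitan.components.ft.config"

After:
[job]
custom_config_module = "torchtitan.components.ft.config"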
1 parent d0b1987 commit 6648707

3 files changed: 0 additions, 10 deletions

torchtitan/experiments/simple_fsdp/llama3/__init__.py

Lines changed: 0 additions & 4 deletions
@@ -10,11 +10,7 @@
 from torchtitan.components.tokenizer import build_hf_tokenizer
 from torchtitan.datasets.hf_datasets import build_hf_dataloader
 from torchtitan.distributed.pipeline_parallel import pipeline_llm
-<<<<<<< HEAD
 from torchtitan.models.llama3 import llama3_args
-=======
-from torchtitan.models.llama3 import llama3_configs
->>>>>>> cd16507b (move PP API to model agnostic file (#1868))
 from torchtitan.protocols.train_spec import TrainSpec

 from .model import SimpleFSDPTransformer

torchtitan/models/llama3_ft/__init__.py

Lines changed: 0 additions & 1 deletion
@@ -22,7 +22,6 @@
     "Transformer",
     "llama3_configs",
 ]
->>>>>>> cd16507b (move PP API to model agnostic file (#1868))


 def get_train_spec() -> TrainSpec:

torchtitan/models/llama3_ft/train_configs/debug_model.toml

Lines changed: 0 additions & 5 deletions
@@ -1,13 +1,8 @@
 [job]
 dump_folder = "./outputs"
-<<<<<<< HEAD
 description = "Llama 3 fault-tolerant debug training"
 print_config = false
 custom_config_module = "torchtitan.components.ft.config"
-=======
-description = "Llama 3 debug training"
-print_args = false
->>>>>>> f63037ff (add script to train with ft (#1812))

 [profiling]
 enable_profiling = true
