
Commit c219a74

fix: Don't require efficiency_config for torchtune (#2104)
# What does this PR do?

Revert a change that mistakenly forced `efficiency_config` on torchtune provider users.

```
fix: Don't require efficiency_config for torchtune

It was enforced by mistake when 0751a96 merged. The other asserts make sense
because that code was, potentially, written to always expect a non-None value.
That is not the case for efficiency_config.
```

Signed-off-by: Ihar Hrachyshka <ihar.hrachyshka@gmail.com>
1 parent 7377a5c commit c219a74

File tree

1 file changed: 0 additions (+), 3 deletions (-)

llama_stack/providers/inline/post_training/torchtune/recipes/lora_finetuning_single_device.py

Lines changed: 0 additions & 3 deletions

```diff
@@ -39,7 +39,6 @@
 from llama_stack.apis.post_training import (
     Checkpoint,
     DataConfig,
-    EfficiencyConfig,
     LoraFinetuningConfig,
     OptimizerConfig,
     QATFinetuningConfig,
@@ -90,8 +89,6 @@ def __init__(
     ) -> None:
         assert isinstance(training_config.data_config, DataConfig), "DataConfig must be initialized"

-        assert isinstance(training_config.efficiency_config, EfficiencyConfig), "EfficiencyConfig must be initialized"
-
         self.job_uuid = job_uuid
         self.training_config = training_config
         if not isinstance(algorithm_config, LoraFinetuningConfig):
```
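
To illustrate the effect of the change: `efficiency_config` may now legitimately be `None` for torchtune users, so recipe code that consumes it should guard on `None` rather than assert on its type. The sketch below is hypothetical and not code from the repository; it only assumes the `DataConfig` and `EfficiencyConfig` classes imported in the diff above, and `check_training_config` is a made-up helper name.

```python
from typing import Optional

from llama_stack.apis.post_training import DataConfig, EfficiencyConfig


def check_training_config(
    data_config: DataConfig,
    efficiency_config: Optional[EfficiencyConfig],
) -> None:
    """Hypothetical helper showing the optional-config pattern after this commit."""
    # The data config stays mandatory, matching the assert kept in the recipe.
    assert isinstance(data_config, DataConfig), "DataConfig must be initialized"

    # efficiency_config may be None for users who never set it, so guard
    # instead of asserting.
    if efficiency_config is not None:
        # Apply any efficiency-related options here.
        pass
```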
