πŸ› Style Transfer: use 1 GPU because NCCL freeze using VLLM inference.
simonmeoni committed Oct 15, 2024
1 parent 7c13bcc commit 37d42fe
Showing 4 changed files with 6 additions and 6 deletions.
lib/style-transfer/configs/rb_gen/dpo/default.yaml: 3 additions, 3 deletions
@@ -1,14 +1,14 @@
 training_args:
   _target_: trl.DPOConfig
-  per_device_train_batch_size: 4
+  per_device_train_batch_size: 1
   logging_steps: 2
   save_steps: 50
-  gradient_accumulation_steps: 4
+  gradient_accumulation_steps: 16
   gradient_checkpointing: false
   learning_rate: 2e-6
   weight_decay: 1e-7
   eval_strategy: "no"
-  num_train_epochs: 15
+  num_train_epochs: 10
   output_dir: "models/dpo"
   optim: "adafactor"
   save_only_model: true
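
Side note (not part of the commit): the two batching changes offset each other, so the effective per-device batch size is unchanged while peak memory per optimizer step drops. A minimal arithmetic check in Python:

# effective batch = per_device_train_batch_size * gradient_accumulation_steps
before = 4 * 4   # old DPO config
after = 1 * 16   # new DPO config
assert before == after == 16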
lib/style-transfer/configs/rb_gen/gen/default.yaml: 1 addition, 1 deletion
@@ -6,9 +6,9 @@ sampling_params:
   top_p: 1.0
 llm:
   _target_: vllm.LLM
   _partial_: true
   model: "models/merged/"
   speculative_model: "[ngram]"
   num_speculative_tokens: 5
   ngram_prompt_lookup_max: 4
   use_v2_block_manager: true
+  tensor_parallel_size: 1
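
For context, a rough sketch of how this config is consumed: because _partial_: true is set, Hydra's instantiate returns a functools.partial of vllm.LLM rather than a built engine, and tensor_parallel_size: 1 now travels with the config instead of being injected at the call site. The standalone snippet below is an illustration under those assumptions, not code from the repository:

import hydra
from omegaconf import OmegaConf

# Recreate cfg.gen.llm roughly as it looks after this commit (illustrative only).
llm_cfg = OmegaConf.create({
    "_target_": "vllm.LLM",
    "_partial_": True,
    "model": "models/merged/",
    "speculative_model": "[ngram]",
    "num_speculative_tokens": 5,
    "ngram_prompt_lookup_max": 4,
    "use_v2_block_manager": True,
    "tensor_parallel_size": 1,  # single GPU, sidestepping the NCCL freeze seen with multi-GPU vLLM inference
})

llm_factory = hydra.utils.instantiate(llm_cfg)  # a functools.partial(vllm.LLM, ...) because _partial_ is true
llm = llm_factory()  # building the engine here is an assumption; the repository may invoke the partial later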
lib/style-transfer/configs/rb_gen/train.yaml: 1 addition, 1 deletion
@@ -36,4 +36,4 @@ dataset:
   num_generated_samples: 6

 seed: 666
-max_steps: 1
+max_steps: 2
lib/style-transfer/style_transfer/rb_gen/steps/generate.py: 1 addition, 1 deletion
@@ -58,7 +58,7 @@ def generate(
     gc.collect()
     torch.cuda.empty_cache()
     logging.info("🫧 Building VLLM Pipeline ...")
-    llm = hydra.utils.instantiate(cfg.gen.llm, tensor_parallel_size=torch.cuda.device_count())
+    llm = hydra.utils.instantiate(cfg.gen.llm)

     logging.info("πŸŽ‰ And it's done!")

