Commit

no max batch tokens
AlexPiche committed Dec 22, 2024
1 parent 06876e2 commit 210a933
Showing 1 changed file with 0 additions and 1 deletion.
conf/rl_gsm8k.yaml (1 change: 0 additions & 1 deletion)
@@ -41,7 +41,6 @@ vllm_config:
   --gpu-memory-utilization: 0.9
   # VLLM get log probs OOM https://github.com/vllm-project/vllm/issues/5907
   --enable-chunked-prefill: ""
-  --max-num-batched-tokens: 256
   --enable-prefix-caching: ""
 ref_vllm_kwargs:
   --download-dir: /mnt/llmd/base_models/
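With `--max-num-batched-tokens: 256` removed, vLLM falls back to its own default for that setting instead of capping prefill batches at 256 tokens. As a rough illustration only (the helper name and the flattening logic are assumptions for this sketch, not the repo's actual code), kwargs mappings like `vllm_config` above are typically turned into CLI arguments, where an empty-string value denotes a boolean flag:

```python
# Hypothetical sketch: flatten a vllm_config-style mapping into CLI argv.
# An empty-string value is treated as a boolean flag (no argument follows it).
def kwargs_to_argv(kwargs: dict) -> list:
    argv = []
    for flag, value in kwargs.items():
        argv.append(flag)
        if value != "":
            argv.append(str(value))
    return argv

# The kwargs as they stand after this commit (no --max-num-batched-tokens).
vllm_kwargs = {
    "--gpu-memory-utilization": 0.9,
    "--enable-chunked-prefill": "",
    "--enable-prefix-caching": "",
}
print(kwargs_to_argv(vllm_kwargs))
```

Under this (assumed) convention, dropping the key means the flag is simply never passed, so the server uses its built-in default.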
