Fixing vllm build (#6433)
* Fixing torch version for vllm
oandreeva-nv authored Oct 16, 2023
1 parent 62db791 commit e11ae14
Showing 1 changed file with 3 additions and 1 deletion.
build.py (3 additions, 1 deletion)
@@ -1366,9 +1366,11 @@ def dockerfile_prepare_container_linux(argmap, backends, enable_gpu, target_mach
     if "vllm" in backends:
         # [DLIS-5606] Build Conda environment for vLLM backend
         # Remove Pip install once vLLM backend moves to Conda environment.
+        # [DLIS-5650] Pre-installing torch 2.0.1, since vllm 0.2
+        # requires torch >= 2.0.0, but it doesn't work with torch 2.1.0.
         df += """
 # vLLM needed for vLLM backend
-RUN pip3 install vllm=={}
+RUN pip3 install torch==2.0.1 vllm=={}
 """.format(
             TRITON_VERSION_MAP[FLAGS.version][7]
         )
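For context, a minimal sketch of how this part of `build.py` expands its template into Dockerfile text. The `render_vllm_install` helper and the `"0.2.1"` version string are hypothetical stand-ins; in the real script the vLLM version comes from `TRITON_VERSION_MAP[FLAGS.version][7]` and the fragment is appended to a larger Dockerfile string:

```python
# Sketch: the format string from the diff, with torch pinned to 2.0.1
# ahead of vLLM so pip does not pull in the incompatible torch 2.1.0.
TEMPLATE = """
# vLLM needed for vLLM backend
RUN pip3 install torch==2.0.1 vllm=={}
"""


def render_vllm_install(vllm_version: str) -> str:
    """Return the Dockerfile fragment with pinned torch and vLLM versions.

    vllm_version is a placeholder for the value build.py looks up in
    TRITON_VERSION_MAP for the current Triton release.
    """
    return TEMPLATE.format(vllm_version)


snippet = render_vllm_install("0.2.1")
print(snippet)
```

Pinning `torch==2.0.1` in the same `pip3 install` line matters: installed together, pip resolves both constraints at once, so vLLM's `torch >= 2.0.0` requirement is satisfied by 2.0.1 rather than the newest (broken, per the commit) 2.1.0.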
