doc/source/serve/tutorials: 1 file changed, +3 −3 lines

@@ -13,9 +13,11 @@ This example shows how to deploy DeepSeek R1 or V3 with Ray Serve LLM.
 To run this example, install the following:
 
 ```bash
-pip install "ray[llm]"
+pip install "ray[llm]==2.46.0"
 ```
 
+Note: Deploying DeepSeek-R1 requires at least 720GB of free disk space per worker node to store model weights.
+
 ## Deployment
 
 ### Quick Deployment
@@ -51,7 +53,6 @@ llm_config = LLMConfig(
         "max_model_len": 16384,
         "enable_chunked_prefill": True,
         "enable_prefix_caching": True,
-        "trust_remote_code": True,
     },
 )
 
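For context, the kwargs touched above live inside the tutorial's `LLMConfig`; dropping `trust_remote_code` means the engine loads DeepSeek through its built-in model support rather than executing custom modeling code shipped with the Hugging Face repository. Below is a minimal sketch of how the amended config might look end to end; the model source, parallelism sizes, and autoscaling values are illustrative assumptions that this diff does not show:

```python
# Sketch only: how the amended engine_kwargs could sit inside a complete
# LLMConfig. Model source, parallelism, and autoscaling values are assumptions.
from ray import serve
from ray.serve.llm import LLMConfig, build_openai_app

llm_config = LLMConfig(
    model_loading_config={
        "model_id": "deepseek",                     # assumed serving alias
        "model_source": "deepseek-ai/DeepSeek-R1",  # assumed Hugging Face repo
    },
    deployment_config={
        "autoscaling_config": {"min_replicas": 1, "max_replicas": 1},
    },
    engine_kwargs={
        "tensor_parallel_size": 8,    # assumed; size to your GPUs per node
        "pipeline_parallel_size": 2,  # assumed; size to your node count
        "max_model_len": 16384,
        "enable_chunked_prefill": True,
        "enable_prefix_caching": True,
        # "trust_remote_code": True  <- removed by this change
    },
)

app = build_openai_app({"llm_configs": [llm_config]})
serve.run(app)
```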
@@ -89,7 +90,6 @@ applications:
           max_model_len: 16384
           enable_chunked_prefill: true
           enable_prefix_caching: true
-          trust_remote_code: true
   import_path: ray.serve.llm:build_openai_app
   name: llm_app
   route_prefix: "/"
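Once either deployment path is running, the app exposes an OpenAI-compatible API at the configured route prefix. A minimal query sketch, assuming Serve listens on the default local port 8000 and the model is registered under the id `deepseek` (both assumptions, not shown in this diff):

```python
# Sketch only: query the OpenAI-compatible endpoint served by the app.
# The base URL, port, and model id below are assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="fake-key")

response = client.chat.completions.create(
    model="deepseek",  # assumed model_id configured in LLMConfig
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```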