
Commit

Replace dict merge with unpacking for Python 3.8 compatibility in vLLM worker (#2824)

Signed-off-by: rudeigerc <rudeigerc@gmail.com>
rudeigerc authored Dec 24, 2023
1 parent 05755c2 commit 0cf2886
Showing 1 changed file with 1 addition and 1 deletion.

fastchat/serve/vllm_worker.py
@@ -147,7 +147,7 @@ async def generate_stream(self, params):
         # Emit twice here to ensure a 'finish_reason' with empty content in the OpenAI API response.
         # This aligns with the behavior of model_worker.
         if request_output.finished:
-            yield (json.dumps(ret | {"finish_reason": None}) + "\0").encode()
+            yield (json.dumps({**ret, **{"finish_reason": None}}) + "\0").encode()
             yield (json.dumps(ret) + "\0").encode()

     async def generate(self, params):
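The motivation for the change: the `|` union operator on dicts was introduced in Python 3.9 (PEP 584), so `ret | {...}` raises a `TypeError` on Python 3.8, while `**`-unpacking inside a dict display has worked since Python 3.5 (PEP 448). A minimal sketch of the two styles, using a hypothetical `ret` payload (the real keys come from the worker's response dict):

```python
# Hypothetical payload standing in for the worker's `ret` dict.
ret = {"text": "hello", "error_code": 0}

# Python 3.9+ only (PEP 584); fails with TypeError on 3.8:
#   merged = ret | {"finish_reason": None}

# Works on Python 3.5+ (PEP 448), hence the fix in this commit;
# later keys win on collision, mirroring the `|` semantics:
merged = {**ret, **{"finish_reason": None}}
print(merged)  # {'text': 'hello', 'error_code': 0, 'finish_reason': None}
```

Since the right-hand side here is a literal, `{**ret, "finish_reason": None}` would be equivalent and slightly shorter; the committed form keeps the two-dict merge shape explicit.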
