[Bugfix] Fix token padding for chameleon (vllm-project#6724)
Signed-off-by: Alvant <alvasian@yandex.ru>
ywang96 authored and Alvant committed Oct 26, 2024
1 parent c4d1c1b commit 5b1042f
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion vllm/model_executor/models/chameleon.py
@@ -125,7 +125,8 @@ def input_processor_for_chameleon(ctx: InputContext, llm_inputs: LLMInputs):

     # Appending sep token for chat mode to follow default processor
     # behavior
-    new_prompt += tokenizer.sep_token
+    if new_prompt is not None:
+        new_prompt += tokenizer.sep_token
     new_token_ids += [CHAMELEON_SEP_TOKEN_ID]

     # NOTE: Create a defensive copy of the original inputs
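The fix guards the string concatenation because `new_prompt` can be `None` when a request supplies only token IDs; appending the sep token to `None` would raise a `TypeError`, while the token-ID list still needs the sep token ID either way. A minimal standalone sketch of that guard (the `SEP_TOKEN` string and the token ID value here are placeholders, not the real constants from `chameleon.py`):

```python
# Hypothetical stand-ins for tokenizer.sep_token and CHAMELEON_SEP_TOKEN_ID.
SEP_TOKEN = "<sep>"
CHAMELEON_SEP_TOKEN_ID = 9999  # placeholder value


def append_sep(new_prompt, new_token_ids):
    """Append the sep token, tolerating a missing text prompt.

    new_prompt may be None when the caller provided token IDs only;
    the unguarded `new_prompt += SEP_TOKEN` would raise TypeError then.
    """
    if new_prompt is not None:
        new_prompt += SEP_TOKEN
    # The token-ID list is extended regardless of whether text exists.
    new_token_ids = new_token_ids + [CHAMELEON_SEP_TOKEN_ID]
    return new_prompt, new_token_ids
```

With a text prompt, both the string and the ID list grow; with `new_prompt=None`, only the ID list does, instead of crashing.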
