Commit

align HF tokenization with genai chat scenario in sample test
pavel-esir committed Sep 25, 2024
1 parent 94eb788 commit 870158a
Showing 1 changed file with 1 addition and 1 deletion: .github/workflows/causal_lm_cpp.yml
@@ -665,7 +665,7 @@ jobs:
  output.write('question:\n')
  chat_history.append(gen_prompt(prompt))
  chat_prompt = tokenizer.apply_chat_template(chat_history, tokenize=False, add_generation_prompt=True)
- tokenized = tokenizer(chat_prompt, return_tensors='pt')
+ tokenized = tokenizer(chat_prompt, return_tensors='pt', add_special_tokens=True)
  answer = model.generate(**tokenized, max_length=1000, do_sample=False)
  answer_str = tokenizer.decode(answer[0, tokenized['input_ids'].numel():], skip_special_tokens=True)
  chat_history.append(gen_answer(answer_str))
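The changed line makes the `add_special_tokens` setting explicit when re-tokenizing a prompt that `apply_chat_template(tokenize=False, ...)` has already rendered as a string. The subtlety is that many chat templates already embed special tokens (such as BOS) in the rendered text, so whether the tokenizer adds another one affects which token IDs the model sees. Below is a minimal toy sketch — `toy_tokenize` and the `<s>` marker are illustrative stand-ins, not the real `transformers` API — showing how the flag can interact with a template that already carries a BOS token:

```python
BOS = "<s>"  # stand-in for a model's beginning-of-sequence token

def toy_tokenize(text, add_special_tokens=True):
    """Whitespace-split 'tokenizer' that, like HF tokenizers,
    optionally prepends a BOS special token to the sequence."""
    tokens = text.split()
    if add_special_tokens:
        tokens = [BOS] + tokens
    return tokens

# A prompt string as a chat template might render it (BOS already present):
chat_prompt = "<s> [INST] question [/INST]"

print(toy_tokenize(chat_prompt, add_special_tokens=True))
# BOS appears twice: once from the template, once from the tokenizer.

print(toy_tokenize(chat_prompt, add_special_tokens=False))
# BOS appears only once, from the template.
```

Whether the duplicated BOS is desired depends on the model and its template; the commit's point is that the HF-side tokenization in the test must match whatever the GenAI chat pipeline does, so the flag is pinned explicitly rather than left to the default.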
