
Commit

vllm: fix rolling prefix_token
baberabb committed Aug 29, 2024
1 parent 1146cf2 commit f2f7fa8
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion in lm_eval/models/vllm_causallms.py

@@ -289,7 +289,7 @@ def loglikelihood_rolling(
                 make_disjoint_window,
                 get_rolling_token_windows(
                     token_list=self.tok_encode(string),
-                    prefix_token=self.eot_token_id,
+                    prefix_token=self.prefix_token_id,
                     max_seq_len=self.max_length - 1,
                     context_len=1,
                 ),
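The fix seeds the first rolling window with the model's designated prefix token (typically BOS) instead of the end-of-text token, so the first real token is conditioned on the token the model actually expects at sequence start. A minimal sketch of how rolling (context, continuation) windows can be built, assuming a hypothetical `rolling_windows` helper; this is illustrative only and is not lm_eval's actual `get_rolling_token_windows` implementation:

```python
def rolling_windows(tokens, prefix_token, max_seq_len, context_len=1):
    """Split a long token list into (context, continuation) windows so that
    every token is scored exactly once. The first window's context is just
    [prefix_token]; later windows carry over `context_len` preceding tokens.
    Illustrative sketch only, not the lm_eval implementation."""
    windows = []
    pred_start = 0
    while pred_start < len(tokens):
        pred_end = min(pred_start + max_seq_len, len(tokens))
        if pred_start == 0:
            # Seed the first window with the prefix token (e.g. BOS) --
            # this is exactly the token this commit changes.
            context = [prefix_token]
        else:
            context = tokens[pred_start - context_len : pred_start]
        windows.append((context, tokens[pred_start:pred_end]))
        pred_start = pred_end
    return windows
```

With `eot_token_id`, a model whose tokenizer defines a distinct BOS token would have scored its first window conditioned on the wrong seed; `prefix_token_id` lets the model class pick the appropriate one.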
