FIX Change check if past_key_values is empty
After transformers merged this PR:

huggingface/transformers#33703

the truthiness of past_key_values (a Cache instance) changed from False
to True in one of our checks. Use the get_seq_length() method instead,
which behaves consistently before and after that commit.

I ran the tests with the new change against transformers versions from
before and after that commit, and they passed, so this change should be
backwards compatible.
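The logic of the new check can be sketched with a toy stand-in for transformers.Cache (ToyCache below is hypothetical, not a real transformers class; it only mimics a Cache object whose __bool__ is always truthy, as after the upstream PR):

```python
class ToyCache:
    """Toy stand-in for transformers.Cache: truthy as an object even when empty."""

    def __init__(self, seq_length=0):
        self._seq_length = seq_length

    def get_seq_length(self):
        # Number of tokens already cached; 0 means the cache is empty.
        return self._seq_length


def requires_prompt_injection(past_key_values):
    # True when there is no cache at all, or the cache holds no tokens yet.
    # Checking get_seq_length() avoids relying on the object's truthiness.
    return past_key_values is None or (
        isinstance(past_key_values, ToyCache)
        and not past_key_values.get_seq_length()
    )


print(requires_prompt_injection(None))         # True: no cache at all
print(requires_prompt_injection(ToyCache(0)))  # True: truthy object, but empty cache
print(requires_prompt_injection(ToyCache(8)))  # False: tokens already cached
```

The old check (`not model_kwargs["past_key_values"]`) broke because it depended on the Cache object's truthiness, which the upstream PR changed; `get_seq_length()` asks the cache directly how many tokens it holds.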
BenjaminBossan committed Sep 27, 2024
1 parent ccc3501 commit 8777e8a
Showing 1 changed file with 1 addition and 1 deletion.
src/peft/peft_model.py:

@@ -1776,7 +1776,7 @@ def prepare_inputs_for_generation(self, *args, task_ids: Optional[torch.Tensor]

         # no past_key_values or past_key_values empty cache
         requires_prompt_injection = (model_kwargs["past_key_values"] is None) or (
-            isinstance(model_kwargs["past_key_values"], transformers.Cache) and not model_kwargs["past_key_values"]
+            isinstance(model_kwargs["past_key_values"], transformers.Cache) and not model_kwargs["past_key_values"].get_seq_length()
         )

         if requires_prompt_injection and peft_config.peft_type == PeftType.PREFIX_TUNING:
