Fix llama sin_cached/cos_cached backward compatibility #29299
Closed
`_sin_cached` and `_cos_cached` are never set in the init (compare to https://github.com/huggingface/transformers/blob/v4.37.2/src/transformers/models/llama/modeling_llama.py#L134-L136), which breaks backward compatibility and yields errors in external packages (e.g. in https://github.com/AutoGPTQ/AutoGPTQ/blob/6b55300dd83326504ee6e02b730fa4451adfa479/auto_gptq/modeling/_utils.py#L95-L96).

IMO this should go in a patch release.
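For reference, a minimal sketch of the kind of fix being asked for: eagerly compute the cos/sin caches in `__init__` again and keep the old public attribute names readable. Class name, signature, and the property aliases here are illustrative assumptions, not the exact patch that shipped:

```python
import torch
import torch.nn as nn


class RotaryEmbedding(nn.Module):
    """Sketch of a LlamaRotaryEmbedding-like module that restores the caches at init time."""

    def __init__(self, dim, max_position_embeddings=2048, base=10000, device=None):
        super().__init__()
        inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2, dtype=torch.float32, device=device) / dim))
        self.register_buffer("inv_freq", inv_freq, persistent=False)

        # Eagerly populate the caches so external code that reads
        # `module.sin_cached` / `module.cos_cached` (e.g. AutoGPTQ) keeps working
        # even before the first forward() call.
        t = torch.arange(max_position_embeddings, dtype=torch.float32, device=device)
        freqs = torch.outer(t, inv_freq)
        emb = torch.cat((freqs, freqs), dim=-1)
        self.register_buffer("_cos_cached", emb.cos(), persistent=False)
        self.register_buffer("_sin_cached", emb.sin(), persistent=False)

    @property
    def sin_cached(self):
        # Public alias kept for backward compatibility (hypothetical name; a real
        # patch would likely emit a deprecation warning here).
        return self._sin_cached

    @property
    def cos_cached(self):
        return self._cos_cached
```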