_get_config

The cache kwarg `max_batch_size` has already been deprecated since transformers v4.46.1: https://github.com/huggingface/transformers/blob/v4.46.1/src/transformers/cache_utils.py#L319

This is related to this part of the faulty code:

parler-tts/parler_tts/modeling_parler_tts.py
Lines 3290 to 3297 in d108732

It shows up when testing this snippet: https://github.com/huggingface/parler-tts/blob/main/INFERENCE.md#compilation

The fix only needs to rename `max_batch_size` to `batch_size`; I can open the PR too if needed.
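A minimal sketch of the one-line rename proposed above. The variable names here are purely illustrative and not taken from `modeling_parler_tts.py`; the point is only that the value stays the same and just the kwarg key changes:

```python
# Illustrative stand-ins for the values parler-tts passes when building the cache.
model_config = object()
batch = 2
cache_len = 128

# Before the fix: the deprecated key is used (per the linked transformers code).
old_cache_kwargs = {
    "config": model_config,
    "max_batch_size": batch,
    "max_cache_len": cache_len,
}

# After the fix: rename the key, keep everything else unchanged.
new_cache_kwargs = {
    ("batch_size" if key == "max_batch_size" else key): value
    for key, value in old_cache_kwargs.items()
}

print(sorted(new_cache_kwargs))
```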