Make the following improvements to how the max length field is derived for text generation.

This logic should be added.
cd43c47
This logic should be revisited when transformers==5.0.0 is released.
How to derive model_max_length/max_length/max_new_tokens is an open discussion item and will likely lead to txtai changes with transformers v5.
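One possible shape of that derivation, as a minimal sketch only (this is not txtai's actual implementation; the function name, parameters, and fallback default are assumptions): prefer an explicitly requested length, otherwise use the tokenizer's `model_max_length` when it is a real value rather than the very large sentinel transformers uses when the config leaves it unset, otherwise fall back to a default.

```python
# Hypothetical sketch of deriving a max length for text generation.
# Not txtai's actual logic; names and the 512 default are assumptions.

# transformers sets tokenizer.model_max_length to a very large sentinel
# (int(1e30)) when the tokenizer config does not define a real limit.
SENTINEL = int(1e30)

def derive_maxlength(model_max_length, requested=None, default=512):
    """Resolve a usable max length from tokenizer/config values."""
    # 1. An explicitly requested length always wins
    if requested:
        return requested

    # 2. Use the tokenizer's limit only if it is a real value,
    #    not the unset sentinel
    if model_max_length and model_max_length < SENTINEL:
        return model_max_length

    # 3. Fall back to a conservative default
    return default
```

With a derived total length, `max_new_tokens` could then be computed by subtracting the prompt's token count, which is one of the choices the transformers v5 discussion would affect.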
davidmezzetti