
Improve default max length logic for text generation #364

Closed
davidmezzetti opened this issue Oct 17, 2022 · 1 comment

@davidmezzetti (Member)

Make the following improvements to how the max length parameter is derived for text generation.

  • Use model.config.max_length if it's not the default value
  • Else use tokenizer.model_max_length if available

This logic should be added to the text generation pipeline. A sketch of the intended fallback is shown below.
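
A minimal sketch of that fallback, assuming the transformers library default `max_length` of 20. The function name and structure here are illustrative, not txtai's actual implementation:

```python
from transformers import AutoConfig, AutoTokenizer

# transformers has historically defaulted max_length to 20
DEFAULT_MAX_LENGTH = 20

def maxlength(path):
    """
    Derives a max length value for text generation.

    Prefers model.config.max_length when it differs from the library
    default, otherwise falls back to tokenizer.model_max_length.
    """
    config = AutoConfig.from_pretrained(path)
    tokenizer = AutoTokenizer.from_pretrained(path)

    # Use the model config value when it's not the library default
    maxlength = getattr(config, "max_length", None)
    if maxlength and maxlength != DEFAULT_MAX_LENGTH:
        return maxlength

    # Fall back to the tokenizer limit when available. Tokenizers without
    # a real limit report a very large sentinel value, so ignore those.
    maxlength = getattr(tokenizer, "model_max_length", None)
    return maxlength if maxlength and maxlength < int(1e30) else None
```

Checking against the default avoids capping generations at 20 tokens for models that never set `max_length` explicitly.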

@davidmezzetti davidmezzetti added this to the v5.1.0 milestone Oct 17, 2022
@davidmezzetti davidmezzetti self-assigned this Oct 17, 2022
@davidmezzetti (Member, Author)

This logic should be revisited when transformers==5.0.0 is released.

How to derive model_max_length/max_length/max_new_tokens is an open discussion item and will likely lead to txtai changes with transformers v5. The sketch below shows why the distinction matters.
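
For reference, the two generation parameters bound different things. A minimal sketch using gpt2 as a stand-in model (the model choice and prompt are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("txtai is an all-in-one", return_tensors="pt")

# max_length caps prompt + generated tokens combined
total = model.generate(**inputs, max_length=32)

# max_new_tokens caps only the generated tokens, independent of prompt size
new = model.generate(**inputs, max_new_tokens=32)
```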
