feat(max tokens): increase max tokens from 6k to 15k and add to config #29
What was done?
This pull request introduces the ability for users to configure the maximum tokens for LLM input, increasing the limit from 6000 to 15000. This enhancement allows for greater flexibility in handling larger inputs for language model processing.
How was it done?
- Added a new `INPUT_MAX_TOKENS` configuration option, which defaults to 15000.
- Updated the `set_user_config` method to allow integer values to be set in the configuration.
- Updated the `get_pr_data` function to check against the new maximum token limit when processing input.

How was it tested?
- Verified that `INPUT_MAX_TOKENS` can be set and retrieved correctly.
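The changes above could be sketched roughly as follows. This is an illustrative sketch only: apart from `INPUT_MAX_TOKENS` and the function names mentioned in the description, the signatures, config structure, and error handling are assumptions, not the project's actual implementation.

```python
import os

# Assumed default matching the new limit described in this PR.
DEFAULT_MAX_TOKENS = 15000

def get_max_tokens(config: dict) -> int:
    # Prefer an explicit config value, then the INPUT_MAX_TOKENS
    # environment variable, then the default.
    value = config.get("max_tokens", os.environ.get("INPUT_MAX_TOKENS", DEFAULT_MAX_TOKENS))
    return int(value)

def set_user_config(config: dict, key: str, value) -> None:
    # Now accepts integers as well as strings (hypothetical check).
    if isinstance(value, (int, str)):
        config[key] = value
    else:
        raise TypeError(f"Unsupported config value type: {type(value).__name__}")

config = {}
set_user_config(config, "max_tokens", 15000)
print(get_max_tokens(config))  # 15000
```

A token-count check in `get_pr_data` would then compare the input length against `get_max_tokens(config)` and truncate or reject input that exceeds it.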