Errors using Llama models via the bedrock endpoint #4841
Unanswered
cmurtaugh asked this question in Troubleshooting
Replies: 0
I'm trying to use a Llama model via Bedrock but keep getting an error.
I have this in my librechat.yaml:
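Roughly like the following (a minimal sketch of the shape described here; the region, key names other than `maxContextTokens`, and the token value are illustrative placeholders, not the exact original config):

```yaml
# Illustrative sketch only -- reconstructing the flat variant described
# in the question. Values and surrounding keys are assumed placeholders.
endpoints:
  bedrock:
    availableRegions:
      - "us-east-1"
    maxContextTokens: 8192
```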
but I continue to get the same error.
If I nest `maxContextTokens` under `model_options` like below (as the docs appear to direct), then the `maxContextTokens` setting seems to be ignored entirely. How is this supposed to work?
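The nested form referenced above would look roughly like this (again a sketch with the same placeholder values; whether `model_options` actually honors `maxContextTokens` here is exactly what's in question):

```yaml
# Illustrative sketch of the nested variant from the question.
# Only the nesting of maxContextTokens differs from the first sketch.
endpoints:
  bedrock:
    availableRegions:
      - "us-east-1"
    model_options:
      maxContextTokens: 8192
```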