Describe the bug
I'm getting `This model's maximum context length is 8191 tokens, but the given text is 56273 tokens long.`
To Reproduce
Steps to reproduce the behavior:
Load a URL, for example https://vadb.org/scenes/cordoba, using a Pinecone vector store.
The chunk produced is larger than the accepted size; the splitter warns and the embedding call then fails:
Created a chunk of size 210763, which is longer than the specified 1000
[ActiveJob] [Goai::WebsiteProcessorJob] [dd6a9549-dd34-4430-a510-fc2d4a2208f2] This model's maximum context length is 8191 tokens, but the given text is 56273 tokens long.
Expected behavior
The text should be split into chunks small enough to fit within the embedding model's 8191-token context limit.
Terminal commands & output
WARN -- : Created a chunk of size 210763, which is longer than the specified 1000
[ActiveJob] This model's maximum context length is 8191 tokens, but the given text is 56273 tokens long.
RuntimeError (An error occurred: This model's maximum context length is 8191 tokens, but the given text is 56273 tokens long.):
Desktop (please complete the following information):
OS: [e.g. OS X, Linux, Ubuntu, Windows]
Ruby version: 3.2
Langchain.rb version: 0.13.0
I got the same issue loading a text file with the following code, on Ruby 3.1 and gem version 0.14.0:
client.add_data(paths: [my_long_file])
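As a workaround while the library's splitter overshoots, the text can be pre-split into bounded chunks before being handed to the vector store. The sketch below is not Langchain.rb's API, just plain Ruby; `chunk_size: 1000` mirrors the splitter setting from the warning above, and the overlap value is an illustrative choice.

```ruby
# Workaround sketch (plain Ruby, not the library's chunker): split long
# text into overlapping character-based chunks so no single piece can
# exceed the embedding model's context window.
def split_into_chunks(text, chunk_size: 1000, overlap: 100)
  chunks = []
  step = chunk_size - overlap
  (0...text.length).step(step) do |start|
    # String#[] with (start, length) never reads past the end,
    # so the final chunk is simply shorter.
    chunks << text[start, chunk_size]
  end
  chunks
end

# Each chunk can then be embedded and upserted individually instead of
# passing the whole document to the store in one call.
```

Character counts only approximate tokens, so in practice the chunk size would need to stay well below the 8191-token limit (roughly 4 characters per token for English text).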