Here is an API response that results in a deserialization error:
{ "id": "...", "object": "vector_store.file", "usage_bytes": 0, "created_at": 1722298943, "vector_store_id": "...", "status": "failed", "last_error": { "code": "invalid_file", "message": "The file could not be parsed because it is too large." }, "chunking_strategy": { "type": "static", "static": { "max_chunk_size_tokens": 800, "chunk_overlap_tokens": 400 } } }
Thanks for the issue; it seems the API has new changes that are not yet documented.
Do you mind reporting this on openai/openai-openapi#300 too?
I've done as you suggested. Meanwhile, an interesting fact: I did a test run to check the fix in #252, and it came up with the following:
```
failures:

---- vector_store_files::tests::vector_store_file_creation_and_deletion stdout ----
Error: JSONDeserialize(Error("unknown variant `invalid_file`, expected one of `internal_error`, `file_not_found`, `parsing_error`, `unhandled_mime_type`", line: 9, column: 26))

failures:
    vector_store_files::tests::vector_store_file_creation_and_deletion
```
Though it happened on the first run only.
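For what it's worth, the usual shape of such a fix (a sketch under my own naming, not the actual patch shipped for this issue) is to add the new variant and, optionally, a `#[serde(other)]` catch-all so the next undocumented code degrades gracefully instead of failing the whole response:

```rust
use serde::Deserialize;

#[derive(Debug, Deserialize)]
#[serde(rename_all = "snake_case")]
enum LastErrorCode {
    InternalError,
    FileNotFound,
    ParsingError,
    UnhandledMimeType,
    InvalidFile, // the newly observed code
    #[serde(other)]
    Unknown, // catch-all: unknown future codes deserialize here instead of erroring
}

fn main() {
    let known: LastErrorCode = serde_json::from_str(r#""invalid_file""#).unwrap();
    println!("{known:?}"); // InvalidFile

    let future: LastErrorCode = serde_json::from_str(r#""some_future_code""#).unwrap();
    println!("{future:?}"); // Unknown
}
```

The catch-all trades type precision for resilience; whether that suits a client library is a design choice, since a silent `Unknown` can hide genuinely new error semantics.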
The fix is released in v0.24.0.