Gotcha: Max 100 items per batch!?! #1057
FYI: While testing, I was able to create ~8,400 (mostly empty) documents with a single call to a stored procedure on a collection with only 400 RU/s allocated. Results are variable, of course.
@abhijitpai Do we have the differences between stored procedures and batch documented somewhere?
We have the transactional batch limit documented now at https://docs.microsoft.com/en-us/azure/cosmos-db/concepts-limits#per-request-limits. We will look at increasing this over time, but as discussed in #1059, the limit allows for predictability.
I tried to create 101 documents in a batch and received a 400 (Bad Request) result from the server with the error: "Batch request has more operations than what is supported."
Creating 100 documents works fine. Clearly, there is a limit of 100 operations per batch.
This is consistent with the Table storage API. But ...really... who's ever heard of that, even :D
There's no documentation I could find anywhere that says the SQL API has the same limitation. The only hint I could find was this issue over here: Azure/azure-documentdb-datamigrationtool#39
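In the meantime, a minimal workaround sketch (not from the thread, and independent of any particular SDK) is to split the operations client-side into chunks of at most 100 before submitting each chunk as its own batch. Note that each chunk then becomes a separate transaction, so atomicity across the whole set is lost:

```python
# Assumed limit, per the per-request-limits docs referenced above.
MAX_OPERATIONS_PER_BATCH = 100

def chunk_operations(operations, batch_size=MAX_OPERATIONS_PER_BATCH):
    """Split a list of batch operations into service-acceptable chunks.

    Each returned chunk would be submitted as its own transactional
    batch; atomicity only holds within a chunk, not across chunks.
    """
    return [operations[i:i + batch_size]
            for i in range(0, len(operations), batch_size)]

# 101 operations would be rejected as a single batch, so split them:
batches = chunk_operations(list(range(101)))
print([len(b) for b in batches])  # two batches: 100 operations, then 1
```

If you need all-or-nothing semantics across more than 100 documents, a stored procedure (as mentioned above) is currently the only option, since it runs server-side in a single transaction.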
I suggest:
...I also suggest removing the 100-document and 2 MB payload limits altogether, but... ...yeah, good luck with that :D