Blob data size exceeds limit. #2041

Open
victorlcm opened this issue Sep 4, 2024 · 2 comments

Comments

@victorlcm

Hi, we’re experiencing this error when trying to set up a Postgres mirror to S3:

ERROR:  unable to submit job: "status: Unknown, message: \"unable to start PeerFlow workflow: Blob data size exceeds limit.\", details: [], metadata: MetadataMap { headers: {\"content-type\": \"application/grpc\"} }"

We’ve investigated and it seems to be Temporal’s blob size limit on payloads larger than 2 MB. We have a database with approximately 50k tables, so the table mapping is most likely what pushes the workflow input over that limit. Do you know if there’s a way around this? Maybe storing the mapping in an external location and accessing it from within the Temporal workflow?
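
For a rough sense of scale, here is a minimal Go sketch (the struct and field names are hypothetical, not PeerDB's actual types) that serializes a 50k-entry source/destination mapping and compares it to the 2 MB limit; with qualified names of this length it comes out to several megabytes:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Hypothetical mapping entry, loosely modeled on a source/destination table
// pair; the real PeerDB table mapping structure may differ.
type tableMapping struct {
	SourceTableIdentifier      string `json:"sourceTableIdentifier"`
	DestinationTableIdentifier string `json:"destinationTableIdentifier"`
}

func main() {
	const numTables = 50_000
	mappings := make([]tableMapping, 0, numTables)
	for i := 0; i < numTables; i++ {
		mappings = append(mappings, tableMapping{
			SourceTableIdentifier:      fmt.Sprintf("public.some_source_table_%05d", i),
			DestinationTableIdentifier: fmt.Sprintf("s3_prefix/some_destination_table_%05d", i),
		})
	}

	payload, err := json.Marshal(mappings)
	if err != nil {
		panic(err)
	}
	// Temporal rejects blobs above its default 2 MB payload limit, which is
	// what the "Blob data size exceeds limit" error reports.
	fmt.Printf("serialized table mapping: %.1f MB (limit: 2 MB)\n",
		float64(len(payload))/(1<<20))
}
```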

Thanks!

@iskakaushik
Contributor

@victorlcm -- this one is a tricky issue to solve and is definitely on the roadmap. I was thinking about using a reference for the table mapping and storing the actual blob in the catalog.
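
For what that could look like, below is a hedged sketch of such a claim-check approach, assuming the Go Temporal SDK and a hypothetical flow_table_mappings catalog table (the MappingStore type and the workflow/activity names are made up for illustration, not PeerDB's actual code). The idea is that only a small mapping id crosses Temporal payload boundaries, while the blob itself stays in the catalog and inside activities:

```go
package claimcheck

import (
	"context"
	"database/sql"
	"time"

	"go.temporal.io/sdk/workflow"
)

// MappingStore keeps the full table mapping blob in the catalog database;
// only its id ever travels through Temporal payloads.
type MappingStore struct {
	DB *sql.DB
}

func (s *MappingStore) Put(ctx context.Context, id string, blob []byte) error {
	_, err := s.DB.ExecContext(ctx,
		`INSERT INTO flow_table_mappings (id, mapping) VALUES ($1, $2)
		 ON CONFLICT (id) DO UPDATE SET mapping = EXCLUDED.mapping`, id, blob)
	return err
}

func (s *MappingStore) Get(ctx context.Context, id string) ([]byte, error) {
	var blob []byte
	err := s.DB.QueryRowContext(ctx,
		`SELECT mapping FROM flow_table_mappings WHERE id = $1`, id).Scan(&blob)
	return blob, err
}

// SetupMirrorActivity resolves the heavy blob inside the activity and uses it
// there, so the multi-megabyte mapping never crosses a Temporal payload
// boundary (workflow input, activity arguments, or activity results).
func (s *MappingStore) SetupMirrorActivity(ctx context.Context, mappingID string) error {
	blob, err := s.Get(ctx, mappingID)
	if err != nil {
		return err
	}
	_ = blob // ... parse the mapping and configure the mirror here ...
	return nil
}

// PeerFlowWorkflowSketch receives only the small mappingID as workflow input.
func PeerFlowWorkflowSketch(ctx workflow.Context, mappingID string) error {
	ao := workflow.ActivityOptions{StartToCloseTimeout: 5 * time.Minute}
	ctx = workflow.WithActivityOptions(ctx, ao)
	return workflow.ExecuteActivity(ctx, "SetupMirrorActivity", mappingID).Get(ctx, nil)
}
```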

As a temporary workaround, would you be able to use multiple mirrors or is that not an option?
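
If splitting is feasible, something along these lines could carve the 50k tables into per-mirror batches small enough to stay under the limit (the batch size and mirror naming are placeholders to tune, not anything PeerDB prescribes):

```go
package main

import "fmt"

// chunkTables splits a large table list into batches so that each mirror's
// table mapping stays well under Temporal's 2 MB payload limit.
func chunkTables(tables []string, batchSize int) [][]string {
	var batches [][]string
	for start := 0; start < len(tables); start += batchSize {
		end := start + batchSize
		if end > len(tables) {
			end = len(tables)
		}
		batches = append(batches, tables[start:end])
	}
	return batches
}

func main() {
	tables := make([]string, 50_000)
	for i := range tables {
		tables[i] = fmt.Sprintf("public.table_%05d", i)
	}
	for i, batch := range chunkTables(tables, 5_000) {
		// Each batch would become its own mirror, e.g. "mirror_s3_part_03".
		fmt.Printf("mirror_s3_part_%02d: %d tables\n", i, len(batch))
	}
}
```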

@victorlcm
Author

Thanks, @iskakaushik, we'll try splitting it into multiple mirrors and check if that works!
