Hi, we’re experiencing this error when trying to set up a postgres mirror to s3:
ERROR: unable to submit job: "status: Unknown, message: \"unable to start PeerFlow workflow: Blob data size exceeds limit.\", details: [], metadata: MetadataMap { headers: {\"content-type\": \"application/grpc\"} }"
We’ve investigated, and it appears to be Temporal’s blob size limit, which rejects payloads larger than 2 MB. Our database has approximately 50k tables, so that is likely what pushes the payload over the limit. Do you know if there’s a way around this? Perhaps storing the table mapping in an external location and accessing it from the Temporal workflow?
Thanks!
@victorlcm -- this one is a tricky issue to solve and is definitely on the roadmap. I was thinking of using a reference for the table mapping and storing the actual blob in the catalog.
As a temporary workaround, would you be able to use multiple mirrors or is that not an option?
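For what it's worth, here is a minimal sketch (in Go) of the reference idea, not the actual PeerDB implementation: the full table mapping is persisted in a hypothetical catalog table `table_mapping_blobs`, and only its row ID travels in the workflow input, so the Temporal payload stays far below the 2 MB blob limit. The helper names `storeTableMappings`/`loadTableMappings` and the schema are assumptions for illustration.

```go
// Package catalogref sketches a "store a reference, not the blob" workaround
// for Temporal's ~2 MB payload limit. Hypothetical, not PeerDB's real code.
package catalogref

import (
	"context"
	"database/sql"
	"encoding/json"

	_ "github.com/lib/pq" // Postgres driver for the catalog connection
)

// TableMapping is a placeholder for the per-table mirror configuration.
type TableMapping struct {
	SourceTable      string `json:"source_table"`
	DestinationTable string `json:"destination_table"`
}

// storeTableMappings persists the full mapping list in the catalog and returns
// the row ID; only this ID would be placed in the Temporal workflow input.
// Assumed schema:
//   CREATE TABLE table_mapping_blobs (id BIGSERIAL PRIMARY KEY, mapping JSONB NOT NULL);
func storeTableMappings(ctx context.Context, db *sql.DB, mappings []TableMapping) (int64, error) {
	blob, err := json.Marshal(mappings)
	if err != nil {
		return 0, err
	}
	var id int64
	err = db.QueryRowContext(ctx,
		`INSERT INTO table_mapping_blobs (mapping) VALUES ($1) RETURNING id`, blob).Scan(&id)
	return id, err
}

// loadTableMappings is what a workflow activity would call to resolve the
// reference back into the full mapping list.
func loadTableMappings(ctx context.Context, db *sql.DB, id int64) ([]TableMapping, error) {
	var blob []byte
	if err := db.QueryRowContext(ctx,
		`SELECT mapping FROM table_mapping_blobs WHERE id = $1`, id).Scan(&blob); err != nil {
		return nil, err
	}
	var mappings []TableMapping
	err := json.Unmarshal(blob, &mappings)
	return mappings, err
}
```

A similar effect could probably also be achieved at the SDK level with a custom Temporal payload codec that offloads large payloads to external storage (the "claim check" pattern), but keeping the blob in the existing Postgres catalog avoids adding another storage dependency.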