413 when creating connection #45423
Comments
Hello @jeremy-thomas-roc, can you share the complete error message/payload?
Can you check the Network tab in Developer Tools? It can provide a more complete response for the request.
Same here when updating a Google Analytics connection with multiple property_ids. In the request payload, my syncCatalog -> streams contains 1481 streams. I think it's because each property id creates its own streams (cf. #42464).
I have the same issue. The error occurs if the source DB contains many tables, 1123 in my case. This issue has been reported many times. I can connect to the Docker container via
But I'm not sure how to edit and reload the nginx config. I see nginx in the process list,
but neither the config nor the binary exists at that path. It looks like it's running inside some container environment (a container inside a container!). Does anyone know how to restart nginx in the Airbyte container?
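For reference, a rough sketch of how nginx is usually inspected and reloaded inside a plain Docker container; the container name is a placeholder, and this may not map directly onto the nested (kind-in-Docker) layout described above:

docker ps                             # find the container that runs nginx
docker exec -it CONTAINER_NAME sh     # replace CONTAINER_NAME with that container's name
nginx -s reload                       # signal the running master process to reload its config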
I've solved this problem by reinstalling Airbyte with docker-compose: https://docs.airbyte.com/deploying-airbyte/docker-compose
Do you mean it worked out of the box, or that you were then able to edit nginx.conf to get it to work correctly?
It worked out of the box. I didn't do any additional steps.
Adding an annotation to the ingress fixed this for me. Add the following under metadata (50m = 50 MB; if you need more, use a higher number):

metadata:
  annotations:
    nginx.ingress.kubernetes.io/proxy-body-size: "50m"

It should work almost immediately. This is just a temporary fix though; I don't know if it's the best method, but it works for me.
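If you would rather not edit the manifest by hand, a hedged sketch of applying the same annotation with kubectl; the namespace and ingress name below are placeholders for whatever your cluster actually uses:

kubectl get ingress -A                                  # list ingresses to find the one in front of Airbyte
kubectl -n NAMESPACE annotate ingress INGRESS_NAME \
  nginx.ingress.kubernetes.io/proxy-body-size=50m --overwrite

The --overwrite flag lets kubectl update the annotation if it is already present.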
@japerry911 your method is the correct one. Airbyte has also adjusted these limits in the closed PR above.
What happened?
When creating a connection between a Postgres source and a Snowflake destination, I get a 413 error. I know this is not an adapter issue, because I have configured the same type of connection in this instance without issues. The only difference between the two connections is that I have added a replication slot and publication to the Postgres source.
I have also edited the nginx.conf file at /etc/nginx to include the following line in the http section, which did not change anything, even after restarting nginx:

client_max_body_size 0;
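As an aside (not part of the original report), one way to check whether an edited directive is actually part of the configuration nginx is running with, assuming shell access to the container where the nginx process lives:

nginx -t                                 # check the edited config for syntax errors
nginx -T | grep client_max_body_size     # dump the full effective config and confirm the directive is present

Note that client_max_body_size only affects the nginx instance whose config was edited; if another proxy, such as the Kubernetes ingress controller discussed in the comments above, sits in front of it, that proxy's own body-size limit still applies.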
What did you expect to happen?
The connection to be created successfully
Abctl Version
Docker Version
OS Version