413 Request Entity Too Large nginx/1.23.2 for Airbyte 0.40.15 & above versions #18822

Closed
CoCoStudyStudy opened this issue Nov 1, 2022 · 9 comments · Fixed by #20383
Labels
area/platform · community · team/prod-eng · type/bug

Comments

@CoCoStudyStudy

Environment

  • Airbyte version: 0.40.15 & 0.40.17
  • OS Version / Instance: macOS, AWS EC2
  • Deployment: Docker
  • Source Connector and version: MSSQL 0.4.22
  • Destination Connector and version: SNOWFLAKE 0.4.38
  • Step where error happened: Setup new connection

Current Behavior

We run Airbyte OSS on AWS EC2 and on macOS. We upgraded the EC2 instance to 0.40.15 and the macOS instance to 0.40.17. After the upgrade, both instances show the error below when building a connection: new streams can't be saved when updating a connection.
Error:

413 Request Entity Too Large
nginx/1.23.2

Expected Behavior

The issue didn't occur on the older version; it appeared once we upgraded to the recent version.
We expect new streams to be saved and connections to be built successfully.

Steps to Reproduce

1. On the Replication tab, click Refresh source schema.
2. Add a new stream.
3. Save changes.
4. The changes cannot be saved. The UI shows Error: non-json response, and the browser inspector shows Status Code: 413 Request Entity Too Large (a curl sketch for confirming the 413 follows this list).
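One way to confirm the failure comes from the request size rather than the UI is to POST an oversized body through the proxy and check the status code. A rough sketch, assuming the default local port 8000 and an illustrative endpoint path (both are assumptions, adjust them to your deployment):

 # Send a ~2 MB body through the proxy; a 413 response confirms the nginx body-size limit.
 # Port 8000 and the endpoint path are assumptions, not taken from this issue.
 head -c 2097152 /dev/zero | tr '\0' 'a' | \
   curl -s -o /dev/null -w "%{http_code}\n" \
     -X POST -H "Content-Type: application/json" --data-binary @- \
     http://localhost:8000/api/v1/web_backend/connections/update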

Finding
We looked into the proxy server and suspect the cause is the default request body size limit set by the nginx server.
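For reference, nginx rejects request bodies larger than client_max_body_size, and the limit defaults to 1 MB when the directive is not set. A minimal sketch of the directive (the 200M value is only an example):

 # Valid in the http, server, and location contexts; defaults to 1m when absent, 0 disables the check.
 client_max_body_size 200M;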

CoCoStudyStudy added the needs-triage and type/bug labels on Nov 1, 2022
natalyjazzviolin added the area/platform and team/prod-eng labels, and removed the needs-triage, team/tse, and autoteam labels, on Nov 2, 2022
@vincentkoc
Contributor

We hit the same issue with our nginx, and believe it can be resolved by increasing client_max_body_size in the nginx configuration.

@natalyjazzviolin
Contributor

@koconder did you apply the fix, and did it work for you?

@vincentkoc
Contributor

Not yet, but it's part of the default settings and should be replicated in the Kubernetes deployment as well:

client_max_body_size 200M;

@vincentkoc
Contributor

This has also been mentioned before in #4086.

@mikemlg

mikemlg commented Nov 23, 2022

So when will this issue be resolved? It happens whenever you have a source with many tables, such as Salesforce.

@oleg-savko

I have the same problem. I tried increasing the property in the following way, but it did not help:

 docker exec -it airbyte-webapp sh
 / # vi /etc/nginx/conf.d/default.conf
 # change:
 #   client_max_body_size 200M;
 # to:
 #   client_max_body_size 12000M;
 / # nginx -s reload

Any news about when or how it can be resolved?

@mijns

mijns commented Nov 30, 2022

This is because airbyte-proxy does not have the client_max_body_size set so it's defaulting to 1M and causing this problem. Confirmed by adding client_max_body_size 200M; to airbyte-proxy's /etc/nginx/nginx.conf.
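For anyone applying that fix by hand, a minimal sketch of where the directive can sit in airbyte-proxy's /etc/nginx/nginx.conf (the surrounding lines are illustrative, not the actual file contents):

 http {
     ...
     client_max_body_size 200M;   # overrides the 1M default for all servers in this block
     ...
 }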

@andresbravog
Contributor

Working on a PR; the issue is in airbyte-proxy, as @mijonmustard suggested.

In the meantime you can:

docker-compose exec airbyte-proxy bash
> apt-get update
> apt-get install -y vim
> vim /etc/nginx/templates/nginx-auth.conf.template
# Add `client_max_body_size 200M;` to every `location /` block
> vim /etc/nginx/templates/nginx-no-auth.conf.template
# Add `client_max_body_size 200M;` to every `location /` block
> exit

docker-compose restart airbyte-proxy
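For reference, after the edit each location block in those templates would look roughly like this (a sketch; the proxy_pass target and other directives are placeholders, not the real template contents):

 location / {
     client_max_body_size 200M;            # raises the 1M default for this location
     proxy_pass http://upstream_backend;   # placeholder upstream, not the real template value
 }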

@andresbravog
Contributor

PR provided
