
Jupyter PySpark notebook / Uploading "big" file fails silently #463

Closed
tkumpumaki opened this issue Feb 18, 2016 · 3 comments

@tkumpumaki

In the "Files" tab of the Jupyter PySpark notebook, uploading a 17 MB data file fails silently, while a 12 MB file uploads properly.

There may be some limit on file size, but failing without any notice isn't very informative.

@tourunen
Contributor

Thanks for reporting this! The same happens with RStudio, so this may be a size limit in our SSL-terminating proxy.

@tourunen tourunen added the bug label Feb 19, 2016
@tourunen tourunen self-assigned this Feb 22, 2016
@tourunen
Contributor

There is indeed a 20M limit in the frontend nginx config. Raising it to 128M makes RStudio uploads of up to at least 100 MB succeed. There is probably another limit in Jupyter itself: a 48 MB upload works, but 64 MB does not.

edit: Related issues: jupyter/notebook#623 jupyter/notebook#96
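For reference, the nginx limit in question is the `client_max_body_size` directive (where exactly it lives in our frontend config is an assumption); a minimal sketch of the change:

```nginx
# nginx rejects request bodies larger than client_max_body_size with
# "413 Request Entity Too Large" -- but some clients surface this as a
# silent failure. Default is 1m; our frontend had it at 20m.
http {
    client_max_body_size 128m;
}
```

The directive can also be set per `server` or `location` block; the `http`-level setting is shown here only for illustration.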

@tourunen
Contributor

tourunen commented Mar 4, 2016

With PR #468, the PB proxy now accepts 1G uploads. Large RStudio uploads now work with the latest master, but Jupyter Notebook still has the internal limits mentioned above. I'm closing this; reopen if necessary.
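For anyone hitting the Jupyter-internal limit later: newer Notebook releases expose these limits as config options (the trait names below assume notebook 5.1+, not the version this issue was filed against); a hedged sketch for `jupyter_notebook_config.py`:

```python
# jupyter_notebook_config.py -- raise Notebook's own upload limits.
# These traits were added in notebook 5.1; earlier versions hard-code
# smaller Tornado buffer sizes, which is the internal limit seen above.
c.NotebookApp.max_body_size = 1024 * 1024 * 1024    # 1 GB max HTTP body
c.NotebookApp.max_buffer_size = 1024 * 1024 * 1024  # 1 GB in-memory buffer
```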

@tourunen tourunen closed this as completed Mar 4, 2016