Unbounded memory usage #46

Open

decimalator opened this issue Aug 30, 2017 · 2 comments
@decimalator

Tried to push one of my large-ish repositories to test (2 million-ish files, about 8.5 TB). It made it about 2 TB in before the server (a Sun Fire X4140 w/ 12 CPU cores and 16 GB of RAM) ran out of memory and the push was killed by the kernel. Nothing else was running on the system, and system logs show that all physical memory and swap were eaten up by Python.

This was using the "add" mode.
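
For the many-small-files case above, one plausible mechanism (an assumption; the script's internals aren't quoted here) is that a producer process queues file contents faster than the putter processes drain them, so an unbounded multiprocessing queue grows until the kernel kills the job. A minimal sketch of the usual mitigation, a bounded queue that applies backpressure to the producer; `upload` here is a hypothetical placeholder, not the script's actual function:

```python
from multiprocessing import JoinableQueue

# A bounded queue blocks the producer once `maxsize` items are
# queued, so peak memory stays near maxsize * average item size
# instead of growing without limit.
MAX_IN_FLIGHT = 100
queue = JoinableQueue(maxsize=MAX_IN_FLIGHT)

def walker(paths):
    for path in paths:
        queue.put(path)      # blocks while the queue is full

def putter(upload):
    while True:
        path = queue.get()
        try:
            upload(path)     # hypothetical upload callable
        finally:
            queue.task_done()
```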

@mishudark (Owner)

Hi, do you have any output?

@stevemac007

I have a similar issue: trying to upload a 100 GB file.

Process Process-2:
Traceback (most recent call last):
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/home/ec2-user/s3-parallel-put", line 301, in putter
    content = value.get_content()
  File "/home/ec2-user/s3-parallel-put", line 109, in get_content
    self.content = file_object.read()
MemoryError
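
The traceback shows the cause directly: `get_content()` calls `file_object.read()`, which tries to buffer the entire 100 GB file in RAM before uploading it. A minimal sketch of the usual fix, streaming the file in fixed-size parts via an S3 multipart upload; this assumes the boto 2.x library (which the script appears to depend on), and the part size, bucket, and key names are placeholders:

```python
from io import BytesIO

import boto

PART_SIZE = 50 * 1024 * 1024  # 50 MB per part; S3 requires >= 5 MB except for the last part

def put_streaming(bucket, key_name, path):
    """Upload `path` while holding at most one part in memory at a time."""
    mp = bucket.initiate_multipart_upload(key_name)
    try:
        with open(path, 'rb') as f:
            part_num = 0
            while True:
                chunk = f.read(PART_SIZE)   # at most PART_SIZE bytes in RAM
                if not chunk:
                    break
                part_num += 1               # S3 part numbers start at 1
                mp.upload_part_from_file(BytesIO(chunk), part_num)
        mp.complete_upload()
    except Exception:
        mp.cancel_upload()                  # don't leave orphaned parts billed in S3
        raise

# Usage (bucket and paths are placeholders):
# conn = boto.connect_s3()
# bucket = conn.get_bucket('my-bucket')
# put_streaming(bucket, 'backups/huge.bin', '/data/huge.bin')
```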
