I tried to push one of my large-ish repositories as a test (roughly 2 million files, about 8.5 TB). It made it about 2 TB in before the server (a Sun Fire X4140 with 12 CPU cores and 16 GB of RAM) ran out of memory and the push was killed by the kernel. Nothing else was running on the system, and the system logs show that all physical memory and swap were eaten up by python.
This was using the "add" mode.
I have a similar issue: trying to upload a 100 GB file.
Process Process-2:
Traceback (most recent call last):
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/home/ec2-user/s3-parallel-put", line 301, in putter
    content = value.get_content()
  File "/home/ec2-user/s3-parallel-put", line 109, in get_content
    self.content = file_object.read()
MemoryError
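The traceback points at get_content() calling file_object.read(), which buffers the entire file in each worker process before uploading, so memory use scales with file size (and with the number of putter processes). One way around the MemoryError is to stream each file to S3 in fixed-size parts instead of loading it whole. Below is a minimal sketch of that idea using boto3's managed multipart transfer; note that s3-parallel-put itself is written against the older boto library, and the path, bucket, key, and chunk size here are placeholders, not part of the script.

# Sketch: stream a large file to S3 without reading it fully into memory.
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3")

# 64 MB parts keep per-worker memory roughly constant regardless of file size.
config = TransferConfig(multipart_threshold=64 * 1024 * 1024,
                        multipart_chunksize=64 * 1024 * 1024)

def put_streaming(path, bucket, key):
    with open(path, "rb") as file_object:
        # upload_fileobj reads the file in chunks and issues a multipart
        # upload, so memory stays near multipart_chunksize, not the file size.
        s3.upload_fileobj(file_object, bucket, key, Config=config)

put_streaming("/data/huge.bin", "my-bucket", "huge.bin")  # placeholder names

With something like this, a 100 GB object goes up in ~64 MB parts instead of being read into a single in-memory string first.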