Upload Memory Error #11

Closed
Ryanb58 opened this issue Aug 16, 2016 · 3 comments

Comments

@Ryanb58
Contributor

Ryanb58 commented Aug 16, 2016

When trying to upload a file larger than 2147483647 bytes (more than ssl can write in a single call), I get an OverflowError.

>>> import smartfile
>>> api = smartfile.BasicClient('**********', '*****************')
>>> file = open('Downloads/Win10_1511_2_English_x64.iso', 'rb')
>>> api.upload('test.io', file)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/site-packages/smartfile/__init__.py", line 145, in upload
    return self.post('/path/data/', file=arg)
  File "/usr/lib/python2.7/site-packages/smartfile/__init__.py", line 130, in post
    return self._request('post', endpoint, id=id, data=kwargs)
  File "/usr/lib/python2.7/site-packages/smartfile/__init__.py", line 109, in _request
    return self._do_request(request, url, **kwargs)
  File "/usr/lib/python2.7/site-packages/smartfile/__init__.py", line 206, in _do_request
    return super(BasicClient, self)._do_request(*args, **kwargs)
  File "/usr/lib/python2.7/site-packages/smartfile/__init__.py", line 52, in _do_request
    response = request(url, stream=True, **kwargs)
  File "/usr/lib/python2.7/site-packages/requests/api.py", line 111, in post
    return request('post', url, data=data, json=json, **kwargs)
  File "/usr/lib/python2.7/site-packages/requests/api.py", line 57, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 475, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 585, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 403, in send
    timeout=timeout
  File "/usr/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 578, in urlopen
    chunked=chunked)
  File "/usr/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 362, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib64/python2.7/httplib.py", line 1057, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib64/python2.7/httplib.py", line 1097, in _send_request
    self.endheaders(body)
  File "/usr/lib64/python2.7/httplib.py", line 1053, in endheaders
    self._send_output(message_body)
  File "/usr/lib64/python2.7/httplib.py", line 897, in _send_output
    self.send(msg)
  File "/usr/lib64/python2.7/httplib.py", line 873, in send
    self.sock.sendall(data)
  File "/usr/lib64/python2.7/ssl.py", line 721, in sendall
    v = self.send(data[count:])
  File "/usr/lib64/python2.7/ssl.py", line 687, in send
    v = self._sslobj.write(data)
OverflowError: string longer than 2147483647 bytes

I believe moving towards chunked uploading would be a good way to get past this problem.

@btimby
Contributor

btimby commented Aug 16, 2016

Chunked uploading is not the best answer here. Requests can stream large files in both directions; THAT is the answer.

http://docs.python-requests.org/en/master/user/advanced/#streaming-uploads
http://stackoverflow.com/questions/16694907/how-to-download-large-file-in-python-with-requests-py
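A minimal sketch of what that looks like with requests, not SmartFile's actual API (the endpoint URL and credentials below are placeholders): when data is given a file-like object, requests streams it from disk rather than reading the whole file into memory, so the 2 GiB ssl write limit is never hit.

    # Sketch only: streaming upload with requests, placeholder endpoint/credentials.
    import requests

    with open('Downloads/Win10_1511_2_English_x64.iso', 'rb') as f:
        response = requests.post(
            'https://example.com/upload',       # placeholder endpoint
            data=f,                             # file object is streamed, not read into memory
            auth=('api_key', 'api_password'),   # placeholder credentials
        )
        response.raise_for_status()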

@Ryanb58
Contributor Author

Ryanb58 commented Aug 18, 2016

I like @btimby's answer, although I am curious to know whether SmartFile's API accepts streamed uploads. We should investigate this further before trying to implement it.

@travcunn
Contributor

travcunn commented Aug 18, 2016

The idea of a streamed upload is that the file isn't read into memory on the client side before it is uploaded. Instead, it's read in small chunks. On the server side, it looks like one big upload.
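For illustration, this is roughly what the chunked read looks like if you hand requests a generator; the helper name, chunk size, and URL are made up for the example, and requests will send a generator body with chunked transfer encoding so the full file never sits in client memory at once.

    # Illustrative sketch: read the file in small chunks and stream them.
    import requests

    def read_in_chunks(file_obj, chunk_size=1024 * 1024):  # hypothetical helper
        while True:
            chunk = file_obj.read(chunk_size)
            if not chunk:
                break
            yield chunk

    with open('Downloads/Win10_1511_2_English_x64.iso', 'rb') as f:
        requests.post('https://example.com/upload', data=read_in_chunks(f))  # placeholder URL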
