This repository has been archived by the owner on Sep 7, 2022. It is now read-only.

OutOfMemoryException upon sending huge files. #39

Open
ChetanBhasin opened this issue Nov 1, 2014 · 3 comments

Comments

@ChetanBhasin

Apart from the approach used in the examples, I have tried several other ways to send big files using the Smoke server, but there is no way to do it correctly.
This is probably because, when sending raw data, the function only accepts an array and not a stream.
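To illustrate why an array-only body runs out of memory: materializing a file as a single `Array[Byte]` allocates the whole file on the heap, whereas copying through a fixed-size buffer keeps memory use bounded. This is a generic `java.io` sketch, not Smoke's API; the `ChunkedCopy` name and the 8 KB default are illustrative assumptions.

```scala
import java.io.{InputStream, OutputStream}

object ChunkedCopy {
  // Copy an InputStream to an OutputStream in fixed-size chunks, so heap
  // usage stays at `chunkSize` bytes no matter how large the file is.
  // (Hypothetical helper, not part of Smoke.)
  def copy(in: InputStream, out: OutputStream, chunkSize: Int = 8192): Long = {
    val buffer = new Array[Byte](chunkSize)
    var total = 0L
    var read = in.read(buffer)
    while (read != -1) {
      out.write(buffer, 0, read)
      total += read
      read = in.read(buffer)
    }
    total
  }
}
```

With something like this, a file handler could pipe a `FileInputStream` straight to the socket's `OutputStream` instead of building one giant array first.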

@ghouet
Contributor

ghouet commented Nov 3, 2014

Smoke is not ready to stream big files yet. We have discussed it among ourselves, but it was never implemented. You are right, though: it needs to support file streams so that it does not use all the memory. Feel free to submit a pull request; that's a change we would be happy to integrate.

@chrisdinn
Contributor

The idea of a response with an OutputStreamWriter for the body is really attractive for this kind of use case. Right now, Smoke expects you to have the entire response ready when the Future completes.

As Gaetan said: we would love to see a PR for this.
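The idea above could be sketched as a response whose body is a callback that writes to an `OutputStream` once the connection is ready, instead of a fully materialized array. This is a hypothetical design sketch, not Smoke's actual response type; `StreamingResponse`, `writeBody`, and `fromFile` are all assumed names.

```scala
import java.io.{FileInputStream, OutputStream}

// Hypothetical streaming response: the body is a writer callback that the
// server invokes with the connection's OutputStream, so the whole payload
// never has to exist in memory at once.
final case class StreamingResponse(status: Int, writeBody: OutputStream => Unit)

object StreamingResponse {
  // Build a response that streams a file to the client in 8 KB chunks.
  def fromFile(path: String, status: Int = 200): StreamingResponse =
    StreamingResponse(status, { out =>
      val in = new FileInputStream(path)
      try {
        val buf = new Array[Byte](8192)
        var n = in.read(buf)
        while (n != -1) {
          out.write(buf, 0, n)
          n = in.read(buf)
        }
      } finally in.close()
    })
}
```

Under this design the `Future` can complete as soon as the callback is constructed; the actual file I/O happens lazily when the server drains the body.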

damienlevin pushed a commit to damienlevin/smoke that referenced this issue Apr 24, 2016
Brando will now notify when it cannot connect to redis
@harjitdotsingh

Has this been resolved?
