resuming mc cp session seems to go slower (and use less parallelism) than original cp #2259
@varung checking back to see if you saw our previous comment?
Hi,
I thought that when you use -r, mc cp *would* upload in parallel. I
certainly saw multiple processes, but maybe I was imagining that?
If it doesn't, I would really like it if it did, so I don't have to wrap it
in some find | xargs -P pipeline, or just use the Go library myself.
Finally, having to manually resume sessions is really annoying. It would
be nice if that could somehow be avoided. My goal when I run mc cp -r is
just to get my directory into S3 as fast and reliably as possible.
On Thu, Sep 28, 2017 at 11:59 AM Harshavardhana wrote:
> mc cp doesn't upload in parallel, I am not sure what you are implying.
> Also I am not able to reproduce this on my end. Can you provide us more
> details, like mc version, operating system, etc.?
> @varung checking back to see if you saw our previous comment?
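The find | xargs -P wrapping mentioned above can be sketched as follows. This is a hypothetical workaround, not mc's own mechanism: the alias "s3", bucket "mybucket", and directory "./mydir" are placeholders, and the alias is assumed to have been configured beforehand with mc config.

```shell
# Sketch: run up to 8 single-file `mc cp` uploads concurrently.
# "s3/mybucket" and "./mydir" are placeholder names; adjust to your setup.
find ./mydir -type f -print0 \
  | xargs -0 -P 8 -I{} mc cp "{}" "s3/mybucket/{}"
```

One trade-off of this approach: each file becomes an independent mc invocation, so an interrupted run has no single session to resume; rerunning the pipeline simply retries every file.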
-r being recursive upload.
Closing this as fixed in master and released. @varung feel free to re-open if you have further issues.
I initiated an mc cp --recursive.
Copying went at 50 MB/s, and top revealed many mc processes. Great!
The copy then failed due to a server 504, so I ran:
mc session resume {ID}
The copy now goes slowly (about 7 MB/s), and top/ps reveals only one mc session resume process.
This seems surprising.