Unknown Error #15
Hi Ken,

I'm hoping to use AzureCopy to move a fairly large volume of data from S3 to Azure Blobs. I haven't been able to get it to work; I run into this problem when it tries to copy the first file:

It seems like it manages to get the first 12MB across. Any idea what the problem might be here? Too long a filename? Too big a file?

Thanks,
--Alex
Hi. Hmmm, it's been used to copy pretty large files (and also lots of files). Can you give me some stats on the number and size of the files? Also, can you copy/paste the exact command you used? It was clipped off from the screen capture. Thanks, Ken
Hi @kpfaulkner, thanks for getting back to me so promptly! Sorry for clipping the image. Here's a better one. I'm trying to move something on the order of 30TB. The maximum file size is 200GB, across a couple thousand files, I think.
Thanks for the updated screenshot. OK, a couple of things I'd suggest. Firstly, add the -d flag. This will tell it to cache the file locally instead of in memory (that's the part which is currently blowing up for you). If that works, great. But another thing I'd suggest (and maybe try this first) is to use the -blobcopy flag. This will copy directly from S3 to Azure without having to go via your machine, so it won't use any of your bandwidth and will probably be a lot quicker for that volume of data. Please let me know how you get along with it. Cheers, Ken
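A sketch of the two suggested invocations, assuming the -i and -o source/destination flags from the azurecopy README; the URLs are placeholders, and the exact -d syntax may differ from what's shown here, so check the tool's own usage output:

```
rem Option to try first: server-side copy, S3 -> Azure directly, no local bandwidth used
azurecopy.exe -i https://s3.amazonaws.com/mybucket/ -o https://myaccount.blob.core.windows.net/mycontainer/ -blobcopy

rem Alternative: route through this machine, caching each file on disk (-d) instead of in memory
azurecopy.exe -i https://s3.amazonaws.com/mybucket/ -o https://myaccount.blob.core.windows.net/mycontainer/ -d
```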
Hi @kpfaulkner
Correct, it's either-or. Both would work, but I'd try the -blobcopy flag first.
Hi @kpfaulkner! Progress was being made: I think I transferred some 300MB this time before it crashed, with the following error. This looks like maybe it reached the max retries? What do you think?
Hmmm, can you give me more of that screenshot (including params)? Also, can you add in the -db flag (if not already)? Thanks, Ken
Hi @kpfaulkner. Above what's shown in the image is just pages and pages of the same error output. That said, the command that was run was this one (found by using the up arrow to show the last command run):
Any ideas? It looks like AzureCopy started transferring every single file at once as opposed to doing them sequentially. Maybe either end of the transfer freaked out when AzureCopy started simultaneously moving a couple thousand files? @kpfaulkner
Hi. I've attached a custom build with some extra debugging (and increased timeouts). The exception you're seeing is in Microsoft's code and I can't figure out (yet) what's causing it, but I'll be interested to see what happens with this new version. Please let me know how it goes.
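For context on what "increased timeouts" typically means here: azurecopy is a .NET tool built on Microsoft's Azure Storage client library, which exposes per-request timeout and retry settings. A minimal sketch of those knobs using the classic Microsoft.WindowsAzure.Storage API; the account, container, blob names, and values are placeholders, and the thread doesn't show the exact settings in this debug build:

```csharp
using System;
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.RetryPolicies;

class TimeoutSketch
{
    static void Main()
    {
        // Placeholder connection string; substitute a real one.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING"));
        CloudBlobContainer container =
            account.CreateCloudBlobClient().GetContainerReference("mycontainer");
        CloudBlockBlob blob = container.GetBlockBlobReference("bigfile.bin");

        // Raised timeouts plus an exponential retry policy: the kind of
        // settings a debug build might increase to ride out flaky connections.
        BlobRequestOptions options = new BlobRequestOptions
        {
            ServerTimeout = TimeSpan.FromMinutes(30),     // per-request server timeout
            MaximumExecutionTime = TimeSpan.FromHours(4), // overall client-side cap
            RetryPolicy = new ExponentialRetry(TimeSpan.FromSeconds(5), 10) // backoff, max attempts
        };

        using (FileStream stream = File.OpenRead("bigfile.bin"))
        {
            blob.UploadFromStream(stream, null, options, null);
        }
    }
}
```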
The zip you sent me didn't have an azurecopy.exe.Config file, so I copied the one I had into the directory. If that was incorrect, let me know. The transfer started; I'll let you know if/once it fails, @kpfaulkner. Thanks for all the support!
Correct, it didn't have the config file (I was thinking you could use the one you had). Sorry, I probably should have mentioned that :)
Have created another build with some better features :) In your app.config, can you add the following:

These values are options we can tinker with for connection timeouts. Please let me know if this is any use. Thanks
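The actual settings from that build aren't preserved above. Purely to illustrate the mechanism: app.config settings of this kind live in the appSettings section of azurecopy.exe.Config and are read at startup. The key names and values below are hypothetical placeholders, not azurecopy's real ones:

```xml
<!-- azurecopy.exe.Config: appSettings sketch; key names are hypothetical placeholders -->
<configuration>
  <appSettings>
    <!-- Hypothetical: per-request connection timeout, in seconds -->
    <add key="ConnectionTimeoutSeconds" value="600" />
    <!-- Hypothetical: how many times a failed transfer is retried -->
    <add key="MaxRetryAttempts" value="10" />
  </appSettings>
</configuration>
```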
Curiously, the command completed this time, but I don't think it properly copied the data. If the command completes with the
Was that with the first or second binary I uploaded to this thread? If it was the latest binary, did you see the message "New Batch" appearing a bunch of times?
It was the first binary. Should I try again with the latest one? @kpfaulkner
Have you tried the latest version (https://github.com/kpfaulkner/azurecopy/releases/tag/1.3.3)?