
Can't use multipart copy on files with certain characters #933

Closed
lonormaly opened this issue Mar 16, 2016 · 25 comments

Hello,

The MultipartCopy function doesn't seem to work on files whose names contain the following character: é

For example: BeyoncéIMG_18022016_111346.png
The same function works fine on files without the special character.

The response I get is:

2016-03-16 12:55:32 Info: initiateMultipartCopy::ArtworkId 1792 Multipart copy started #7
2016-03-16 12:55:33 Info: initiateMultipartCopy:: Uploading chunk {1}
2016-03-16 12:55:33 Info: initiateMultipartCopy::ArtworkId 1792 Multipart exception An exception occurred while uploading parts to a multipart upload. The following parts had errors:

- Part 1: Error executing "UploadPartCopy" on "https://s3.amazonaws.com/<bucket>/Beyonce%CC%81IMG_18022016_111346.png?partNumber=1&uploadId=<< upload id >>"; AWS HTTP error: Client error: 403 SignatureDoesNotMatch (client): The request signature we calculated does not match the signature you provided. Check your key and signing method. - <Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message><AWSAccessKeyId><< AWS key >></AWSAccessKeyId><StringToSign>AWS4-HMAC-SHA256
20160316T115533Z
20160316/us-east-1/s3/aws4_request
9fcdd9dfc5d588870d39159ecccc654624c75ebfe19a4eb9501f08d799e64c3c</StringToSign><SignatureProvided><< signature >></SignatureProvided><StringToSignBytes><< bytes >></StringToSignBytes><CanonicalRequest>PUT
/<bucket>/Beyonce%CC%81IMG_18022016_111346.png
partNumber=1&amp;uploadId=<< upload ID >>
host:s3.amazonaws.com
x-amz-content-sha256:<< hash >>
x-amz-copy-source:/niio.temp.media.files/<source bucket>/Beyonce??IMG_18022016_111346.png
x-amz-copy-source-range:bytes=0-120143
x-amz-date:20160316T115533Z

host;x-amz-content-sha256;x-amz-copy-source;x-amz-copy-source-range;x-amz-date
e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855</CanonicalRequest><CanonicalRequestBytes><< bytes >></CanonicalRequestBytes><RequestId><< request ID >></RequestId><HostId><< host ID >></HostId></Error>

Thanks,
Shai

jeskew commented Mar 16, 2016

What character encoding was used to create the path string? The x-amz-copy-source header uses rawurlencode to convert characters in an object key to URL-safe characters, and it's being rendered in the request as Beyonce??IMG_18022016_111346.png.
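
If you're not sure, a quick diagnostic (just a sketch; $sourceKey stands in for whatever key string you pass to the SDK) is to dump the raw bytes and the URL-encoded form:

$sourceKey = 'BeyoncéIMG_18022016_111346.png'; // stand-in for your actual key string
var_dump(bin2hex($sourceKey));      // the exact byte sequence PHP is holding
var_dump(rawurlencode($sourceKey)); // what ends up in the x-amz-copy-source header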

jeskew added the response-requested label Mar 17, 2016
@lonormaly

I'm not sure about the encoding; currently I have no control over the names of the files.

jeskew commented Mar 23, 2016

Can you check if 'Beyonce%CC%81IMG_18022016_111346.png' === rawurlencode(urldecode('Beyonce%CC%81IMG_18022016_111346.png')) on the affected system? I'm unable to reproduce locally, but that may be because I have the iconv and mbstring extensions installed.
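
For example, as a standalone check (the same comparison, just wrapped so it can be run directly):

$encoded = 'Beyonce%CC%81IMG_18022016_111346.png';
var_dump($encoded === rawurlencode(urldecode($encoded))); // prints bool(true) on my machines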

jeskew commented Mar 30, 2016

@lonormaly Were you able to test out the affected environment using the code in my previous comment?

jeskew commented Apr 4, 2016

I'm going to go ahead and close this issue, as it appears to be related to the system running PHP and not to the SDK. Please feel free to reopen if you have any questions or concerns.

jeskew closed this as completed Apr 4, 2016
@lonormaly

Hey, sorry for not responding.

Unfortunately this issue is being reproduced time and time again.

The file name is Beyoncé.jpg. With other files the function works perfectly...

Here is a gist of our function and the exception logs:

https://gist.github.com/lonormaly/60b5fb5a71f8e27194e142f810941b82

Thank you!

jeskew commented Apr 21, 2016

This still looks like an environment-specific error. Were you able to check your system as described above? Does running 'Beyonce%CC%81IMG_18022016_111346.png' === rawurlencode(urldecode('Beyonce%CC%81IMG_18022016_111346.png')) return true?

@lonormaly

true

jeskew commented Apr 21, 2016

What output do you get from running php -r "var_dump(get_loaded_extensions());"?

@lonormaly

array(59) {
[0]=>
string(4) "Core"
[1]=>
string(4) "date"
[2]=>
string(4) "ereg"
[3]=>
string(6) "libxml"
[4]=>
string(7) "openssl"
[5]=>
string(4) "pcre"
[6]=>
string(4) "zlib"
[7]=>
string(6) "bcmath"
[8]=>
string(3) "bz2"
[9]=>
string(8) "calendar"
[10]=>
string(5) "ctype"
[11]=>
string(3) "dba"
[12]=>
string(3) "dom"
[13]=>
string(4) "hash"
[14]=>
string(8) "fileinfo"
[15]=>
string(6) "filter"
[16]=>
string(3) "ftp"
[17]=>
string(7) "gettext"
[18]=>
string(3) "SPL"
[19]=>
string(5) "iconv"
[20]=>
string(8) "mbstring"
[21]=>
string(5) "pcntl"
[22]=>
string(7) "session"
[23]=>
string(5) "posix"
[24]=>
string(10) "Reflection"
[25]=>
string(8) "standard"
[26]=>
string(5) "shmop"
[27]=>
string(9) "SimpleXML"
[28]=>
string(4) "soap"
[29]=>
string(7) "sockets"
[30]=>
string(4) "Phar"
[31]=>
string(4) "exif"
[32]=>
string(7) "sysvmsg"
[33]=>
string(7) "sysvsem"
[34]=>
string(7) "sysvshm"
[35]=>
string(9) "tokenizer"
[36]=>
string(4) "wddx"
[37]=>
string(3) "xml"
[38]=>
string(9) "xmlreader"
[39]=>
string(9) "xmlwriter"
[40]=>
string(3) "zip"
[41]=>
string(3) "PDO"
[42]=>
string(4) "curl"
[43]=>
string(2) "gd"
[44]=>
string(4) "json"
[45]=>
string(6) "mcrypt"
[46]=>
string(8) "memcache"
[47]=>
string(5) "mysql"
[48]=>
string(6) "mysqli"
[49]=>
string(9) "pdo_mysql"
[50]=>
string(10) "pdo_sqlite"
[51]=>
string(8) "readline"
[52]=>
string(4) "sasl"
[53]=>
string(7) "sqlite3"
[54]=>
string(6) "xmlrpc"
[55]=>
string(3) "xsl"
[56]=>
string(8) "newrelic"
[57]=>
string(5) "mhash"
[58]=>
string(12) "Zend OPcache"
}

jeskew commented Apr 21, 2016

I'm still unable to reproduce. Copying a file with the exact name provided works fine for me. Is there a particular version of PHP you're seeing this error on?

@lonormaly

Nope. Did you use the function in the gist to copy the file?

Thanks
Shai

jeskew commented Apr 21, 2016

Sorry, what I meant to ask was: which version of PHP are you using on the system where the error is appearing? I am unable to reproduce on PHP 5.6.20 or PHP 7.0.5.

@lonormaly

PHP 5.5.9-1ubuntu4.14 (cli) (built: Oct 28 2015 01:34:46)
Copyright (c) 1997-2014 The PHP Group
Zend Engine v2.5.0, Copyright (c) 1998-2014 Zend Technologies
with Zend OPcache v7.0.3, Copyright (c) 1999-2014, by Zend Technologies

jeskew commented Apr 22, 2016

I'm unable to reproduce on PHP 5.5, either. The root cause of the issue is somewhere in how PHP is configured on your system, which is causing the character encoding of the provided string not to be handled correctly by rawurlencode. That is a native PHP function, so there really isn't anything the SDK can do about it.
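
For what it's worth, the key in your logs uses the decomposed form of "é" (an ASCII "e" followed by the combining accent U+0301, which is what the %CC%81 in the URL is). The same visible character can be two different byte sequences, and S3 treats them as two different keys, so the copy source has to use exactly the bytes the object was uploaded with. A plain-PHP illustration (not SDK code):

$precomposed = "\xC3\xA9";  // "é" as a single code point (U+00E9)
$decomposed  = "e\xCC\x81"; // "e" followed by a combining acute accent (U+0301)

var_dump($precomposed === $decomposed); // bool(false): different byte sequences
var_dump(rawurlencode($precomposed));   // string(6) "%C3%A9"
var_dump(rawurlencode($decomposed));    // string(7) "e%CC%81"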

lonormaly commented Apr 23, 2016

We're not doing anything special in the PHP installation.
Please try to copy a file named "Beyoncé.jpg" from one bucket to another using the specific gist function I attached, and let me know whether you're able to copy the file.
We really need this function to work. We received 'true' for the comparison you asked about.
We're using SDK version 3.9.2.

I'm pretty sure it has to do with either (a) the way we use the API or (b) a bug in the SDK.

Really appreciate the efforts!

jeskew commented Apr 23, 2016

Ah, I see that you're directly using the multipart copier. You will need to URL encode the source key. This is something that Aws\S3\S3Client::copy or an instance of Aws\S3\ObjectCopier would take care of for you.

The multipart copier and uploader take the source as a string and rely on the user (or a higher level of abstraction in the SDK) to take care of encoding. Altering this interface would be a breaking change.
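
Roughly, the two options look like this (an untested sketch; $sourceBucket, $sourceKey and the rest are placeholders for your own values, and I'm assuming a copy source of the form <bucket>/<key>):

// Option 1: keep using MultipartCopy directly, but build the source with a URL-encoded key
$source = $sourceBucket . '/' . rawurlencode($sourceKey);
// ...then pass $source to the MultipartCopy constructor exactly as you do in your gist

// Option 2: let the SDK handle the encoding (and the single- vs. multipart decision) for you
$s3->copy($sourceBucket, $sourceKey, $destinationBucket, $destinationKey, 'private');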

@lonormaly

It would be really helpful if you could provide the exact code I need to alter, if you don't mind. Thanks so much!

Thanks
Shai

jeskew commented Apr 23, 2016

The portion of the $tempSource variable that corresponds to the S3 key of the original source should be URL encoded using rawurlencode. If you would prefer to have this done on your behalf, you could use Aws\S3\S3Client::copy.

@lonormaly

Thanks for that. Wrapping $tempSource with rawurlencode returns an error from the MultipartCopy function:

[two screenshots of the resulting MultipartCopy error, taken 2016-04-24]

We really need the multipart capabilities. Does the ordinary copy implement multipart as well? We need to support really big files (up to 5 TB).

jeskew commented Apr 24, 2016

The source needs to be of the form <bucket>/<url-encoded key>. From the error you posted, it appears that you are URL encoding the entire parameter.

copy inspects the source and determines whether to use a single or multipart copy. It also determines what part size to use.

@lonormaly

We tried wrapping only the filename with rawurlencode, and we tried wrapping the key part as you suggested, with no success...

  1. When we tried to encode only the filename we got this error:
    2016-04-25 12:43:39 Error: [Aws\S3\Exception\S3Exception] Error executing "HeadObject" on "https://s3.amazonaws.com//Local/Artworks/Artwork1812/us-east-1%3Add906849-5f27-4ca4-bbf7-56fbacda93c3/Beyonce%25CC%2581.jpg"; AWS HTTP error: Client error: 404 NotFound (client): 404 Not Found (Request-ID: BF7FEDF0952042E6)
    Stack Trace:
    #0 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/WrappedHttpHandler.php(76): Aws\WrappedHttpHandler->parseError(Array, Object(GuzzleHttp\Psr7\Request), Object(Aws\Command))
    #1 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(199): Aws\WrappedHttpHandler->Aws{closure}(Array)
    #2 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(170): GuzzleHttp\Promise\Promise::callHandler(2, Array, Array)
    #3 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/RejectedPromise.php(40): GuzzleHttp\Promise\Promise::GuzzleHttp\Promise{closure}(Array)
    #4 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/TaskQueue.php(60): GuzzleHttp\Promise\RejectedPromise::GuzzleHttp\Promise{closure}()
    #5 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/guzzle/src/Handler/CurlMultiHandler.php(96): GuzzleHttp\Promise\TaskQueue->run()
    #6 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/guzzle/src/Handler/CurlMultiHandler.php(123): GuzzleHttp\Handler\CurlMultiHandler->tick()
    #7 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(240): GuzzleHttp\Handler\CurlMultiHandler->execute(true)
    #8 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(217): GuzzleHttp\Promise\Promise->invokeWaitFn()
    #9 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(261): GuzzleHttp\Promise\Promise->waitIfPending()
    #10 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(219): GuzzleHttp\Promise\Promise->invokeWaitList()
    #11 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(261): GuzzleHttp\Promise\Promise->waitIfPending()
    #12 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(219): GuzzleHttp\Promise\Promise->invokeWaitList()
    #13 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(62): GuzzleHttp\Promise\Promise->waitIfPending()
    #14 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/AwsClient.php(202): GuzzleHttp\Promise\Promise->wait()
    #15 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/AwsClient.php(167): Aws\AwsClient->execute(Object(Aws\Command))
    #16 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartCopy.php(165): Aws\AwsClient->__call('headObject', Array)
    #17 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartCopy.php(165): Aws\S3\S3Client->headObject(Array)
    #18 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartCopy.php(140): Aws\S3\MultipartCopy->fetchSourceMetadata()
    #19 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartCopy.php(134): Aws\S3\MultipartCopy->getSourceMetadata()
    #20 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartUploadingTrait.php(75): Aws\S3\MultipartCopy->getSourceSize()
    #21 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/Multipart/AbstractUploadManager.php(220): Aws\S3\MultipartCopy->determinePartSize()
    #22 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/Multipart/AbstractUploadManager.php(60): Aws\Multipart\AbstractUploadManager->determineState()
    #23 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartCopy.php(57): Aws\Multipart\AbstractUploadManager->__construct(Object(Aws\S3\S3Client), Array)
  2. When we tried to encode the whole key, without the bucket, we got the same error:
    2016-04-25 12:46:04 Error: [Aws\S3\Exception\S3Exception] Error executing "HeadObject" on "https://s3.amazonaws.com//Local%252FArtworks%252FArtwork1813%252Fus-east-1%253Add906849-5f27-4ca4-bbf7-56fbacda93c3%252FBeyonce%25CC%2581.jpg"; AWS HTTP error: Client error: 404 NotFound (client): 404 Not Found (Request-ID: E3700F6340C48952)

Stack Trace:
#0 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/WrappedHttpHandler.php(76): Aws\WrappedHttpHandler->parseError(Array, Object(GuzzleHttp\Psr7\Request), Object(Aws\Command))
#1 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(199): Aws\WrappedHttpHandler->Aws{closure}(Array)
#2 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(170): GuzzleHttp\Promise\Promise::callHandler(2, Array, Array)
#3 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/RejectedPromise.php(40): GuzzleHttp\Promise\Promise::GuzzleHttp\Promise{closure}(Array)
#4 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/TaskQueue.php(60): GuzzleHttp\Promise\RejectedPromise::GuzzleHttp\Promise{closure}()
#5 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/guzzle/src/Handler/CurlMultiHandler.php(96): GuzzleHttp\Promise\TaskQueue->run()
#6 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/guzzle/src/Handler/CurlMultiHandler.php(123): GuzzleHttp\Handler\CurlMultiHandler->tick()
#7 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(240): GuzzleHttp\Handler\CurlMultiHandler->execute(true)
#8 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(217): GuzzleHttp\Promise\Promise->invokeWaitFn()
#9 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(261): GuzzleHttp\Promise\Promise->waitIfPending()
#10 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(219): GuzzleHttp\Promise\Promise->invokeWaitList()
#11 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(261): GuzzleHttp\Promise\Promise->waitIfPending()
#12 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(219): GuzzleHttp\Promise\Promise->invokeWaitList()
#13 /Users/shaisnir/Development/Niio/Server/app/vendor/guzzlehttp/promises/src/Promise.php(62): GuzzleHttp\Promise\Promise->waitIfPending()
#14 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/AwsClient.php(202): GuzzleHttp\Promise\Promise->wait()
#15 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/AwsClient.php(167): Aws\AwsClient->execute(Object(Aws\Command))
#16 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartCopy.php(165): Aws\AwsClient->__call('headObject', Array)
#17 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartCopy.php(165): Aws\S3\S3Client->headObject(Array)
#18 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartCopy.php(140): Aws\S3\MultipartCopy->fetchSourceMetadata()
#19 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartCopy.php(134): Aws\S3\MultipartCopy->getSourceMetadata()
#20 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartUploadingTrait.php(75): Aws\S3\MultipartCopy->getSourceSize()
#21 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/Multipart/AbstractUploadManager.php(220): Aws\S3\MultipartCopy->determinePartSize()
#22 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/Multipart/AbstractUploadManager.php(60): Aws\Multipart\AbstractUploadManager->determineState()
#23 /Users/shaisnir/Development/Niio/Server/app/vendor/aws/aws-sdk-php/src/S3/MultipartCopy.php(57): Aws\Multipart\AbstractUploadManager->__construct(Object(Aws\S3\S3Client), Array)
#24 /Users/shaisnir/Development/Niio/Server/app/Controller/AppController.php(358): Aws\S3\MultipartCopy->__construct(Object(Aws\S3\S3Client), '/niio.temp.medi...', Array)

The main reason we chose MultipartCopy over copy is its ability to retry, resume, and retain its progress. Is that possible with the regular copy?

jeskew commented Apr 25, 2016

copy takes care of URL encoding. You should not encode the key if you're using the copy method, only if you're directly creating a multipart copy object.

@lonormaly

But we can't use multipart copy with the encoding solution you suggested. Using copy would demand a major change in our flow, and it's a sensitive part of our code; also, the copy function doesn't support resuming from the same place.
Do you have any suggestion on how to properly use the encoding solution you suggested? Why doesn't MultipartCopy support encoding just like the copy function?

jeskew commented Apr 25, 2016

MultipartCopy is meant to be a low-level abstraction over the different requests that make up a multipart copy. It assumes that the copy source passed to its constructor is the one you mean to send to S3. If the MultipartCopy constructor suddenly started URL encoding portions of the source path, then any code passing in a properly encoded path would break. For example, the string 'Beyonce%CC%81IMG_18022016_111346.png' would be encoded as 'Beyonce%25CC%2581IMG_18022016_111346.png', as the percent sign is a character that needs to be URL encoded.

If you're getting the path in the form of <bucket>/<key>, then you can encode the key portion of the path in this manner:

list($bucket, $key) = explode('/', $path, 2);
$path = $bucket . '/' . rawurlencode($key);

If the path also contains a version ID, you would also need to separate the query string from the key.
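
For example, something along these lines (a rough sketch, assuming the version ID is appended to the path as ?versionId=...):

list($bucket, $rest) = explode('/', $path, 2);
$query = '';
if (false !== ($pos = strpos($rest, '?'))) {
    $query = substr($rest, $pos);  // e.g. "?versionId=abc123"
    $rest  = substr($rest, 0, $pos);
}
$path = $bucket . '/' . rawurlencode($rest) . $query;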

copy accepts the key, bucket, and version ID as separate parameters, which is why it's able to encode everything on your behalf. You can still catch a multipart exception when using copy, just as you were doing with the low-level MultipartCopy object:

try {
    $response = $s3->copy($sourceBucket, $sourceKey, $destinationBucket, $destinationKey, $acl);
} catch (\Aws\S3\Exception\S3Exception $e) {
    // This exception is thrown if a single-part copy fails
} catch (\Aws\Exception\MultipartUploadException $e) {
    // This exception is thrown if a multipart copy fails
    $response = $s3->copy($sourceBucket, $sourceKey, $destinationBucket, $destinationKey, $acl, [
        'state' => $e->getState(),
    ]);
}

jeskew removed the response-requested label Apr 25, 2016