fix(core): S3 copy object with foreign characters #17151
Conversation
Walkthrough
The changes involve a modification to `libs/nest/aws/src/lib/s3.service.ts`: the `CopySource` parameter of the copy operation is now wrapped in `encodeURIComponent` so that object keys containing foreign characters can be copied.
Datadog Report
All test runs ✅ 34 Total — Test Services: 0 Failed, 32 Passed
This report shows up to 10 services.
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (2)
libs/nest/aws/src/lib/s3.service.ts (2)
`66-66`: **LGTM! Proper encoding for S3 CopySource**

The addition of `encodeURIComponent` is the correct solution for handling foreign characters in S3 copy operations. This follows AWS best practices for handling special characters in S3 object keys. Consider adding a code comment explaining why the encoding is necessary, for future maintainers:

```diff
-      CopySource: encodeURIComponent(copySource),
+      // URI encode to handle foreign/special characters in S3 object keys
+      CopySource: encodeURIComponent(copySource),
```
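For context, here is a minimal sketch of how such a service method might look with the suggested comment in place; the class and method shape below are assumptions for illustration, not the actual contents of `s3.service.ts`:

```typescript
import { S3Client, CopyObjectCommand } from '@aws-sdk/client-s3'

// Hypothetical shape of the service method, for illustration only.
export class S3CopyExample {
  constructor(private readonly client: S3Client) {}

  async copyObject(bucket: string, key: string, copySource: string) {
    return this.client.send(
      new CopyObjectCommand({
        Bucket: bucket,
        Key: key,
        // URI encode to handle foreign/special characters in S3 object keys
        CopySource: encodeURIComponent(copySource),
      }),
    )
  }
}
```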
Line range hint `1-275`: **Implementation follows best practices**

The S3Service implementation adheres to the coding guidelines:

- ✓ Properly typed with TypeScript
- ✓ Reusable across different NextJS apps
- ✓ Follows effective tree-shaking practices with named imports

Consider extracting the S3 URI encoding logic into a separate utility function if similar encoding needs are identified in other operations. This would promote code reuse and consistency across the service.
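If that extraction is done, a helper along these lines could work; the name `encodeCopySource` and its placement are hypothetical, and it simply applies the same `encodeURIComponent` call as the diff above to a combined bucket/key path:

```typescript
// Hypothetical helper (not part of this PR): builds a URL-encoded
// CopySource value ("bucket/key") for CopyObjectCommand so that
// foreign/special characters in object keys are accepted by S3.
export const encodeCopySource = (bucket: string, key: string): string =>
  encodeURIComponent(`${bucket}/${key}`)

// Usage sketch:
// new CopyObjectCommand({
//   Bucket: targetBucket,
//   Key: targetKey,
//   CopySource: encodeCopySource(sourceBucket, sourceKey),
// })
```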
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
📒 Files selected for processing (1)
libs/nest/aws/src/lib/s3.service.ts (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
libs/nest/aws/src/lib/s3.service.ts (1)
Pattern `libs/**/*`: "Confirm that the code adheres to the following:
- Reusability of components and hooks across different NextJS apps.
- TypeScript usage for defining props and exporting types.
- Effective tree-shaking and bundling practices."
🔇 Additional comments (1)
libs/nest/aws/src/lib/s3.service.ts (1)
Line range hint `63-75`: **Verify similar S3 operations**

While the fix for `copyObject` is correct, let's verify if other S3 operations in the service might need similar encoding for consistency.
✅ Verification successful
No encoding needed for other S3 operations

The `encodeURIComponent` is only necessary for the `CopySource` parameter in the `CopyObjectCommand`, as it represents a URL path that may contain special characters. The other S3 operations found in the service (`GetObjectCommand`, `HeadObjectCommand`, `DeleteObjectCommand`) only use `Bucket` and `Key` parameters, which don't require URL encoding because they are simple identifiers.
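As a rough illustration of that distinction (bucket names, keys, and the surrounding function are made up for this sketch):

```typescript
import {
  S3Client,
  CopyObjectCommand,
  GetObjectCommand,
} from '@aws-sdk/client-s3'

async function illustrateEncoding() {
  const client = new S3Client({})

  // CopySource is a "bucket/key" path sent as part of the request,
  // so non-ASCII characters need to be percent-encoded.
  await client.send(
    new CopyObjectCommand({
      Bucket: 'target-bucket',
      Key: 'skjal.pdf',
      CopySource: encodeURIComponent('source-bucket/skjöl/ísland.pdf'),
    }),
  )

  // Bucket and Key are plain parameters handled by the SDK, so no manual
  // encoding is needed here.
  await client.send(
    new GetObjectCommand({
      Bucket: 'source-bucket',
      Key: 'skjöl/ísland.pdf',
    }),
  )
}
```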
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Search for other S3 operations that might need encoding
ast-grep --pattern 'new $_(Command|Upload) ({
  $$$
  Key: $_,
  $$$
})'
```
Length of output: 75
Script:

```shell
#!/bin/bash
# Let's try a different pattern to search for S3 operations
rg "new \w+(Command|Upload)" -A 5 libs/nest/aws/src/lib/s3.service.ts
```
Length of output: 853
* Testing overwriting the metadata
* chore: nx format:write update dirty files
* Found a cleaner work around

---------

Co-authored-by: andes-it <builders@andes.is>
Co-authored-by: kodiakhq[bot] <49736102+kodiakhq[bot]@users.noreply.github.com>
Encode the URI to allow S3 to upload unicode characters
What
An error is thrown when uploading a file with foreign characters in its name; this is a known issue with the copy feature of S3 in the AWS SDK v3.
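To make that concrete, this is roughly what the encoding does to a copy source containing Icelandic characters (the path below is made up for illustration):

```typescript
const copySource = 'my-bucket/skjöl/ísland.pdf'

// Without encoding, the raw non-ASCII characters in the copy source make the
// CopyObject request fail; percent-encoding produces a value S3 accepts.
console.log(encodeURIComponent(copySource))
// -> "my-bucket%2Fskj%C3%B6l%2F%C3%ADsland.pdf"
```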
Why
So that uploading and copying files whose names contain foreign characters works without errors.
Screenshots / Gifs
Attach Screenshots / Gifs to help reviewers understand the scope of the pull request
Checklist:
Summary by CodeRabbit