S3 Speed Up #16438
Conversation
…in an effort to dramatically speed up S3 response times.
Codecov Report
Patch coverage has no change; project coverage changed by -0.07%.
Additional details and impacted files:

```
@@             Coverage Diff             @@
##                3.x   #16438      +/-  ##
============================================
- Coverage     21.73%   21.67%    -0.07%
- Complexity   10482    20994    +10512
============================================
  Files          561     1122      +561
  Lines        31612    63386    +31774
============================================
+ Hits          6872    13739     +6867
- Misses       24740    49647    +24907
```

View full report in Codecov by Sentry.
I don't have access to an S3 bucket; is there another way I can test this?
@theboxer can you review this?
@JoshuaLuckers I believe Amazon offers a free tier: https://aws.amazon.com/free/storage/s3/
* remove functions that require PHP to interact with objects directly, in an effort to dramatically speed up S3 response times
* fix PHPCS
* fix type conflict
What does it do?
Removes the functions that process each file individually when listing files (e.g. lastModified, fileSize, mimeType).
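For illustration only, here is a minimal sketch of the before/after pattern this PR targets, assuming a league/flysystem-aws-s3-v3 setup (MODX 3.x uses Flysystem); the bucket name, region, and paths are placeholders, and this is not the PR's actual diff:

```php
<?php
// Sketch: contrasting per-file metadata calls with listing-only metadata.
// Assumes league/flysystem-aws-s3-v3; 'my-test-bucket' and 'assets/' are
// hypothetical placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use League\Flysystem\AwsS3V3\AwsS3V3Adapter;
use League\Flysystem\Filesystem;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // assumption: adjust to your bucket's region
]);
$filesystem = new Filesystem(new AwsS3V3Adapter($client, 'my-test-bucket'));

// SLOW: each call below issues an extra HTTP request (HeadObject) per file,
// on top of the listing itself — an N+1 request pattern.
foreach ($filesystem->listContents('assets/', false) as $item) {
    if ($item->isFile()) {
        $size     = $filesystem->fileSize($item->path());     // extra request
        $modified = $filesystem->lastModified($item->path()); // extra request
        $mime     = $filesystem->mimeType($item->path());     // extra request
    }
}

// FAST: reuse the attributes the ListObjectsV2 response already carries.
foreach ($filesystem->listContents('assets/', false) as $item) {
    if ($item->isFile()) {
        $size     = $item->fileSize();     // populated from the listing
        $modified = $item->lastModified(); // populated from the listing
        // Note: S3 listings don't include a MIME type; fetching one would
        // require a per-file request, which is exactly what the PR removes.
    }
}
```

With hundreds of files, the slow loop issues hundreds of sequential HeadObject requests, which is why listing time grows with folder size.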
Why is it needed?
Currently, listing an S3 bucket will time out as the contents of your folders grow, because each listed file triggers additional per-object requests. Removing those calls dramatically improves response times; this has been tested on folders containing hundreds of files.
How to test
The best way to test is with an S3 bucket containing a large number of sample files. File size doesn't matter, only the file count: as the number of files grows, so does the response time. A sketch for seeding such a bucket follows.
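As a hypothetical helper (not part of the PR), a bucket could be seeded with many small objects using the AWS SDK for PHP; the bucket name, key prefix, and file count below are assumptions:

```php
<?php
// Sketch: seed a test bucket with many tiny files so the listing slowdown
// is observable. 'my-test-bucket' and the 'assets/' prefix are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // assumption: adjust to your bucket's region
]);

for ($i = 1; $i <= 500; $i++) {
    $client->putObject([
        'Bucket' => 'my-test-bucket',
        'Key'    => sprintf('assets/sample-%03d.txt', $i),
        'Body'   => 'test', // content is irrelevant; only the count matters
    ]);
}
```

Then browse the seeded folder through the media source before and after the patch and compare response times.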
Related issue(s)/PR(s)
No related public issues; just personal experience.