Make object upload size limit at bucket level configurable #134
Comments
@kiwicopple, @thebengeu, @alaister, any comments?
Hey @rahul3v, We discussed this in our most recent storage meeting, and we think it's a great idea! I've added it to our internal to-do list but can't give you any timeline on when this might be implemented. Thanks again, and as always, PRs are welcome :)
@alaister, if possible, extend it to the folder level too. It doesn't need to be deep: one folder level would be OK, or two at most :)
This would be awesome. Having the max upload size for all buckets set to a single value is rough.
This is now being shipped with #277
Is this functionality ready to use now? If so, how do we use it?
@fenos Related to this: could we define a max for the bucket itself? (I want to set a maximum for the entire bucket, i.e. the sum of all files inside it.)
Hi @inian.
Hi @haexhub, that is exactly what the feature does - https://supabase.com/docs/guides/storage/buckets/creating-buckets#restricting-uploads
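For anyone wondering how to use it: a minimal sketch with supabase-js v2. The bucket name, limit, URL, and key below are placeholders, not anything prescribed by the docs; `fileSizeLimit` caps the size of each individual upload to that bucket.

```typescript
// Sketch: create a bucket with a per-file upload size limit (supabase-js v2).
// '<project-url>' and '<anon-key>' are placeholders for your own project values.
import { createClient } from '@supabase/supabase-js'

const supabase = createClient('<project-url>', '<anon-key>')

// Uploads larger than 1 MB to this bucket are rejected server-side.
const { data, error } = await supabase.storage.createBucket('avatars', {
  public: false,
  fileSizeLimit: 1024 * 1024, // bytes
})
```

The same option can also be changed later on an existing bucket via the dashboard.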
Hi @inian, I am looking for a maxBucketSize flag, as opposed to setting a limit for each file.
So, in the example above, if the user uploaded 1000 images of 1 MB each, that would consume the limit.
Ah, you want a limit on the total size of the files in the bucket. What is your use case for this feature?
@inian I want to set an upper limit primarily to control cost. Each user in my system gets a bucket, and I need some restriction on how much they can upload, with the ability to change the bucket size depending on whether they are paid users or not. Having no restrictions carries significant cost risk. This could potentially be solved with RLS; the only issue is that I would need to tell the user they exceeded the limit in an error message, which is not possible with RLS.
Exactly. It's not just the cost itself; it's also the limit of available space. We always have a limited amount of disk space, and I want to be able to share it evenly. I want to give each user a bucket of a specific size so I can divide the available space evenly and not have one user consuming it all.
Since we can use SQL to operate on storage buckets, one can create insert/delete object triggers that aggregate the total size of objects in a bucket. Then, if necessary, the maxFileSize can be set to min(maxFileSize, maxBucketSize - totalFileSize). This prevents users from uploading files that would exceed the max bucket size. I haven't implemented this logic myself, but it sounds reasonable :-)
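The clamping rule suggested above can be sketched as a small function (names like `maxFileSize` and `maxBucketSize` are illustrative; Storage itself only enforces a per-file limit):

```typescript
// Largest upload still allowed without exceeding the bucket cap:
// min(maxFileSize, maxBucketSize - totalFileSize), floored at zero.
function effectiveFileLimit(
  maxFileSize: number,
  maxBucketSize: number,
  totalFileSize: number,
): number {
  const remaining = Math.max(0, maxBucketSize - totalFileSize)
  return Math.min(maxFileSize, remaining)
}
```

In the trigger-based approach, the result would be written back as the bucket's per-file limit after every insert/delete.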
Feature request
Make the object upload size limit configurable at the bucket level (even better if you extend it to the folder level).
Is your feature request related to a problem? Please describe.
It would help limit users to uploading files of a limited size to a bucket/folder (for example, profile pics need at most 1 MB).
Since a frontend file size check can be bypassed with some tools, it would be better to have checks at the backend before the upload.
Even with a configurable global max file size, that will not help much: it allows a user to upload at that max limit, which per the project/product structure might not be needed everywhere, and a user might fill the storage bucket with raw data.
Describe the solution you'd like
storage.filesize()
When a user uploads a file to a bucket, I want to be able to check the file size on the backend side, before it lands in the bucket.
Let's say your limit is 50 MB.
Then I could configure each of my buckets anywhere between 0 and 50 MB, as my project needs.
That would help me use my storage precisely, within my storage volume limits.
Describe alternatives you've considered
Otherwise, we have to use a server function just for this every time before uploading a file to a storage bucket, doing the same thing twice (one limit check for my project's buckets running in my server code, and another limit check running in your (Supabase) code).
That defeats the real benefit of the Supabase Storage API: uploading directly from the client side.
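The server-function workaround described above amounts to a pre-upload check like the following sketch; the bucket names and limits are hypothetical application config, not part of any Supabase API:

```typescript
// Hypothetical per-bucket limits enforced in your own server code
// before forwarding the upload to Storage.
const bucketLimits: Record<string, number> = {
  avatars: 1 * 1024 * 1024,    // 1 MB
  documents: 20 * 1024 * 1024, // 20 MB
}

// Returns true when the upload is within the bucket's limit
// (buckets without a configured limit are unrestricted).
function withinLimit(bucket: string, sizeInBytes: number): boolean {
  const limit = bucketLimits[bucket]
  return limit === undefined || sizeInBytes <= limit
}
```

With a native per-bucket limit, this duplicate check (and the extra server round-trip) goes away.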