File and dataset limits: Add a programmatic way to limit file size and dataset size #3939
@CCMumma thanks for the suggestion. I think I'm being tripped up by the word "programmatic", which to me means that a sysadmin can change the settings using a script written in Python or whatever. This is already possible. Currently, all limits affect the entire installation of Dataverse. Are you saying that you'd like limits to apply to sub-dataverses? That is, that https://dataverse.tdl.org/dataverse/utexas and https://dataverse.tdl.org/dataverse/tamu might have different limits?
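For reference, a minimal sketch of what that scripted approach looks like today, assuming the admin API is reachable from localhost (admin endpoints are typically blocked from outside the server) and the default app server port. `:MaxFileUploadSizeInBytes` is the documented installation-wide file size limit setting; the base URL here is an assumption for illustration.

```python
# Minimal sketch: set the installation-wide file upload limit via the
# Dataverse admin settings API. Assumes the admin API is reachable on
# localhost (it is usually blocked from external hosts).
import requests

BASE = "http://localhost:8080"  # assumption: default app server port
LIMIT_BYTES = 2 * 1024**3       # 2 GB

# :MaxFileUploadSizeInBytes is the documented installation-wide limit.
resp = requests.put(
    f"{BASE}/api/admin/settings/:MaxFileUploadSizeInBytes",
    data=str(LIMIT_BYTES),
)
resp.raise_for_status()
print(resp.json())  # echoes the stored setting on success
```

Note this applies to the whole installation, which is exactly the limitation being discussed: there is no per-dataverse knob here.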
I discussed this with @CCMumma at the Community Meeting. I didn't realize she wasn't aware of the file limiting that is available, but more generally we discussed dataset limits. Dataset limits are currently unavailable, i.e., you could upload as many 2 GB files as you'd like to one dataset. I'm fairly certain that is what is being asked for here, but @CCMumma can confirm. :)
It is, thanks... and specifically having the ability to set the limits in the GUI. Sorry, I wasn't aware of the Python file limiting capability.
To help highlight the fact that you can set limits, perhaps we should plug this in the new "Going Live: Launching Your Production Deployment" section I recently added to the Installation Guide. For now, you can preview it here: https://github.com/IQSS/dataverse/blob/9708f57ad970abb2b068e08d926dc9c318bdd8aa/doc/sphinx-guides/source/installation/config.rst#going-live-launching-your-production-deployment

#938 is about setting storage quotas per user or group. Dataverse doesn't currently support any concept of storage quotas at any level (user, group, dataverse, dataset, etc.).
Related: #4339 |
Related:
Also, thank you @CCMumma for bringing quotas up at the community call this week! https://docs.google.com/document/d/15yUGslKMUr4QCxGFuEsRyd2FdRoPWv3OzxZcVe5yEbs/edit?usp=sharing |
I would keep an eye on this: |
Just merged: |
2024/09/09: Closing; this is currently addressed with the collection size limiting functionality.
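For anyone landing here later, a hedged sketch of that collection-level approach. The endpoint path follows my reading of the collection storage quota API added in recent Dataverse releases; the base URL, token, and alias below are placeholders, and the exact path should be verified against the Native API guide for the version you run.

```python
# Hedged sketch: set a per-collection (dataverse) storage quota.
# Endpoint path is an assumption based on recent Dataverse releases;
# double-check it against the Native API guide for your version.
import requests

BASE = "https://dataverse.example.edu"  # hypothetical installation URL
API_TOKEN = "xxxxxxxx-xxxx-xxxx"        # placeholder superuser API token
COLLECTION = "utexas"                   # collection alias from the thread
QUOTA_BYTES = 10 * 1024**3              # 10 GB, matching TDR's dataset cap

resp = requests.put(
    f"{BASE}/api/dataverses/{COLLECTION}/storage/quota/{QUOTA_BYTES}",
    headers={"X-Dataverse-key": API_TOKEN},
)
resp.raise_for_status()
print(resp.json())
```

A quota on a collection caps total storage across its datasets, which covers the per-dataverse case asked about earlier in the thread, though it is not a per-dataset limit.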
TDR has set file (2 GB) and dataset (10 GB) size limits for the time being, but we can only enforce them manually. It would be good if these limits could be set programmatically by admins. I am interested in knowing whether others are interested in this functionality and hope it can make its way to the roadmap at some point. We can probably commit some resources to helping out with use cases, docs, code and/or review.