Investigate performance of the dataset page on datasets with hundreds of files #4173
Comments
Another symptom: on a dataset with many files that has slow performance, clicking between the tabs (Files, Metadata, Terms, Versions) is really slow, with a spinner appearing on each click, almost as if it were reloading the file list.
@kcondon:
OK, S3 image test with 91 images now loads in 5 seconds, much better than the previous 69s.
Per @djbrooke's suggestion, here are links to the top 5 datasets with the most files in Harvard Dataverse. Could be helpful for testing these performance improvements. Note that all but #5 are visible only to a superuser account.
Noticed this while working on #4091: I tried a few datasets with large numbers of files in production, and all took a very long time to load; I encountered a couple of datasets that would not load at all, resulting in 500 errors. This may be somewhat urgent. The performance appeared to be worse on draft versions (the "read-only mode" vs. full database retrieval?). So, in practical terms, it may be becoming impossible for some authors to manage their datasets.
Received an independent report from Sonia last night about a 500 error on a specific dataset (https://dataverse.harvard.edu/dataset.xhtml?persistentId=hdl:1902.1/00097-8).
I haven't found anything specific yet. My best guess is that we are doing some inefficient/unnecessary database lookups, possibly on something that keeps growing (for example, the already existing guestbook responses?). That would explain why the performance is getting worse over time, and why we haven't been observing it on the dev systems.
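To make that guess concrete, here is a minimal, hypothetical JPA sketch of the kind of pattern that would behave this way. This is not Dataverse's actual code; the `GuestbookResponse` entity and `dataFile` field names are assumptions for illustration. Issuing one COUNT query per file costs N database round trips for a dataset with N files, and each query gets slower as the responses table grows, while a single GROUP BY aggregate stays one round trip:

```java
import javax.persistence.EntityManager;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical illustration of the suspected anti-pattern vs. a batched query.
// Entity/field names (GuestbookResponse, dataFile) are assumptions, not Dataverse code.
public class GuestbookCountsSketch {

    // Slow: one COUNT query per file; N round trips for N files.
    static Map<Long, Long> countPerFile(EntityManager em, List<Long> fileIds) {
        Map<Long, Long> counts = new HashMap<>();
        for (Long id : fileIds) {
            Long c = em.createQuery(
                    "SELECT COUNT(r) FROM GuestbookResponse r WHERE r.dataFile.id = :id",
                    Long.class)
                .setParameter("id", id)
                .getSingleResult();
            counts.put(id, c);
        }
        return counts;
    }

    // Faster: a single aggregate query grouped by file, regardless of file count.
    static Map<Long, Long> countAllAtOnce(EntityManager em, List<Long> fileIds) {
        Map<Long, Long> counts = new HashMap<>();
        List<Object[]> rows = em.createQuery(
                "SELECT r.dataFile.id, COUNT(r) FROM GuestbookResponse r "
              + "WHERE r.dataFile.id IN :ids GROUP BY r.dataFile.id",
                Object[].class)
            .setParameter("ids", fileIds)
            .getResultList();
        for (Object[] row : rows) {
            counts.put((Long) row[0], (Long) row[1]);
        }
        return counts;
    }
}
```

One way to confirm or rule this out would be to watch the database logs (or turn up the JPA provider's SQL logging) while loading one of the large datasets and look for a burst of near-identical per-file queries.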