failed to parse error when uploading large file with arbitrary tags #15
Comments
This is a genuine bug caused by limitations in the automatic mapping of the file size field in Elasticsearch. We'll have to fix this.
You may be able to upload the data you requested by splitting the single large file into smaller ones — say 10x 500 MB files instead of 1x 10 GB file, for example.
https://www.elastic.co/guide/en/elasticsearch/reference/current/number.html It seems that `integer` is a 32-bit integer; we need to evaluate the rest of the mapping to see whether a 64-bit integer is required for other parts of the model.
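A quick arithmetic check (an illustration, not project code) shows why the upload in this report fails: Elasticsearch's `integer` type is a signed 32-bit value, and a ~3 GB file size exceeds its maximum.

```python
# Illustration: a 3 GiB file size overflows Elasticsearch's `integer` type,
# which is a signed 32-bit value with a maximum of 2**31 - 1.
ES_INTEGER_MAX = 2**31 - 1        # 2147483647, roughly 2 GiB
file_size = 3 * 1024**3           # ~3 GiB, as created by `fallocate -l 3g`

# The reported file size cannot be stored in a 32-bit `integer` mapping.
print(file_size > ES_INTEGER_MAX)  # → True
```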
This should make the size field of a file a `long` in Elasticsearch, matching the peewee `BigIntegerField` type.
Pacifica Metadata version
master
Pacifica Core Software versions
master
Platform Details
dev
Scenario:
Upload a large file with arbitrary tags included.
Steps to Reproduce:
1. Create a 3 GB file with fallocate
2. Upload it
Expected Result:
File is uploaded and accessible via the status tool.
Actual Result:
The "transaction pending" state on the uploader page never ended, and the status page did not show the file as available. The file did reach the ingester and was moved to backend storage. See the attached stack dump and metadata files.
stack_dump.txt
metadata.txt