
"failed to parse" error when uploading large file with arbitrary tags #15

Closed
czebotar opened this issue Apr 21, 2017 · 3 comments

@czebotar

Pacifica Metadata version

master

Pacifica Core Software versions

master

Platform Details

dev

Scenario:

Upload a large file as a test, with arbitrary tags included

Steps to Reproduce:

fallocate a 3 GB file
upload it

Expected Result:

File uploaded and accessible by status tool

Actual Result:

"transaction pending" from the uploader page never ended, status page did not show file as avaiable. File did reach ingester and was moved to the backend storage. See attached stack dump and metadata files.
stack_dump.txt
metadata.txt

@dmlb2000
Member

This is a good bug; it runs into the limitations of Elasticsearch's automatic mapping of the file size field. We'll have to fix this.

@dmlb2000
Member

You may be able to upload the data you requested by splitting the single large file into smaller ones: say, twenty 500 MB files instead of one 10 GB file, for example.

@dmlb2000
Member

https://www.elastic.co/guide/en/elasticsearch/reference/current/number.html

Seems like integer is a signed 32-bit integer (max 2,147,483,647, i.e. about 2 GiB), which a 3 GB file size overflows. We need to evaluate the rest of the mapping to see if a 64-bit integer (long) is required for other parts of the model.

dmlb2000 added a commit to dmlb2000/pacifica-metadata that referenced this issue Apr 24, 2017
This should make the size field of a file a long for Elasticsearch, matching the peewee BigIntegerField type.
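The fix described above might correspond to an explicit mapping along these lines (the field name "size" and surrounding structure are assumptions here; only the switch to the long type is stated in the commit):

```python
# Hypothetical Elasticsearch mapping fragment: map the file size as a
# 64-bit "long" instead of the 32-bit "integer" a dynamic mapping may pick.
files_mapping = {
    "properties": {
        "size": {"type": "long"},  # max 2**63 - 1, comfortably holds 3 GB
    }
}

ES_INTEGER_MAX = 2**31 - 1        # 2,147,483,647 bytes, ~2 GiB
three_gb = 3 * 1024**3            # 3,221,225,472 bytes
assert three_gb > ES_INTEGER_MAX  # why the 3 GB upload failed to parse
```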