
Sequence 21 broken: "Missing prediction files" error when submitting to completion benchmark on CodaLab #59

Closed
risteon opened this issue Sep 15, 2020 · 6 comments

risteon (Contributor) commented Sep 15, 2020

Hi @jbehley,

I'm using the updated scene completion data V1.1 (downloaded September 10th, and verified with a fresh download just now).
However, when submitting my prediction files to CodaLab, I get the following Python exception pointing to missing files:

```
WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
Traceback (most recent call last):
  File "/tmp/codalab/tmpI8vQas/run/program/evaluate_completion.py", line 150, in <module>
    if missing_pred_files: raise RuntimeError("Error: Missing prediction files! Aborting evaluation.")
RuntimeError: Error: Missing prediction files! Aborting evaluation.
```

I checked my submission file beforehand with the semantic-kitti-api validate_submission.py script, and it reports that everything is okay and ready for submission (a 1:1 correspondence between input and prediction files).
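
For reference, here is a minimal sketch of the kind of 1:1 check I mean (the roots and the sequences/<seq>/predictions layout are placeholders here, not the actual validate_submission.py logic):

```python
# Minimal sketch of a 1:1 input/prediction check.
# Both roots and the predictions layout are assumptions for illustration.
import os

DATASET_ROOT = "/data/semantic_kitti/dataset"  # placeholder: extracted voxel data
SUBMISSION_ROOT = "/data/my_submission"        # placeholder: unzipped submission

for seq in ["11", "12", "13", "14", "15", "16", "17", "18", "19", "20", "21"]:
    voxel_dir = os.path.join(DATASET_ROOT, "sequences", seq, "voxels")
    pred_dir = os.path.join(SUBMISSION_ROOT, "sequences", seq, "predictions")

    # Compare frame indices of voxelized inputs against predicted labels.
    inputs = {f[:-len(".bin")] for f in os.listdir(voxel_dir) if f.endswith(".bin")}
    preds = {f[:-len(".label")] for f in os.listdir(pred_dir) if f.endswith(".label")}

    missing = sorted(inputs - preds)  # inputs without a prediction
    extra = sorted(preds - inputs)    # predictions without an input
    print(f"seq {seq}: {len(missing)} missing, {len(extra)} extra")
```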

I also noticed that the number of voxelized input files (.bin) differs between the old data version and the new V1.1 version. Previously, sequence 21 had 523 .bin files; now there are only 432. Is this reduction intentional?
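
For completeness, this is roughly how I compared the two versions (both roots are placeholders for wherever the respective zips were extracted):

```python
# Count voxelized input frames of sequence 21 in both data versions.
# The roots are placeholders, not actual download paths.
import glob
import os

for root in ["/data/voxels_old", "/data/voxels_v1.1"]:
    pattern = os.path.join(root, "sequences", "21", "voxels", "*.bin")
    print(f"{root}: {len(glob.glob(pattern))} .bin files in sequence 21")
```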

Thank you again for your support.

EDIT:
I just noticed that CodaLab lets me look at the error log, which lists the missing files (log is attached). All of them are in sequence 21, and none of those indices are actually present in the V1.1 voxel data that I just downloaded.
So I guess either frames have gone missing from the official download file, or the evaluation is still based on the old data.

risteon_submission_codalab_log.txt

@risteon risteon changed the title "Missing prediction files" error when submitting to completion benchmark on CodaLab Sequence 21 broken: "Missing prediction files" error when submitting to completion benchmark on CodaLab Sep 15, 2020
jbehley (Member) commented Sep 15, 2020

I will have a look; apparently something went wrong when merging the zip files. ... I can confirm that sequence 21 changed. I will now regenerate the test set zip so that it matches the data you already predicted on.

jbehley (Member) commented Sep 15, 2020

I have uploaded the cleaned data now and updated the competition data accordingly. Could you check whether your submission now works as expected?

Sorry for the trouble.

risteon (Contributor, Author) commented Sep 15, 2020

There seems to be another problem. The log now outputs:

```
WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
Traceback (most recent call last):
  File "/tmp/codalab/tmpxo6VIU/run/program/evaluate_completion.py", line 139, in <module>
    gt_file_list = [f for f in os.listdir(os.path.join(args.dataset, seq_dir_gt)) if f.endswith(".label")]
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/codalab/tmpxo6VIU/run/input/ref/sequences/11/voxels'
```
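
From the traceback, the evaluation apparently expects the ground truth under sequences/<seq>/voxels/. A quick sanity check for that layout (the zip path is just a placeholder) could look like:

```python
# Check that a ground-truth zip contains the sequences/<seq>/voxels/ layout
# the evaluation script seems to expect; the zip path is a placeholder.
import zipfile

with zipfile.ZipFile("/data/completion_gt.zip") as zf:
    names = zf.namelist()
    for seq in range(11, 22):
        prefix = f"sequences/{seq:02d}/voxels/"
        n = sum(1 for name in names if name.startswith(prefix) and name.endswith(".label"))
        print(f"{prefix}: {n} .label files")
```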

jbehley (Member) commented Sep 15, 2020

S****. Apparently I produced the wrong folder format. I will have to fix it later this evening. Sorry.

risteon (Contributor, Author) commented Sep 15, 2020

> I will now regenerate the test set zip so that it matches the data you already predicted on.

Don't worry about the predictions. If the original 523 frames of sequence 21 were intended to be used, it might be best to fix the voxel data download and the benchmark data together.

jbehley (Member) commented Sep 15, 2020

Finally, the evaluation seems to run through and the submission gets scored. Let me know if you still experience problems.

@risteon risteon closed this as completed Sep 16, 2020