First of all, thank you for sharing this code.

We were able to run the open-source code of the paper on the TS-WorldCup dataset, in particular the code for training the model. We then tried to mimic the dataset by providing our own images as input and ground-truth homography matrices as labels, and we managed to train the model with your method. However, when we ran inference with the resulting model, the inference code required ground-truth segmentation labels to compute the lookup list, and these labels do not exist in the TS-WorldCup dataset, so we could not mimic them.
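For context, our mimic setup looked roughly like this; the class name, field names, and normalization below are our own assumptions for illustration, not taken from your code:

```python
import cv2
import torch
from torch.utils.data import Dataset

class HomographyDataset(Dataset):
    """Hypothetical mimic of TS-WorldCup: (image, ground-truth homography) pairs."""

    def __init__(self, image_paths, homographies):
        self.image_paths = image_paths    # list of image file paths
        self.homographies = homographies  # list of 3x3 numpy float arrays

    def __len__(self):
        return len(self.image_paths)

    def __getitem__(self, idx):
        # Load the frame and convert to an RGB float tensor in [0, 1].
        img = cv2.imread(self.image_paths[idx])
        img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
        img = torch.from_numpy(img).permute(2, 0, 1).float() / 255.0
        # Ground-truth homography as the training label.
        H = torch.from_numpy(self.homographies[idx]).float()
        return img, H
```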
We would therefore appreciate your clarification on the following points:
How did you generate the ground-truth segmentation labels used for inference?
How exactly did you compute the ground-truth homography matrix? We used the coordinates of the four extreme field corners in the image and the reference image grid (see the sketch below).
Are there hardcoded constants specific to a football field, besides the length and height of the reference image grid?
Does the code assume that the grid's length is greater than its height (nx > ny)?
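Regarding the second question, here is a minimal sketch of how we computed the ground-truth homography from the four corner correspondences. The corner coordinates are placeholders, the template size nx x ny is our assumption, and we are not sure whether your convention maps image to grid or grid to image:

```python
import cv2
import numpy as np

# Placeholder pixel coordinates of the four extreme field corners in the image,
# ordered top-left, top-right, bottom-right, bottom-left. Real values come from
# our annotations.
img_corners = np.array([[102.0, 85.0],
                        [1180.0, 90.0],
                        [1265.0, 640.0],
                        [15.0, 655.0]], dtype=np.float32)

# Corresponding corners of the reference grid; nx and ny are our assumed
# template dimensions (length and height of the reference image grid).
nx, ny = 115, 74
grid_corners = np.array([[0.0, 0.0],
                         [nx, 0.0],
                         [nx, ny],
                         [0.0, ny]], dtype=np.float32)

# 3x3 homography mapping image coordinates to reference-grid coordinates.
H = cv2.getPerspectiveTransform(img_corners, grid_corners)
print(H)
```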
Best regards,