Hi. I'm very interested in your work, XFeat. However, I ran into some problems when trying to train the model. I built the MegaDepth training dataset according to the instructions in your repository, and the dataset structure is as follows:

When I run the training code, the dataset-loading step fails with the following error:
Traceback (most recent call last):
File "train.py", line 279, in <module>
trainer = Trainer(config_path = '../config.yaml')
File "train.py", line 39, in __init__
self.load_config(config_path)
File "train.py", line 82, in load_config
data = torch.utils.data.ConcatDataset( [MegaDepthDataset(root_dir = TRAINVAL_DATA_SOURCE,
File "/home/anaconda3/envs/tavins/lib/python3.8/site-packages/torch/utils/data/dataset.py", line 398, in __init__
assert len(self.datasets) > 0, 'datasets should not be an empty iterable' # type: ignore[arg-type]
AssertionError: datasets should not be an empty iterable
I think there must be some problem with my dataset structure. Could you help me with this, or give me an example of the expected dataset structure? Thank you very much.
Looking forward to your reply!
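For reference, the assertion comes from PyTorch's ConcatDataset, which rejects an empty dataset list. Judging from the traceback, train.py builds one MegaDepthDataset per scene-info index file, so the error means the glob over those files matched nothing. The sketch below reproduces the failure; the paths and folder names are assumptions for illustration, not the repository's exact layout:

```python
import glob

import torch.utils.data

# Hypothetical paths -- adjust to your local layout. The subfolder name
# is an assumption based on the repository instructions.
TRAINVAL_DATA_SOURCE = "/data/megadepth"
TRAIN_NPZ_ROOT = f"{TRAINVAL_DATA_SOURCE}/megadepth_indices/scene_info"

# train.py appears to build one MegaDepthDataset per scene-info file.
# If this glob matches nothing, the dataset list is empty and
# ConcatDataset raises the AssertionError shown in the traceback above.
npz_paths = glob.glob(f"{TRAIN_NPZ_ROOT}/*.npz")
print(f"found {len(npz_paths)} scene-info files")

if not npz_paths:
    # Reproduces: AssertionError: datasets should not be an empty iterable
    torch.utils.data.ConcatDataset([])
```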
The dataset structure seems correct; however, I noticed a typo, "megadepth_indeices," which could be causing an error. Were you able to resolve the issue?
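One quick way to spot the misspelled folder is to list the dataset root and compare it against the expected name. A minimal sketch, assuming the correct directory name is "megadepth_indices" and using a hypothetical root path:

```python
import os

# Hypothetical root -- replace with your TRAINVAL_DATA_SOURCE.
root = "/data/megadepth"

# Print what actually exists so a misspelled folder stands out.
for name in sorted(os.listdir(root)):
    print(name)

# A folder created as "megadepth_indeices" (note the extra "e") would
# leave the loader's glob with zero matches; check for the expected name.
expected = os.path.join(root, "megadepth_indices")
print("index dir present:", os.path.isdir(expected))
```

If the listing shows "megadepth_indeices", renaming that directory to "megadepth_indices" should let the glob find the .npz index files and give ConcatDataset a non-empty list.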