
Question about the MegaDepth dataset for training. #74

Open
LuoXubo opened this issue Oct 25, 2024 · 2 comments
LuoXubo commented Oct 25, 2024

Hi. I'm very interested in your work, XFeat. However, I ran into a problem when I tried to train the model. I prepared the MegaDepth training dataset following the instructions in your repository, and the dataset structure is as follows:

    ├── megadepth_root_path
    │   ├── train_data
    │   │   ├── megadepth_indeices
    │   │   │   ├── scene_info_0.1_0.7
    │   │   │   │   ├── 0000_0.1_0.3.npz
    │   │   │   │   ├── ...
    │   │   │   ├── ...
    │   ├── MegaDepth_v1
    │   │   ├── 000
    │   │   │   ├── dense0
    │   │   │   │   ├── depths
    │   │   │   │   ├── images
    │   │   │   ├── dense1
    │   │   ├── ...

The loading part of the dataset in the training code is as follows:

TRAIN_BASE_PATH = f"{config['megadepth_root_path']}/train_data/megadepth_indices"
TRAINVAL_DATA_SOURCE = f"{config['megadepth_root_path']}/MegaDepth_v1"

TRAIN_NPZ_ROOT = f"{TRAIN_BASE_PATH}/scene_info_0.1_0.7"

npz_paths = glob.glob(TRAIN_NPZ_ROOT + '/*.npz')[:]

data = torch.utils.data.ConcatDataset( [MegaDepthDataset(root_dir = TRAINVAL_DATA_SOURCE,
    npz_path = path) for path in tqdm.tqdm(npz_paths, desc="[MegaDepth] Loading metadata")] )

When I ran the training code, I got the following error:

Traceback (most recent call last):
  File "train.py", line 279, in <module>
    trainer = Trainer(config_path = '../config.yaml')
  File "train.py", line 39, in __init__
    self.load_config(config_path)
  File "train.py", line 82, in load_config
    data = torch.utils.data.ConcatDataset( [MegaDepthDataset(root_dir = TRAINVAL_DATA_SOURCE,
  File "/home/anaconda3/envs/tavins/lib/python3.8/site-packages/torch/utils/data/dataset.py", line 398, in __init__
    assert len(self.datasets) > 0, 'datasets should not be an empty iterable'  # type: ignore[arg-type]
AssertionError: datasets should not be an empty iterable

I think there must be some problem with the dataset structure. Could you help me with this, or could you give me an example of the correct dataset structure? Thank you very much.

Looking forward to your reply!

@guipotje
Copy link
Collaborator

Hi @LuoXubo,

Thank you for your interest in our work.

The dataset structure seems correct; however, I noticed a typo, "megadepth_indeices," which could be causing an error. Were you able to resolve the issue?
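A quick way to surface this kind of path mistake early is to validate the glob result before building the datasets, instead of letting `ConcatDataset` fail later with the opaque "datasets should not be an empty iterable" assertion. A minimal sketch (the helper name and error message are illustrative, not part of the XFeat code):

```python
import glob
import os


def find_npz_indices(npz_root: str) -> list:
    """Return sorted .npz index paths, failing fast if none are found."""
    paths = sorted(glob.glob(os.path.join(npz_root, "*.npz")))
    if not paths:
        # Without this check, torch.utils.data.ConcatDataset raises
        # "datasets should not be an empty iterable", which hides the
        # real cause: a wrong or misspelled directory name.
        raise FileNotFoundError(
            f"No .npz index files found under '{npz_root}'. "
            "Check the directory name (e.g. 'megadepth_indices', "
            "not 'megadepth_indeices')."
        )
    return paths
```

Calling `find_npz_indices(TRAIN_NPZ_ROOT)` before constructing the `MegaDepthDataset` list would have pointed directly at the empty directory.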

Kind regards.

LuoXubo commented Oct 28, 2024

Hi, thanks for your reply. I've fixed the problem :-)
