I can't figure out how to solve this problem. This kind of error is usually caused by the data processing, but I can't find the mistake. What can I do?
D:\annaconda3\envs\aimbot_env\python.exe J:/stereo-transformer-main/stereo-transformer/main.py
number of params in backbone: 1,050,800
number of params in transformer: 797,440
number of params in tokenizer: 503,728
number of params in regression: 161,843
0%| | 0/11195 [00:00<?, ?it/s]Start training
Epoch: 0
0%| | 0/11195 [00:03<?, ?it/s]
Traceback (most recent call last):
File "J:/stereo-transformer-main/stereo-transformer/main.py", line 263, in <module>
main(args_)
File "J:/stereo-transformer-main/stereo-transformer/main.py", line 234, in main
args.clip_max_norm, amp)
File "J:\stereo-transformer-main\stereo-transformer\utilities\train.py", line 30, in train_one_epoch
for idx, data in enumerate(tbar):
File "D:\annaconda3\envs\aimbot_env\lib\site-packages\tqdm\std.py", line 1180, in __iter__
for obj in iterable:
File "D:\annaconda3\envs\aimbot_env\lib\site-packages\torch\utils\data\dataloader.py", line 521, in __next__
data = self._next_data()
File "D:\annaconda3\envs\aimbot_env\lib\site-packages\torch\utils\data\dataloader.py", line 1203, in _next_data
return self._process_data(data)
File "D:\annaconda3\envs\aimbot_env\lib\site-packages\torch\utils\data\dataloader.py", line 1229, in _process_data
data.reraise()
File "D:\annaconda3\envs\aimbot_env\lib\site-packages\torch\_utils.py", line 434, in reraise
raise exception
RuntimeError: Caught RuntimeError in DataLoader worker process 0.
Original Traceback (most recent call last):
File "D:\annaconda3\envs\aimbot_env\lib\site-packages\torch\utils\data\_utils\worker.py", line 287, in _worker_loop
data = fetcher.fetch(index)
File "D:\annaconda3\envs\aimbot_env\lib\site-packages\torch\utils\data\_utils\fetch.py", line 52, in fetch
return self.collate_fn(data)
File "D:\annaconda3\envs\aimbot_env\lib\site-packages\torch\utils\data\_utils\collate.py", line 74, in default_collate
return {key: default_collate([d[key] for d in batch]) for key in elem}
File "D:\annaconda3\envs\aimbot_env\lib\site-packages\torch\utils\data\_utils\collate.py", line 74, in <dictcomp>
return {key: default_collate([d[key] for d in batch]) for key in elem}
File "D:\annaconda3\envs\aimbot_env\lib\site-packages\torch\utils\data\_utils\collate.py", line 64, in default_collate
return default_collate([torch.as_tensor(b) for b in batch])
File "D:\annaconda3\envs\aimbot_env\lib\site-packages\torch\utils\data\_utils\collate.py", line 56, in default_collate
return torch.stack(batch, 0, out=out)
RuntimeError: stack expects each tensor to be equal size, but got [531, 674, 3] at entry 0 and [380, 885, 3] at entry 1
Yes, I have met the same problem.
It is caused by the random crop augmentation in the data-loading process: each sample is cropped to a random size, so the default collate function cannot stack them into one batch.
If you want to use a batch size greater than 1, you can do the random crop in the forward_pass function rather than in __getitem__.
Even then, another problem may occur, because the original code assumes a batch size of 1.
You can see line 109 in the transform.py file.
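An alternative fix that keeps the augmentation inside the dataset is to crop every sample to one fixed size in __getitem__, so default_collate can stack them. Below is a minimal, hypothetical sketch (StereoDataset, the sizes, and the 256x512 crop are illustrative, not the repo's actual code) reproducing the two shapes from the traceback:

```python
import torch
from torch.utils.data import Dataset, DataLoader

class StereoDataset(Dataset):
    """Toy stand-in: samples have different image sizes, which is exactly
    what makes torch.stack (inside default_collate) fail for batch_size > 1."""
    def __init__(self, sizes, crop_hw=(256, 512)):
        self.sizes = sizes                  # list of (H, W) per sample
        self.crop_h, self.crop_w = crop_hw  # one fixed crop size for all samples

    def __len__(self):
        return len(self.sizes)

    def __getitem__(self, idx):
        h, w = self.sizes[idx]
        img = torch.rand(h, w, 3)           # placeholder for a loaded image
        # Random-crop to the fixed size so every sample stacks cleanly.
        top = torch.randint(0, h - self.crop_h + 1, (1,)).item()
        left = torch.randint(0, w - self.crop_w + 1, (1,)).item()
        return {'left': img[top:top + self.crop_h, left:left + self.crop_w]}

# The two sizes from the error message now batch without a collate error.
ds = StereoDataset([(531, 674), (380, 885)])
loader = DataLoader(ds, batch_size=2, num_workers=0)
batch = next(iter(loader))
print(batch['left'].shape)  # torch.Size([2, 256, 512, 3])
```

The crop position stays random per sample, so the augmentation is preserved; only the output size is fixed.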