Hello, thanks for your excellent work!
I met an error when I ran `CUDA_VISIBLE_DEVICES='0,1,2,3' python -m torch.distributed.launch --nproc_per_node 4 main.py --config config/release_iccv/latr_1000_baseline.py`.
The error message is:
```
Traceback (most recent call last):
  File "main.py", line 48, in <module>
    main()
  File "main.py", line 42, in main
    runner.train()
  File "/data/usr/LATR/experiments/runner.py", line 184, in train
    optimizer.step()
  File "/home/usr/miniconda3/envs/latr_/lib/python3.8/site-packages/torch/optim/lr_scheduler.py", line 65, in wrapper
    return wrapped(*args, **kwargs)
  File "/home/usr/miniconda3/envs/latr_/lib/python3.8/site-packages/torch/optim/optimizer.py", line 89, in wrapper
    return func(*args, **kwargs)
  File "/home/usr/miniconda3/envs/latr_/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/usr/miniconda3/envs/latr_/lib/python3.8/site-packages/torch/optim/adamw.py", line 117, in step
    beta1,
UnboundLocalError: local variable 'beta1' referenced before assignment
```
I then changed the config file, and after the change the error disappeared.
Why does the above error occur?
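My guess, after looking at torch/optim/adamw.py in my environment, is that `beta1, beta2 = group['betas']` only runs inside the loop over parameters that actually have gradients, so a param group in which every parameter still has `grad == None` reaches the `F.adamw(..., beta1, ...)` call with `beta1` never assigned. If that reading is correct, a minimal sketch like the following should reproduce the error outside LATR (the `frozen`/`trained` names are made up for illustration, and this assumes a PyTorch 1.8-era adamw.py):

```python
import torch

# Hypothetical minimal reproduction (assumes a PyTorch 1.8-era
# torch/optim/adamw.py, where the betas are unpacked only inside the
# per-parameter loop that skips parameters without gradients).
frozen = torch.nn.Parameter(torch.zeros(2))   # never enters the loss
trained = torch.nn.Parameter(torch.ones(2))

optimizer = torch.optim.AdamW([
    {'params': [frozen]},    # every parameter here keeps .grad == None
    {'params': [trained]},
])

trained.sum().backward()     # only `trained` receives a gradient
optimizer.step()             # on affected versions this raises
                             # UnboundLocalError for 'beta1'
```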
Will the changes I made affect performance?
Besides the above method, what else should I do to avoid this error?
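One idea I have, shown below, is to hand the optimizer only parameters that require gradients, so that no param group consists entirely of parameters that adamw.py skips; upgrading to a newer PyTorch release, where this code path was reworked, might also avoid it. This is just a sketch under that assumption, and `build_optimizer` with its defaults is hypothetical, not LATR's actual API:

```python
import torch

# Sketch of a possible workaround: build the optimizer only from
# trainable parameters so no param group is left without gradients.
# Note: a parameter that requires grad but never enters the loss would
# still keep .grad == None, so truly unused branches should be frozen
# (requires_grad = False) before this filter is applied.
def build_optimizer(model: torch.nn.Module,
                    lr: float = 2e-4,
                    weight_decay: float = 0.01) -> torch.optim.AdamW:
    trainable = [p for p in model.parameters() if p.requires_grad]
    return torch.optim.AdamW(trainable, lr=lr, weight_decay=weight_decay)
```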