I trained AltCLIP with the official demo code on CIFAR-10. However, after 3 epochs the finetuned weights have no effect on CIFAR-10 images, so I wonder whether the way I load the weights is wrong.
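One quick way to rule out a weight-loading problem is to load the checkpoint with `strict=False` and inspect which keys failed to match; if the finetuned tensors land under unexpected names, the model silently keeps its pretrained weights. A minimal sketch (the checkpoint path and helper name are assumptions, not from the demo code):

```python
import torch

def check_load(model, state_dict):
    # strict=False returns a NamedTuple instead of raising, so we can
    # see exactly which parameters were (not) applied to the model.
    result = model.load_state_dict(state_dict, strict=False)
    if result.missing_keys or result.unexpected_keys:
        print("missing keys:", result.missing_keys)
        print("unexpected keys:", result.unexpected_keys)
    return result

# Hypothetical usage — substitute your actual model and checkpoint:
# state = torch.load("checkpoints/altclip_cifar10.pt", map_location="cpu")
# check_load(model, state)
```

If both lists are empty, the weights loaded cleanly and the problem lies elsewhere (e.g. in the loss).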
I also finetuned on both my own dataset and CIFAR-10 with the official demo code, but in both cases the loss stays fixed at a constant value and the finetuned model produces the same output regardless of the input. So I suspect the finetuning code has a bug in the loss. Besides, I found an error in the official demo code: it passes the numeric labels such as 1 or 2 as text, instead of the class names such as "dog".
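Since CLIP-style models contrast images against *text*, the numeric CIFAR-10 labels need to be mapped to class-name prompts before tokenization. A minimal sketch of that mapping (the prompt template is an assumption; adapt it to whatever the demo's tokenizer expects):

```python
# CIFAR-10's official class names, in the order of its integer labels.
CIFAR10_CLASSES = [
    "airplane", "automobile", "bird", "cat", "deer",
    "dog", "frog", "horse", "ship", "truck",
]

def label_to_prompt(label: int) -> str:
    # Hypothetical template — feeding "5" to the text encoder carries
    # no semantics, while "a photo of a dog" does.
    return f"a photo of a {CIFAR10_CLASSES[label]}"
```

Training on the raw integers instead of prompts like these would plausibly explain a contrastive loss that never moves.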
[2024-01-23 05:42:53,494] [INFO] [logger.py:71:log_dist] [Rank -1] iteration 67350/ 198605 | elapsed time per iteration (ms): 434.3 | learning rate 8.804E-05 | loss 3.465734E+00 |
[2024-01-23 05:43:15,389] [INFO] [logger.py:71:log_dist] [Rank -1] iteration 67400/ 198605 | elapsed time per iteration (ms): 437.9 | learning rate 8.803E-05 | loss 3.465734E+00 |
[2024-01-23 05:43:37,527] [INFO] [logger.py:71:log_dist] [Rank -1] iteration 67450/ 198605 | elapsed time per iteration (ms): 442.8 | learning rate 8.802E-05 | loss 3.465734E+00 |
[2024-01-23 05:44:01,020] [INFO] [logger.py:71:log_dist] [Rank -1] iteration 67500/ 198605 | elapsed time per iteration (ms): 469.9 | learning rate 8.801E-05 | loss 3.465734E+00 |
[2024-01-23 05:44:21,606] [INFO] [logger.py:71:log_dist] [Rank -1] iteration 67550/ 198605 | elapsed time per iteration (ms): 411.7 | learning rate 8.799E-05 | loss 3.465734E+00 |
[2024-01-23 05:44:43,518] [INFO] [logger.py:71:log_dist] [Rank -1] iteration 67600/ 198605 | elapsed time per iteration (ms): 438.2 | learning rate 8.798E-05 | loss 3.465734E+00 |
[2024-01-23 05:45:05,781] [INFO] [logger.py:71:log_dist] [Rank -1] iteration 67650/ 198605 | elapsed time per iteration (ms): 445.3 | learning rate 8.797E-05 | loss 3.465734E+00 |
[2024-01-23 05:45:29,410] [INFO] [logger.py:71:log_dist] [Rank -1] iteration 67700/ 198605 | elapsed time per iteration (ms): 472.6 | learning rate 8.796E-05 | loss 3.465734E+00 |
[2024-01-23 05:45:53,111] [INFO] [logger.py:71:log_dist] [Rank -1] iteration 67750/ 198605 | elapsed time per iteration (ms): 474.0 | learning rate 8.794E-05 | loss 3.465734E+00 |
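The constant loss above is itself diagnostic: 3.465734 matches ln(32) to the logged precision, which is exactly what a symmetric CLIP/InfoNCE cross-entropy produces when the image–text similarity matrix is uniform, i.e. the embeddings (or the label text) carry no signal. This sketch assumes a per-GPU batch size of 32, which is not stated in the log:

```python
import math

import torch
import torch.nn.functional as F

batch = 32  # assumed batch size; ln(32) ≈ 3.465736
# Collapsed or uninformative embeddings make every pairwise similarity
# identical, so the softmax over the batch is uniform and the
# cross-entropy against the diagonal targets equals ln(batch).
logits = torch.zeros(batch, batch)
targets = torch.arange(batch)
loss = F.cross_entropy(logits, targets)
print(round(loss.item(), 6), round(math.log(batch), 6))
```

A loss pinned at ln(batch_size) for tens of thousands of iterations points to degenerate inputs to the loss (such as the numeric-label bug above) rather than to an optimizer or learning-rate problem.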
Description
I trained AltCLIP with the official demo code on CIFAR-10. However, after 3 epochs the finetuned weights have no effect on CIFAR-10 images: after finetuning, recognition of the animal pictures in CIFAR actually fails completely. I wonder whether the way I load the weights is wrong; the inference code is taken from the demo.
Alternatives
No response