About train.py, line 119 #23
Comments
Thanks for your attention! Yes, the global_outputs should be reversed to match the label maps. Sorry for the mistake.
It seems that we need to fine-tune the pre-trained model again.
@mingloo @GengDavid Why should global_outputs be reversed? I think the feature map at the lowest resolution should compute its loss against the ground-truth heatmap with the largest sigma (e.g. label15 paired with the lowest-resolution feature map).
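The point about sigma can be illustrated with a small sketch: a coarser feature map carries less spatial precision, so its ground-truth heatmap is rendered with a larger Gaussian sigma (the "label15" mentioned above). All sizes and sigma values here are illustrative, not the repository's actual settings.

```python
import numpy as np

def gaussian_heatmap(h, w, cy, cx, sigma):
    """Render a 2D Gaussian peak centered at (cy, cx) on an h x w grid."""
    ys, xs = np.mgrid[0:h, 0:w]
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2.0 * sigma ** 2))

# Hypothetical sigmas: a sharp label for the finest level,
# a broad label for the coarsest level.
sharp = gaussian_heatmap(64, 48, 32, 24, sigma=2.0)
broad = gaussian_heatmap(64, 48, 32, 24, sigma=7.0)

# The broader kernel spreads probability mass over many more pixels,
# giving softer supervision where the prediction is less precise.
assert (broad > 0.5).sum() > (sharp > 0.5).sum()
```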
Hi @moontsar @GengDavid @gag1223 I'll check this issue and report back here.
The input size is 256x192 and the backbone is ResNet-50. I reversed the global_outputs so they match the targets; the other settings are the same as in the original code, and I got the following results. I need three days to train the models. (results table at epoch 22)
The input size is 256x192 and the backbone is ResNet-50. I reversed the global_outputs so they match the targets; the other settings are the same as in the original code, and I got the following results with ground-truth labels on the COCO2017 val set. (training log with epoch, LR, and train loss through epoch 29) I have uploaded the trained models to https://pan.baidu.com/s/1w4prqCMV2AjORks2AvCR3w
How exactly should they be reversed?
for global_output, label in zip(global_outputs, targets):
    num_points = global_output.size(1)
    # zero out the labels of invalid keypoints before computing the loss
    global_label = label * (valid > 1.1).float().view(-1, num_points, 1, 1)
    global_loss = criterion1(global_output,
                             global_label.cuda(non_blocking=True)) / 2.0
    loss += global_loss
    global_loss_record += global_loss.item()
Is there an error in the code above? Should global_outputs be reversed?