
about the loss function #6

Open
LiMinghui19 opened this issue Nov 23, 2021 · 1 comment
@LiMinghui19

Hi, thanks for your contribution.
In your paper you state that you use the L2 loss as your loss function. However, in your code it seems like you used the L1 loss, am I right?

Train.py (line 25):
mse = nn.L1Loss().cuda()

So, were the results in your paper obtained with the L1 or the L2 loss?

@zzr-idam
Owner

In the paper we use the L2 loss function as the target, but the effect of the L1 loss is also good.
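
For reference, a minimal sketch of swapping the criterion between the two losses (assuming a standard PyTorch setup; the flag use_l2 and the tensor shapes are illustrative assumptions, not taken from the repository's Train.py):

import torch
import torch.nn as nn

# Minimal sketch, not the repository's actual training loop: switching between
# the L2 (MSE) loss described in the paper and the L1 loss found in Train.py.
device = "cuda" if torch.cuda.is_available() else "cpu"

use_l2 = True  # True matches the paper's setting; False matches the code in Train.py
criterion = (nn.MSELoss() if use_l2 else nn.L1Loss()).to(device)

# Dummy prediction/target pair just to exercise the criterion.
pred = torch.rand(1, 3, 256, 256, device=device)
target = torch.rand(1, 3, 256, 256, device=device)

loss = criterion(pred, target)
print(loss.item())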
