
how to understand the loss function corner case for 0 and 255 #23

Closed
weixsong opened this issue Dec 14, 2017 · 3 comments
@weixsong

Hi,

In the loss function, the code gives special treatment to the edge values 0 and 255:

    # log probability for edge case of 0 (before scaling)
    log_cdf_plus = plus_in - tf.nn.softplus(plus_in)
    # log probability for edge case of 255 (before scaling)
    log_one_minus_cdf_min = -tf.nn.softplus(min_in)

I don't understand this code. How should I interpret it?

@npuichigo

I think you should know the relationship between sigmoid and softplus:

    log(sigmoid(x))     = x - softplus(x)
    log(1 - sigmoid(x)) = -softplus(x)
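For concreteness, here is a minimal numerical check of these identities (a sketch, not from the original thread; it assumes TensorFlow 2 and uses illustrative variable names):

    import tensorflow as tf

    x = tf.constant([-5.0, -1.0, 0.0, 1.0, 5.0])

    # Numerically stable forms, as used in the loss:
    log_sigmoid = x - tf.nn.softplus(x)         # log(sigmoid(x))
    log_one_minus_sigmoid = -tf.nn.softplus(x)  # log(1 - sigmoid(x))

    # The naive forms lose precision for large |x|, which is why the
    # loss prefers the softplus forms; the values should match here.
    print(log_sigmoid.numpy())
    print(tf.math.log(tf.sigmoid(x)).numpy())
    print(log_one_minus_sigmoid.numpy())
    print(tf.math.log(1.0 - tf.sigmoid(x)).numpy())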

@weixsong
Author

@npuichigo, thanks very much. I got it.

@chediBechikh

Hi all,
The explanation here is still not clear to me. Which relation, exactly, is used for this line?

    # log probability for edge case of 0 (before scaling)
    log_cdf_plus = plus_in - tf.nn.softplus(plus_in)
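For reference, a sketch of where that relation comes from, assuming the usual PixelCNN++ parameterization (pixel values scaled to [-1, 1], bin half-width 1/255, plus_in = inv_stdv * (centered_x + 1/255), min_in = inv_stdv * (centered_x - 1/255)): for pixel value 0 the bin covers everything below -1 + 1/255, so its probability mass is cdf_plus = sigmoid(plus_in), and by the identity above log(cdf_plus) = plus_in - softplus(plus_in); for pixel value 255 the mass is 1 - sigmoid(min_in), giving -softplus(min_in).

    import tensorflow as tf

    def edge_log_probs(x, mean, log_scale):
        # Sketch of the 0/255 edge cases of the discretized logistic
        # likelihood; assumes x is already scaled to [-1, 1].
        centered_x = x - mean
        inv_stdv = tf.exp(-log_scale)
        plus_in = inv_stdv * (centered_x + 1.0 / 255.0)
        min_in = inv_stdv * (centered_x - 1.0 / 255.0)
        # Pixel 0: mass of (-inf, -1 + 1/255] is sigmoid(plus_in), so
        # log prob = log(sigmoid(plus_in)) = plus_in - softplus(plus_in).
        log_cdf_plus = plus_in - tf.nn.softplus(plus_in)
        # Pixel 255: mass of [1 - 1/255, inf) is 1 - sigmoid(min_in), so
        # log prob = log(1 - sigmoid(min_in)) = -softplus(min_in).
        log_one_minus_cdf_min = -tf.nn.softplus(min_in)
        return log_cdf_plus, log_one_minus_cdf_min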
