add elu and selu activations #263

Open
wants to merge 3 commits into master

Conversation

CarloLucibello
Collaborator

No description provided.

@CarloLucibello
Collaborator Author

this should be ready for merge after review

@cangumeli
Collaborator

Shouldn't elu and selu be primitives if we add them to Knet?

@denizyuret
Owner

denizyuret commented Feb 14, 2018 via email

@cangumeli
Collaborator

If elu and selu are going to be applied after each weight layer like the other activations, defining them as composite operations may add performance and memory overhead in training. I think they should be implemented the same way sigm, relu, etc. are implemented.
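For reference, this is roughly the composite definition under review, reconstructed from the diff excerpt and the constants quoted later in this thread (the exact PR code may differ slightly). Each sub-operation here (relu, exp, the arithmetic) is recorded as its own op, producing its own intermediates and backward steps:

function selu(x)
    alpha = Float32(1.6732632)   # constants from the review comment below
    scale = Float32(1.0507009)
    p = relu(x)                  # positive part
    m = -relu(-x)                # negative part, clipped at zero
    return scale*(p + alpha*(exp(m) - 1))
end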

CarloLucibello mentioned this pull request Feb 19, 2018
src/unary.jl Outdated
p = relu(x)
m = -relu(-x)
return scale*(p + alpha*(exp(m) - 1))
end
Collaborator

How about using elu here?

function selu(x)
    alpha = Float32(1.6732632)
    scale = Float32(1.0507009)
    return scale * elu(x, alpha)
end 

@rened

rened commented Mar 23, 2018

Elu can be included in unary.jl by adding
("elu", "elu", "(xi>0?xi:exp(xi)-1)"),
and
(:elu, :eluback, :(ifelse(xi>0,xi,exp(xi)-1)), :(ifelse(yi>0,dyi,yi+1))),
below the respective relu lines.

@@ -32,6 +32,7 @@ broadcast_ops = [
# "fdim",
("invxback","invxback","(-xi*yi*yi)"),
("reluback","reluback","(yi>0?xi:0)"),
("eluback", "eluback", "ifelse(yi>0,dyi,yi+1)"),
Collaborator

This part is turned into a CUDA code snippet, so I think it must be replaced with the following:

("eluback", "eluback", "(yi>0?xi:yi+1)")

Collaborator Author

I don't understand the comment. ifelse(yi>0,dyi,yi+1) is valid Julia code and should be the right derivative.

Collaborator

When I tried to build the package, it threw the following errors:

cuda01.cu(395): error: identifier "dyi" is undefined
cuda01.cu(395): error: identifier "ifelse" is undefined
cuda01.cu(408): error: identifier "dyi" is undefined
cuda01.cu(408): error: identifier "ifelse" is undefined

I am able to build the package with the ("eluback", "eluback", "(yi>0?xi:yi+1)") code snippet.
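Illustration only, not Knet's actual generator code: the third element of each broadcast_ops tuple appears to be spliced verbatim into generated CUDA source as a C expression over xi (the incoming gradient) and yi (the saved forward output), roughly as sketched here.

op = ("eluback", "eluback", "(yi>0?xi:yi+1)")
println("z[i] = $(op[3]);")   # => z[i] = (yi>0?xi:yi+1);  -- valid C inside the kernel
# With the ifelse form, nvcc sees "ifelse" and "dyi", neither of which exists
# in the generated kernel, hence the undefined-identifier errors above.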

@denizyuret
Owner

denizyuret commented Apr 17, 2018 via email

@denizyuret
Owner

Manually merged elu after fixing the faulty gradient: for negative values the derivative should be dyi*(yi+1).
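A sketch of the corrected broadcast_ops entry, following the reluback convention above where xi carries the incoming gradient and yi the saved forward output (an assumption about the merged form; check master for the exact line):

# For x<0 the forward output is yi = exp(x)-1, so d(elu)/dx = exp(x) = yi+1
# and the propagated gradient is xi*(yi+1), not yi+1 alone.
("eluback", "eluback", "(yi>0?xi:xi*(yi+1))"),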

@denizyuret
Owner

Added selu as a CUDA kernel for efficiency.
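For reference, a hypothetical pair of fused entries using the constants quoted earlier in this review (scale ≈ 1.0507009, alpha ≈ 1.6732632); the kernels actually merged into master may differ:

# forward: selu(x) = scale*x for x>0, scale*alpha*(exp(x)-1) otherwise
("selu", "selu", "(xi>0?1.0507009*xi:1.0507009*1.6732632*(exp(xi)-1))"),
# backward: the derivative is scale for x>0; for x<0 it is scale*alpha*exp(x),
# which equals yi + scale*alpha in terms of the saved output yi
("seluback", "seluback", "(yi>0?1.0507009*xi:xi*(yi+1.0507009*1.6732632))"),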

@denizyuret
Owner

@CarloLucibello can you explain how this gives the intended result in alpha_dropout:
x = q*dropout(x .- alpha, p) .+ alpha #set dropped input to alpha
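A toy scalar sketch of the dropped-entry case in the line above (illustration only; toydrop is a stand-in, assuming dropout zeroes an element with probability p and rescales the kept ones by 1/(1-p)):

toydrop(z, p, dropped::Bool) = dropped ? zero(z) : z / (1 - p)

# dropped element:  q*toydrop(x - alpha, p, true)  + alpha == q*0 + alpha == alpha
# kept element:     q*toydrop(x - alpha, p, false) + alpha == q*(x - alpha)/(1 - p) + alpha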

6 participants