Add PReLU layer #54

Closed
zaleslaw opened this issue May 18, 2021 · 4 comments · Fixed by #85
Comments

@zaleslaw
Collaborator

We are missing some activation layers needed to fully support exporting models from Keras. One of them is the PReLU layer.

Add an activation layer class, write documentation and a test for it, and, if possible, create a small trainable network with it (in your own GitHub repository) and attach a link here in the comments.

The layer should be placed here

As a reference implementation, the ReLU activation layer could be used, but feel free to improve it!

Also, support for exporting and importing the activation layer in JSON format should be added (see ModelLoader.kt and ModelSaver.kt).

A detailed description of the activation layer can be found here
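
For reference, PReLU computes f(x) = x when x > 0 and f(x) = alpha * x otherwise, where alpha is a learnable weight (per input feature in Keras by default). A minimal, framework-independent Kotlin sketch of that forward pass is below; the class name and fields are illustrative only, not KotlinDL's actual layer API:

```kotlin
// Illustrative sketch only: plain Kotlin, independent of KotlinDL's Layer base class
// and TensorFlow ops, showing the PReLU computation with a learnable alpha per feature.
class PReLUSketch(numFeatures: Int) {
    // One learnable slope per feature; Keras initializes alpha to zeros by default.
    val alpha = FloatArray(numFeatures)

    // PReLU(x) = x           if x > 0
    //          = alpha * x   otherwise
    fun forward(input: FloatArray): FloatArray =
        FloatArray(input.size) { i -> if (input[i] > 0f) input[i] else alpha[i] * input[i] }
}

fun main() {
    val layer = PReLUSketch(numFeatures = 3)
    layer.alpha.fill(0.25f)
    // Negative inputs are scaled by alpha; positive inputs pass through unchanged.
    println(layer.forward(floatArrayOf(-2f, 0.5f, -1f)).toList())  // [-0.5, 0.5, -0.25]
}
```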

@zaleslaw zaleslaw added the good first issue Good for newcomers label May 18, 2021
@zaleslaw zaleslaw added this to the 0.3 milestone May 18, 2021
@mkaze
Contributor

mkaze commented Jun 1, 2021

I'll take this one and work on it.

@mkaze mkaze mentioned this issue Jun 1, 2021
@therealansh
Contributor

I think PReLU requires regularization of the weights to compute the alpha, and regularization is yet to be implemented (#83), I guess?

@mkaze
Contributor

mkaze commented Jun 1, 2021

@therealansh A regularizer is not necessarily used: you can optionally provide one, but in the default case the layer works without any regularization or constraint on the weights. Nonetheless, it should be added at some point for the sake of completeness and compatibility, so I have left a TODO comment for that in my implementation.
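
For illustration only (this is not KotlinDL's actual signature, and the Regularizer interface below is hypothetical), the optional-regularizer case can be expressed as a nullable constructor parameter that defaults to null, so the layer works with no regularization unless one is explicitly supplied:

```kotlin
// Hypothetical shapes, used only to illustrate the optional-regularizer point above.
fun interface Regularizer {
    fun penalty(weights: FloatArray): Float
}

class PReLUWithOptionalRegularizer(
    numFeatures: Int,
    private val alphaRegularizer: Regularizer? = null  // optional, mirrors Keras' alpha_regularizer
) {
    val alpha = FloatArray(numFeatures)  // learnable slopes, zero-initialized

    // Contributes to the training loss only when a regularizer was supplied.
    fun regularizationLoss(): Float = alphaRegularizer?.penalty(alpha) ?: 0f
}
```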

@zaleslaw
Collaborator Author

zaleslaw commented Jun 1, 2021

Good point about regularizers, @therealansh, but they are not supported yet; that will be a separate issue later (I'll suggest adding an issue about it).

So, I agree with @mkaze that we can work with a simple PReLU version for now (adding a TODO with a link to the newly created issue and #83 could be a good solution).
