
Smooth ReLU in TensorFlow

Unofficial TensorFlow reimplementation of the Smooth ReLU (SmeLU) activation function proposed in the paper Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations by Gil I. Shamir and Dong Lin.

This repository includes an easy-to-use pure TensorFlow implementation of the Smooth ReLU.
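
For reference, SmeLU replaces the hard kink of the ReLU at x = 0 with a quadratic transition of half-width beta. The definition below matches the example outputs shown later in this README (e.g. smelu(0., beta=0.1) = 0.025):

\mathrm{SmeLU}_\beta(x) =
\begin{cases}
0, & x \le -\beta, \\
\dfrac{(x + \beta)^2}{4\beta}, & -\beta \le x \le \beta, \\
x, & x \ge \beta.
\end{cases}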

Installation

The package can be installed with pip:

pip install tf-smelu

Example Usage

import tensorflow as tf
from tf_smelu import smelu

x = tf.range(-6, 6, 1, dtype=float)  # <tf.Tensor: numpy=array([-6., -5., -4., -3., -2., -1.,  0.,  1.,  2.,  3.,  4.,  5.], dtype=float32)>

smelu(x, beta=0.1)  # <tf.Tensor: numpy=array([0.,0.,0.,0.,0.,0.,0.025,1.,2.,3.,4.,5.], dtype=float32)>
smelu(x, beta=0.5)  # <tf.Tensor: numpy=array([0.,0.,0.,0.,0.,0.,0.125,1.,2.,3.,4.,5.], dtype=float32)>
smelu(x, beta=1.)   # <tf.Tensor: numpy=array([0.,0.,0.,0.,0.,0.,0.25 ,1.,2.,3.,4.,5.], dtype=float32)>
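
A pure TensorFlow version of the function fits in a few lines. The sketch below follows the piecewise definition above; the name smelu_reference is illustrative and not necessarily the exact code shipped in this package:

import tensorflow as tf

def smelu_reference(x, beta=1.0):
    # 0 below -beta, identity above beta,
    # quadratic (x + beta)^2 / (4 * beta) on the transition region.
    x = tf.convert_to_tensor(x)
    beta = tf.cast(beta, x.dtype)
    quadratic = tf.square(x + beta) / (4.0 * beta)
    return tf.where(x <= -beta, tf.zeros_like(x),
                    tf.where(x >= beta, x, quadratic))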

In addition to the input tensor, smelu takes the following parameter.

  • beta: Half-width of a symmetric transition region around x = 0. Defaults to 1.
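
Because smelu operates elementwise on tensors, it can also be used as a Keras activation. A minimal sketch, assuming a standard tf.keras model (the layer sizes and beta value are illustrative only):

import tensorflow as tf
from tf_smelu import smelu

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation=lambda t: smelu(t, beta=0.5)),  # SmeLU activation
    tf.keras.layers.Dense(10),
])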

Reference

@article{Shamir2022,
        title={{Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations}},
        author={Shamir, Gil I and Lin, Dong},
        journal={{arXiv preprint arXiv:2202.06499}},
        year={2022}
}
