Compute the Leaky ReLU activation function.
```python
tf.nn.leaky_relu(
    features, alpha=0.2, name=None
)
```
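A minimal usage sketch: Leaky ReLU passes non-negative inputs through unchanged and scales negative inputs by `alpha`, i.e. `f(x) = x` for `x >= 0` and `f(x) = alpha * x` otherwise. The tensor values below are illustrative, not taken from the original page.

```python
import tensorflow as tf

# Example pre-activation values, including negatives.
features = tf.constant([-2.0, -0.5, 0.0, 1.0, 3.0])

# Default slope alpha=0.2 for negative inputs.
default_out = tf.nn.leaky_relu(features)
# -> [-0.4, -0.1, 0.0, 1.0, 3.0]

# A smaller alpha keeps negative activations closer to zero.
small_slope = tf.nn.leaky_relu(features, alpha=0.01)
# -> [-0.02, -0.005, 0.0, 1.0, 3.0]
```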
| Returns |
|---|
| The activation value. |
| References |
|---|
| Rectifier Nonlinearities Improve Neural Network Acoustic Models: Maas et al., 2013 (pdf) |