Compute the Leaky ReLU activation function.
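Element-wise, this computes max(alpha * x, x): inputs x >= 0 pass through unchanged, while negative inputs are scaled by alpha (for alpha < 1).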
tf.nn.leaky_relu(
    features, alpha=0.2, name=None
)
Args
  features: A Tensor representing preactivation values. Must be one of the following types: float16, float32, float64, int32, int64.
  alpha: Slope of the activation function at x < 0.
  name: A name for the operation (optional).
Returns
  The activation value.
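A minimal usage sketch (assuming a TensorFlow 1.x environment with eager execution enabled; under graph mode, evaluate the result in a tf.Session instead):

  import tensorflow as tf

  # Negative inputs are scaled by alpha; non-negative inputs pass through.
  x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])
  y = tf.nn.leaky_relu(x, alpha=0.2)
  print(y)  # [-0.4, -0.2,  0.,  1.,  2.]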