Mish activation function.
```python
tfm.utils.activations.mish(
    x
) -> tf.Tensor
```
Reference: [Mish: A Self Regularized Non-Monotonic Activation Function](https://arxiv.org/pdf/1908.08681.pdf)

`Mish(x) = x * tanh(ln(1 + e^x))`
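For illustration, the formula above can be sketched with stock TensorFlow ops (`tf.math.softplus` computes `ln(1 + e^x)`); this is a minimal reference sketch, not the library's actual source:

```python
import tensorflow as tf


def mish_reference(x: tf.Tensor) -> tf.Tensor:
    # Mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^x))
    return x * tf.math.tanh(tf.math.softplus(x))
```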
| Args | |
|---|---|
| `x` | A `Tensor` representing preactivation values. |
| Returns | |
|---|---|
| The activation value. | |
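A short usage sketch, assuming the TensorFlow Model Garden package is imported under its usual `tfm` alias:

```python
import tensorflow as tf
import tensorflow_models as tfm

# Apply Mish to a batch of preactivation values.
x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])
y = tfm.utils.activations.mish(x)

# Negative inputs are smoothly damped; positive inputs pass through nearly unchanged.
print(y)
```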