Built-in activation functions.
deserialize(...)
: Returns an activation function given a string identifier.
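For example, recovering relu from its string identifier (example input and printed values are illustrative):

    import tensorflow as tf

    fn = tf.keras.activations.deserialize('relu')
    print(fn(tf.constant([-2.0, 3.0])).numpy())  # [0. 3.]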
elu(...)
: Exponential Linear Unit.
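A quick sketch; ELU is x for x > 0 and alpha * (exp(x) - 1) for x <= 0, with alpha defaulting to 1.0 (printed values rounded):

    import tensorflow as tf

    x = tf.constant([-1.0, 0.0, 2.0])
    print(tf.keras.activations.elu(x).numpy())  # [-0.632  0.  2.]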
exponential(...)
: Exponential activation function.
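For example (printed values rounded):

    import tensorflow as tf

    x = tf.constant([0.0, 1.0])
    print(tf.keras.activations.exponential(x).numpy())  # [1.  2.718]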
gelu(...)
: Applies the Gaussian error linear unit (GELU) activation function.
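A short sketch; exact GELU is x * P(X <= x) for X ~ N(0, 1), and passing approximate=True selects the tanh approximation (printed values rounded):

    import tensorflow as tf

    x = tf.constant([-1.0, 0.0, 1.0])
    print(tf.keras.activations.gelu(x).numpy())  # [-0.159  0.  0.841]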
get(...)
: Returns an activation function given a string identifier, a callable, or None.
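A sketch of the lookup behavior; per the TF 2.4 docs, None resolves to the linear activation:

    import tensorflow as tf

    print(tf.keras.activations.get('tanh') is tf.keras.activations.tanh)  # True
    print(tf.keras.activations.get(None) is tf.keras.activations.linear)  # True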
hard_sigmoid(...)
: Hard sigmoid activation function.
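A sketch; hard sigmoid is the piecewise-linear approximation clip(0.2 * x + 0.5, 0, 1):

    import tensorflow as tf

    x = tf.constant([-3.0, 0.0, 3.0])
    print(tf.keras.activations.hard_sigmoid(x).numpy())  # [0.  0.5 1.]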
linear(...)
: Linear activation function (pass-through).
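For example, the input is returned unchanged:

    import tensorflow as tf

    x = tf.constant([-1.0, 2.0])
    print(tf.keras.activations.linear(x).numpy())  # [-1.  2.]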
relu(...)
: Applies the rectified linear unit activation function.
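A quick sketch of the default behavior and the optional alpha (leaky slope for negative inputs) and max_value (saturation ceiling) arguments:

    import tensorflow as tf

    x = tf.constant([-10.0, -1.0, 0.0, 5.0, 12.0])
    print(tf.keras.activations.relu(x).numpy())
    # [ 0.  0.  0.  5. 12.]
    print(tf.keras.activations.relu(x, alpha=0.1, max_value=10.0).numpy())
    # [-1.  -0.1  0.   5.  10.]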
selu(...)
: Scaled Exponential Linear Unit (SELU).
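A sketch; SELU is scale * elu(x, alpha) with fixed constants (scale ~ 1.0507, alpha ~ 1.6733), and the docs recommend pairing it with the lecun_normal initializer (printed values rounded):

    import tensorflow as tf

    x = tf.constant([-1.0, 0.0, 1.0])
    print(tf.keras.activations.selu(x).numpy())  # [-1.111  0.  1.051]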
serialize(...)
: Returns the string identifier of an activation function.
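For example, a round trip with deserialize:

    import tensorflow as tf

    name = tf.keras.activations.serialize(tf.keras.activations.relu)
    print(name)  # 'relu'
    print(tf.keras.activations.deserialize(name) is tf.keras.activations.relu)  # True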
sigmoid(...)
: Sigmoid activation function, sigmoid(x) = 1 / (1 + exp(-x)).
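For example; the tails saturate toward 0 and 1 (printed values rounded):

    import tensorflow as tf

    x = tf.constant([-20.0, 0.0, 20.0])
    print(tf.keras.activations.sigmoid(x).numpy())  # approx. [0.  0.5  1.]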
softmax(...)
: Softmax converts a real vector to a vector of categorical probabilities.
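A sketch on a batch of logits; softmax is applied along the last axis by default, and each row of probabilities sums to one (printed values rounded):

    import tensorflow as tf

    logits = tf.constant([[1.0, 2.0, 3.0]])
    probs = tf.keras.activations.softmax(logits)
    print(probs.numpy())                          # [[0.090 0.245 0.665]]
    print(tf.reduce_sum(probs, axis=-1).numpy())  # [1.]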
softplus(...)
: Softplus activation function, softplus(x) = log(exp(x) + 1).
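For example (printed values rounded):

    import tensorflow as tf

    x = tf.constant([-1.0, 0.0, 1.0])
    print(tf.keras.activations.softplus(x).numpy())  # [0.313 0.693 1.313]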
softsign(...)
: Softsign activation function, softsign(x) = x / (abs(x) + 1).
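For example:

    import tensorflow as tf

    x = tf.constant([-1.0, 0.0, 1.0])
    print(tf.keras.activations.softsign(x).numpy())  # [-0.5  0.  0.5]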
swish(...)
: Swish activation function, swish(x) = x * sigmoid(x).
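For example (printed values rounded):

    import tensorflow as tf

    x = tf.constant([-1.0, 0.0, 1.0])
    print(tf.keras.activations.swish(x).numpy())  # [-0.269  0.  0.731]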
tanh(...)
: Hyperbolic tangent activation function.
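For example (printed values rounded):

    import tensorflow as tf

    x = tf.constant([-3.0, 0.0, 3.0])
    print(tf.keras.activations.tanh(x).numpy())  # [-0.995  0.  0.995]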