
TensorFlow 1.15

Module: tf.keras.activations

Built-in activation functions.

Functions

deserialize(...): Returns the activation function corresponding to a serialized name or configuration.

elu(...): Exponential linear unit.

exponential(...): Exponential activation function.

get(...): Returns the activation function identified by a string name, a callable, or None.

hard_sigmoid(...): Hard sigmoid activation function.

linear(...): Linear activation function.

relu(...): Rectified Linear Unit.

selu(...): Scaled Exponential Linear Unit (SELU).

serialize(...): Returns the string name of an activation function.

sigmoid(...): Sigmoid activation function.

softmax(...): The softmax activation function transforms the outputs so that all values are in the range (0, 1) and sum to 1.

softplus(...): Softplus activation function.

softsign(...): Softsign activation function.

tanh(...): Hyperbolic Tangent (tanh) activation function.
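The functions above can be called directly on tensors. A minimal usage sketch, assuming TensorFlow 1.15 with eager execution enabled explicitly (it is off by default in TF 1.x); the input values are illustrative:

import tensorflow as tf

tf.enable_eager_execution()  # TF 1.x evaluates tensors lazily unless this is set

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

# Element-wise activations.
print(tf.keras.activations.relu(x).numpy())     # [0. 0. 0. 1. 2.]
print(tf.keras.activations.sigmoid(x).numpy())  # values in (0, 1)
print(tf.keras.activations.tanh(x).numpy())     # values in (-1, 1)

# softmax expects at least a 2-D (batch, features) input and normalizes
# along the last axis, so each row sums to 1.
logits = tf.constant([[1.0, 2.0, 3.0]])
print(tf.keras.activations.softmax(logits).numpy())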
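get, serialize, and deserialize convert between string names and activation callables. A minimal round-trip sketch, assuming the usual TF 1.15 behavior that serialize returns the function's registered name:

import tensorflow as tf

relu_fn = tf.keras.activations.get('relu')        # name -> callable
name = tf.keras.activations.serialize(relu_fn)    # callable -> 'relu'
same_fn = tf.keras.activations.deserialize(name)  # name -> callable again

assert name == 'relu'
assert tf.keras.activations.serialize(same_fn) == 'relu'

These string names are also what Keras layers accept for their activation argument, e.g. tf.keras.layers.Dense(10, activation='relu').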

© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/keras/activations