Create a regularizer that applies both L1 and L2 penalties.
```python
tf.keras.regularizers.l1_l2(l1=0.01, l2=0.01)
```
The L1 regularization penalty is computed as:

`loss = l1 * reduce_sum(abs(x))`

The L2 regularization penalty is computed as:

`loss = l2 * reduce_sum(square(x))`
| Args | |
|---|---|
| `l1` | Float; L1 regularization factor. |
| `l2` | Float; L2 regularization factor. |
| Returns | |
|---|---|
| | An L1L2 Regularizer with the given regularization factors. |
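The combined penalty is the sum of the two terms above. A minimal sketch of how the returned regularizer behaves, using illustrative factor values and an example weight tensor (both chosen here, not from the original page):

```python
import tensorflow as tf

# Create the combined L1+L2 regularizer.
reg = tf.keras.regularizers.l1_l2(l1=0.01, l2=0.01)

# Applying the regularizer to a tensor returns the scalar penalty:
# l1 * reduce_sum(abs(x)) + l2 * reduce_sum(square(x)).
x = tf.constant([[1.0, -2.0], [3.0, -4.0]])
penalty = reg(x)
# L1 term: 0.01 * (1 + 2 + 3 + 4) = 0.1
# L2 term: 0.01 * (1 + 4 + 9 + 16) = 0.3
print(float(penalty))  # 0.4 (approximately)

# Typical use: attach it to a layer's weights via kernel_regularizer,
# so the penalty is added to the model's loss during training.
layer = tf.keras.layers.Dense(4, kernel_regularizer=reg)
```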
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.