Create a regularizer that applies both L1 and L2 penalties.
tf.keras.regularizers.l1_l2(
    l1=0.01, l2=0.01
)
The L1 regularization penalty is computed as: `loss = l1 * reduce_sum(abs(x))`
The L2 regularization penalty is computed as: `loss = l2 * reduce_sum(square(x))`
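As a quick illustration of the formulas above, the following sketch (values are illustrative, not from the original reference) applies the combined regularizer to a small tensor and compares the result with the documented sum of the L1 and L2 terms:

```python
import tensorflow as tf

# Regularizer with the default factors; the tensor values below are arbitrary.
reg = tf.keras.regularizers.l1_l2(l1=0.01, l2=0.01)
x = tf.constant([[1.0, -2.0], [3.0, -4.0]])

penalty = reg(x)  # l1 * reduce_sum(abs(x)) + l2 * reduce_sum(square(x))
manual = 0.01 * tf.reduce_sum(tf.abs(x)) + 0.01 * tf.reduce_sum(tf.square(x))

print(float(penalty))  # 0.4
print(float(manual))   # 0.4
```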
| Args | |
|---|---|
| l1 | Float; L1 regularization factor. | 
| l2 | Float; L2 regularization factor. | 
| Returns |
|---|
| An `L1L2` Regularizer with the given regularization factors. |
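In practice, the returned regularizer is typically attached to a layer's weights. The sketch below is a minimal usage example (the layer and input sizes are assumptions for illustration, not part of the original reference):

```python
import tensorflow as tf

# Attach the combined L1 + L2 penalty to a Dense layer's kernel.
layer = tf.keras.layers.Dense(
    units=16,
    kernel_regularizer=tf.keras.regularizers.l1_l2(l1=0.01, l2=0.01),
)

# Calling the layer builds its weights; the penalty then appears in layer.losses
# and is added to the model's total loss during training.
_ = layer(tf.ones((1, 8)))
print(layer.losses)  # list containing the combined L1 + L2 penalty tensor
```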