Computes the categorical crossentropy loss.
```python
tf.keras.metrics.categorical_crossentropy(
    y_true, y_pred, from_logits=False, label_smoothing=0.0, axis=-1
)
```
  y_true = [[0, 1, 0], [0, 0, 1]] y_pred = [[0.05, 0.95, 0], [0.1, 0.8, 0.1]] loss = tf.keras.losses.categorical_crossentropy(y_true, y_pred) assert loss.shape == (2,) loss.numpy() array([0.0513, 2.303], dtype=float32)
| Args | |
|---|---|
| y_true | Tensor of one-hot true targets. | 
| y_pred | Tensor of predicted targets. | 
| from_logits | Whether `y_pred` is expected to be a logits tensor. By default, we assume that `y_pred` encodes a probability distribution. | 
| label_smoothing | Float in `[0, 1]`. If `> 0`, smooth the labels. For example, if `0.1`, use `0.1 / num_classes` for non-target labels and `0.9 + 0.1 / num_classes` for target labels. | 
| axis | Defaults to -1. The dimension along which the entropy is computed. | 
| Returns | |
|---|---|
| Categorical crossentropy loss value. | 
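The `label_smoothing` rule described in the table above can be verified by smoothing the labels by hand and comparing; a minimal sketch (the tensor values are illustrative):

```python
import tensorflow as tf

y_true = tf.constant([[0.0, 1.0, 0.0]])
y_pred = tf.constant([[0.05, 0.9, 0.05]])
smoothing = 0.1
num_classes = 3

# Built-in smoothing via the label_smoothing argument.
loss_builtin = tf.keras.metrics.categorical_crossentropy(
    y_true, y_pred, label_smoothing=smoothing)

# Manual smoothing: non-target labels become smoothing / num_classes,
# and the target label becomes (1 - smoothing) + smoothing / num_classes.
y_smooth = y_true * (1.0 - smoothing) + smoothing / num_classes
loss_manual = tf.keras.metrics.categorical_crossentropy(y_smooth, y_pred)
```

Both losses should agree, since the built-in option simply applies the same affine transformation to `y_true` before computing the crossentropy.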
© 2022 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 4.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.9/api_docs/python/tf/keras/metrics/categorical_crossentropy