categorical_crossentropy
Compute categorical crossentropy.

Note that if any of the y_pred values are exactly 0, this will result in a NaN output. If from_logits is False, then each entry of y_pred should sum to 1. If the entries do not sum to 1, the TensorFlow and PyTorch backends will produce different numerical results.
This method can be used with TensorFlow tensors:
```python
import tensorflow as tf
import fastestimator as fe

true = tf.constant([[0, 1, 0], [1, 0, 0], [0, 0, 1]])
pred = tf.constant([[0.1, 0.8, 0.1], [0.9, 0.05, 0.05], [0.1, 0.2, 0.7]])
weights = tf.lookup.StaticHashTable(
    tf.lookup.KeyValueTensorInitializer(tf.constant([1, 2]), tf.constant([2.0, 3.0])),
    default_value=1.0)
b = fe.backend.categorical_crossentropy(y_pred=pred, y_true=true)  # 0.228
b = fe.backend.categorical_crossentropy(y_pred=pred, y_true=true, average_loss=False)  # [0.223, 0.105, 0.356]
b = fe.backend.categorical_crossentropy(y_pred=pred, y_true=true, average_loss=False, class_weights=weights)
# [0.446, 0.105, 1.068]
```
This method can be used with PyTorch tensors:
```python
import torch
import fastestimator as fe

true = torch.tensor([[0, 1, 0], [1, 0, 0], [0, 0, 1]])
pred = torch.tensor([[0.1, 0.8, 0.1], [0.9, 0.05, 0.05], [0.1, 0.2, 0.7]])
weights = {1: 2.0, 2: 3.0}
b = fe.backend.categorical_crossentropy(y_pred=pred, y_true=true)  # 0.228
b = fe.backend.categorical_crossentropy(y_pred=pred, y_true=true, average_loss=False)  # [0.223, 0.105, 0.356]
b = fe.backend.categorical_crossentropy(y_pred=pred, y_true=true, average_loss=False, class_weights=weights)
# [0.446, 0.105, 1.068]
```
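The from_logits flag can also be used when y_pred holds unnormalized scores rather than probabilities. A minimal sketch, assuming hypothetical logits whose softmax recovers the probabilities used above (so the output should roughly match the earlier result):

```python
import tensorflow as tf
import fastestimator as fe

true = tf.constant([[0, 1, 0], [1, 0, 0], [0, 0, 1]])
# Hypothetical logits: softmax(log(p)) recovers p, so the loss should be close to the value above.
logits = tf.math.log(tf.constant([[0.1, 0.8, 0.1], [0.9, 0.05, 0.05], [0.1, 0.2, 0.7]]))
b = fe.backend.categorical_crossentropy(y_pred=logits, y_true=true, from_logits=True)  # ~0.228
```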
Parameters:

Name | Type | Description | Default |
---|---|---|---|
y_pred | Tensor | Prediction with a shape like (Batch, ..., C) for TensorFlow and (Batch, C, ...) for PyTorch. dtype: float32 or float16. | required |
y_true | Tensor | Ground truth class labels with a shape like y_pred. | required |
from_logits | bool | Whether y_pred is from logits. If True, a softmax will be applied to the prediction. | False |
average_loss | bool | Whether to average the element-wise loss. | True |
class_weights | Optional[Weight_Dict] | Mapping of class indices to a weight for weighting the loss function. Useful when you need to pay more attention to samples from an under-represented class. | None |
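To see how class_weights relates to the unweighted per-element losses, a quick sanity check using the numbers from the examples above (plain NumPy, not part of the fe.backend API):

```python
import numpy as np

per_element = np.array([0.223, 0.105, 0.356])  # average_loss=False result from above
true_classes = np.array([1, 0, 2])             # argmax of each one-hot y_true row
weights = {1: 2.0, 2: 3.0}                     # classes not listed default to 1.0

scale = np.array([weights.get(c, 1.0) for c in true_classes])
print(per_element * scale)  # ~[0.446, 0.105, 1.068]
```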
Returns:

Type | Description |
---|---|
Tensor | The categorical crossentropy between y_pred and y_true. A scalar if average_loss is True, otherwise a tensor with the shape (Batch). |
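As a sanity check on average_loss, the scalar result in the examples above is simply the mean of the per-element losses:

```python
import numpy as np

print(np.mean([0.223, 0.105, 0.356]))  # ~0.228, the average_loss=True result above
```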
Raises:

Type | Description |
---|---|
AssertionError | If y_true or y_pred are unacceptable data types. |