cross_entropy
CrossEntropy
Bases: LossOp
Calculate Element-Wise CrossEntropy (binary, categorical or sparse categorical).
Parameters:

Name | Type | Description | Default
---|---|---|---
inputs | Union[Tuple[str, str], List[str]] | A tuple or list like: [y_pred, y_true]. | required
outputs | str | String key under which to store the computed loss value. | required
mode | Union[None, str, Iterable[str]] | What mode(s) to execute this Op in. For example, "train", "eval", "test", or "infer". To execute regardless of mode, pass None. To execute in all modes except for a particular one, you can pass an argument like "!infer" or "!train". | '!infer'
ds_id | Union[None, str, Iterable[str]] | What dataset id(s) to execute this Op in. To execute regardless of ds_id, pass None. To execute in all ds_ids except for a particular one, you can pass an argument like "!ds1". | None
from_logits | bool | Whether y_pred is logits (without softmax). | False
average_loss | bool | Whether to average the element-wise loss after the Loss Op. | True
form | Optional[str] | What form of cross entropy should be performed ('binary', 'categorical', 'sparse', or None). None will automatically infer the correct form based on tensor shape: if both y_pred and y_true are rank-2 tensors then 'categorical' will be used; if y_pred is a rank-2 tensor but y_true is a rank-1 tensor then 'sparse' will be used; otherwise 'binary' will be used. | None
class_weights | Optional[Dict[int, float]] | Dictionary mapping class indices to a weight for weighting the loss function. Useful when you need to pay more attention to samples from an under-represented class. | None
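The automatic `form` inference described above can be illustrated with a minimal NumPy sketch. Note this is an illustrative re-statement of the rule from the `form` description, not the library's actual implementation; the function name `infer_form` is hypothetical.

```python
import numpy as np

def infer_form(y_pred, y_true):
    # Hypothetical illustration of the documented inference rule.
    # Both rank-2 (one-hot targets) -> 'categorical'.
    if y_pred.ndim == 2 and y_true.ndim == 2:
        return "categorical"
    # Rank-2 predictions with rank-1 integer labels -> 'sparse'.
    if y_pred.ndim == 2 and y_true.ndim == 1:
        return "sparse"
    # Otherwise -> 'binary'.
    return "binary"

print(infer_form(np.ones((4, 3)), np.ones((4, 3))))  # categorical
print(infer_form(np.ones((4, 3)), np.ones(4)))       # sparse
print(infer_form(np.ones(4), np.ones(4)))            # binary
```

In practice this means you rarely need to pass `form` explicitly; supplying it is mainly useful to override the inference for ambiguously shaped tensors.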
Raises:

Type | Description
---|---
AssertionError | If class_weights, or its keys and values, are of unacceptable data types.