mindspore.ops.cross_entropy
- mindspore.ops.cross_entropy(input, target, weight=None, ignore_index=-100, reduction='mean', label_smoothing=0.0)[source]
The cross entropy loss between input and target.
The cross entropy supports two kinds of targets:
- Class indices (int) in the range $[0, C)$, where $C$ is the number of classes. The loss with reduction='none' can be described as (a NumPy sketch of this case follows the list):

  $$\ell(x, y) = L = \{l_1, \dots, l_N\}^\top, \quad l_n = -w_{y_n} \log \frac{\exp(x_{n, y_n})}{\sum_{c=1}^{C} \exp(x_{n, c})} \cdot \mathbb{1}\{y_n \neq \text{ignore\_index}\}$$

  where $x$ is the input, $y$ is the target, $w$ is the weight, $N$ is the batch size, and $c \in [0, C-1]$ is the class index, with $C$ the number of classes. If reduction is not 'none' (default 'mean'), then

  $$\ell(x, y) = \begin{cases} \sum_{n=1}^{N} \dfrac{1}{\sum_{n=1}^{N} w_{y_n} \cdot \mathbb{1}\{y_n \neq \text{ignore\_index}\}} \, l_n, & \text{if reduction} = \text{'mean'}, \\[1ex] \sum_{n=1}^{N} l_n, & \text{if reduction} = \text{'sum'}. \end{cases}$$

- Probabilities (float) for each class, useful when labels beyond a single class per minibatch item are required. The loss with reduction='none' can be described as:

  $$\ell(x, y) = L = \{l_1, \dots, l_N\}^\top, \quad l_n = -\sum_{c=1}^{C} w_c \log \frac{\exp(x_{n, c})}{\sum_{i=1}^{C} \exp(x_{n, i})} \, y_{n, c}$$

  where $x$ is the input, $y$ is the target, $w$ is the weight, $N$ is the batch size, and $c \in [0, C-1]$ is the class index, with $C$ the number of classes. If reduction is not 'none' (default 'mean'), then

  $$\ell(x, y) = \begin{cases} \dfrac{\sum_{n=1}^{N} l_n}{N}, & \text{if reduction} = \text{'mean'}, \\[1ex] \sum_{n=1}^{N} l_n, & \text{if reduction} = \text{'sum'}. \end{cases}$$
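As a concrete illustration of the class-index formula, the following plain NumPy sketch evaluates $l_n$ and both reductions under simplifying assumptions (unit weights, no ignored targets, label_smoothing=0.0); it is an illustration only, not the operator's actual implementation:

>>> import numpy as np
>>> x = np.random.randn(3, 5)                                   # inputs, N=3, C=5
>>> y = np.array([1, 0, 4])                                     # class-index targets
>>> softmax = np.exp(x) / np.exp(x).sum(axis=1, keepdims=True)  # softmax over classes
>>> l = -np.log(softmax[np.arange(3), y])                       # l_n, i.e. reduction='none'
>>> loss_mean = l.sum() / 3                                     # reduction='mean' (all w_{y_n} = 1)
>>> loss_sum = l.sum()                                          # reduction='sum'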
- Parameters
- input (Tensor) – Tensor of shape $(N)$ or $(N, C)$ where C = number of classes, $(N, C, H, W)$ in the case of 2D loss, or $(N, C, d_1, d_2, ..., d_K)$. input is expected to be log-probabilities, data type must be float16 or float32.
- target (Tensor) – For class indices, tensor of shape $(N)$ or $(N, d_1, d_2, ..., d_K)$, data type must be int32. For probabilities, tensor of shape $(N, C)$ or $(N, C, d_1, d_2, ..., d_K)$, data type must be float16 or float32.
- weight (Tensor) – A rescaling weight applied to the loss of each batch element. If not None, the shape is $(C,)$, data type must be float16 or float32. Default: None.
- ignore_index (int) – Specifies a target value that is ignored and does not contribute to the input gradient. Default: -100.
- reduction (str) – Apply a specific reduction method to the output: 'none', 'mean', or 'sum'. Default: 'mean'.
- label_smoothing (float) – Label smoothing value, a regularization tool used to prevent the model from overfitting when calculating the loss. The value range is [0.0, 1.0]. Default: 0.0.
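For reference, the optional parameters can be combined as in the short sketch below; the tensor values are arbitrary and chosen only to illustrate weight, ignore_index, reduction, and label_smoothing together:

>>> import numpy as np
>>> import mindspore
>>> from mindspore import ops
>>> logits = mindspore.Tensor(np.random.randn(4, 3), mindspore.float32)
>>> # The last target equals ignore_index (-100), so that sample is excluded.
>>> target = mindspore.Tensor(np.array([0, 2, 1, -100]), mindspore.int32)
>>> class_weight = mindspore.Tensor(np.array([1.0, 2.0, 0.5]), mindspore.float32)  # shape (C,)
>>> loss = ops.cross_entropy(logits, target, weight=class_weight,
...                          ignore_index=-100, reduction='none', label_smoothing=0.1)
>>> # loss holds one value per sample; the ignored sample contributes 0.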
- Returns
Tensor, the computed loss value.
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import ops
>>> # Case 1: Indices labels
>>> inputs = mindspore.Tensor(np.random.randn(3, 5), mindspore.float32)
>>> target = mindspore.Tensor(np.array([1, 0, 4]), mindspore.int32)
>>> output = ops.cross_entropy(inputs, target)
>>> # Case 2: Probability labels
>>> inputs = mindspore.Tensor(np.random.randn(3, 5), mindspore.float32)
>>> target = mindspore.Tensor(np.random.randn(3, 5), mindspore.float32)
>>> output = ops.cross_entropy(inputs, target)