mindspore.ops.SoftmaxCrossEntropyWithLogits

class mindspore.ops.SoftmaxCrossEntropyWithLogits

Computes the softmax cross-entropy value between logits and one-hot encoded labels.

The formulas of the SoftmaxCrossEntropyWithLogits algorithm are as follows:

p_{ij} = \text{softmax}(X_{ij}) = \frac{\exp(x_i)}{\sum_{j=0}^{N-1} \exp(x_j)}

loss_{ij} = -\sum_j Y_{ij} \ln(p_{ij})

where X represents logits, Y represents labels, and loss represents the output.
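For reference, these formulas can be reproduced with plain NumPy. The sketch below is only an illustration of the math; the helper name softmax_cross_entropy is made up for this example and does not reflect how the operator is implemented internally.

>>> import numpy as np
>>> def softmax_cross_entropy(logits, labels):
...     # Row-wise softmax: p_ij = exp(x_ij) / sum_j exp(x_ij)
...     # (the row max is subtracted first for numerical stability; the result is unchanged)
...     shifted = logits - logits.max(axis=1, keepdims=True)
...     p = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
...     # Per-sample loss: loss_i = -sum_j Y_ij * ln(p_ij)
...     return -(labels * np.log(p)).sum(axis=1), p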

Inputs:
  • logits (Tensor) - Input logits, with shape (N,C). Data type must be float16 or float32.

  • labels (Tensor) - Ground truth labels, with shape (N,C) and the same data type as logits.

Outputs:

Tuple of 2 tensors (loss, dlogits). The loss has shape (N,), and dlogits has the same shape as logits.

Raises:
  • TypeError – If dtype of logits or labels is neither float16 nor float32.

  • TypeError – If logits or labels is not a Tensor.

  • ValueError – If the shape of logits is not the same as the shape of labels.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> from mindspore import Tensor, ops
>>> logits = Tensor([[2, 4, 1, 4, 5], [2, 1, 2, 4, 3]], mindspore.float32)
>>> labels = Tensor([[0, 0, 0, 0, 1], [0, 0, 0, 1, 0]], mindspore.float32)
>>> softmax_cross = ops.SoftmaxCrossEntropyWithLogits()
>>> loss, dlogits = softmax_cross(logits, labels)
>>> print(loss)
[0.5899297  0.52374405]
>>> print(dlogits)
[[ 0.02760027  0.20393994  0.01015357  0.20393994 -0.44563377]
 [ 0.08015892  0.02948882  0.08015892 -0.4077012   0.21789455]]
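As a sanity check, the printed values can be reproduced with NumPy under the assumption, inferred from the formula and the printed output above rather than stated explicitly here, that dlogits equals the softmax probabilities minus the one-hot labels:

>>> import numpy as np
>>> x = np.array([[2, 4, 1, 4, 5], [2, 1, 2, 4, 3]], dtype=np.float32)
>>> y = np.array([[0, 0, 0, 0, 1], [0, 0, 0, 1, 0]], dtype=np.float32)
>>> p = np.exp(x) / np.exp(x).sum(axis=1, keepdims=True)  # row-wise softmax
>>> print(np.allclose(-(y * np.log(p)).sum(axis=1), [0.5899297, 0.52374405], atol=1e-5))
True
>>> print(np.allclose(p - y, [[0.02760027, 0.20393994, 0.01015357, 0.20393994, -0.44563377],
...                           [0.08015892, 0.02948882, 0.08015892, -0.4077012, 0.21789455]], atol=1e-5))
True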