mindspore.nn.LossBase

class mindspore.nn.LossBase(reduction='mean')

Base class for other losses.

Loss functions derived from this class should implement their own construct method and call self.get_loss to apply the specified reduction to the loss values, as sketched below.

Parameters

reduction (str) – Type of reduction to apply to the loss. The optional values are "mean", "sum", and "none". Default: "mean".

Raises

ValueError – If reduction is not one of "none", "mean", or "sum".

Supported Platforms:

Ascend GPU CPU
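
The sketch below is illustrative only and not part of the official examples: the class name MAESketch is a placeholder introduced here, and the printed values are the expected results of a mean absolute error over the element-wise errors [0, 0, 1]. It shows how a derived loss combines construct, self.get_loss, and the reduction argument.

>>> import mindspore
>>> from mindspore import ops, Tensor, nn
>>> import numpy as np
>>>
>>> class MAESketch(nn.LossBase):
...     def __init__(self, reduction='mean'):
...         super(MAESketch, self).__init__(reduction)
...         self.abs = ops.Abs()
...
...     def construct(self, logits, labels):
...         # Element-wise absolute error; get_loss applies the reduction.
...         x = self.abs(logits - labels)
...         return self.get_loss(x)
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([1, 2, 2]), mindspore.float32)
>>> # |logits - labels| = [0, 0, 1]
>>> print(MAESketch('mean')(logits, labels))
0.33333334
>>> print(MAESketch('sum')(logits, labels))
1.0
>>> print(MAESketch('none')(logits, labels))
[0. 0. 1.]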

get_axis(x)

Get the axes of the input as a tuple covering all of its dimensions.

Parameters

x (Tensor) – Tensor of any shape.

Examples

>>> import mindspore
>>> from mindspore import ops, Tensor, nn
>>> import numpy as np
>>>
>>> class Net(nn.LossBase):
...     def __init__(self, reduction='mean'):
...         super(Net, self).__init__(reduction)
...         self.abs = ops.Abs()
...
...     def construct(self, logits, labels):
...         x = self.abs(logits - labels)
...         axis = self.get_axis(x)
...         return axis
>>> net = Net()
>>> # Case 1: logits.shape = labels.shape = (3,)
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> output = net(logits, labels)
>>> print(output)
(0,)
>>> # Case 2: logits.shape = labels.shape = (3, 3)
>>> logits = Tensor(np.array([[1, 2, 3],[1, 2, 3],[1, 2, 3]]), mindspore.float32)
>>> labels = Tensor(np.array([[1, 2, 3],[1, 2, 3],[1, 2, 3]]), mindspore.float32)
>>> output = net(logits, labels)
>>> print(output)
(0, 1)
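
For illustration only (reusing the imports from the example above): the tuple returned by get_axis covers every dimension of its input, so passing it to a reduce operator collapses the tensor to a scalar, which is how get_loss applies its reduction over all dimensions.

>>> reduce_sum = ops.ReduceSum()
>>> x = Tensor(np.array([[1, 2, 3], [1, 2, 3], [1, 2, 3]]), mindspore.float32)
>>> # get_axis would return (0, 1) for this 2-D tensor (see Case 2 above)
>>> print(reduce_sum(x, (0, 1)))
18.0
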
get_loss(x, weights=1.0)

Computes the weighted loss.

Parameters
  • x (Tensor) – Tensor of shape (N, *), where * means any number of additional dimensions.

  • weights (Union[float, Tensor]) – Optional Tensor whose rank is either 0 or the same as that of x, and which must be broadcastable to x (i.e., all dimensions must be either 1 or equal to the corresponding dimension of x). Default: 1.0. A sketch using an explicit weights tensor follows the examples below.

Returns

Tensor, the weighted loss with the specified reduction applied.

Examples

>>> import mindspore
>>> from mindspore import ops, Tensor, nn
>>> import numpy as np
>>>
>>> class Net(nn.LossBase):
...     def __init__(self, reduction='mean'):
...         super(Net, self).__init__(reduction)
...         self.abs = ops.Abs()
...
...     def construct(self, logits, labels):
...         x = self.abs(logits - labels)
...         output = self.get_loss(x)
...         return output
>>> net = Net()
>>> # Case 1: logits.shape = labels.shape = (3,)
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([1, 2, 2]), mindspore.float32)
>>> output = net(logits, labels)
>>> print(output)
0.33333334
>>> # Case 2: logits.shape = labels.shape = (3, 3)
>>> logits = Tensor(np.array([[1, 2, 3],[1, 2, 3],[1, 2, 3]]), mindspore.float32)
>>> labels = Tensor(np.array([[1, 2, 2],[1, 2, 3],[1, 2, 3]]), mindspore.float32)
>>> output = net(logits, labels)
>>> print(output)
0.11111111
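
The following sketch is illustrative only; the class name WeightedMAESketch is a placeholder introduced here, and the printed value is the expected result. It shows the weights argument being passed through to get_loss, which multiplies each element of x by the broadcast weights before applying the reduction.

>>> import mindspore
>>> from mindspore import ops, Tensor, nn
>>> import numpy as np
>>>
>>> class WeightedMAESketch(nn.LossBase):
...     def __init__(self, reduction='mean'):
...         super(WeightedMAESketch, self).__init__(reduction)
...         self.abs = ops.Abs()
...
...     def construct(self, logits, labels, weights):
...         x = self.abs(logits - labels)
...         # weights is broadcast against x inside get_loss before reducing
...         return self.get_loss(x, weights)
>>> net = WeightedMAESketch()
>>> logits = Tensor(np.array([1, 2, 3]), mindspore.float32)
>>> labels = Tensor(np.array([1, 2, 2]), mindspore.float32)
>>> weights = Tensor(np.array([1, 1, 0]), mindspore.float32)
>>> # |logits - labels| = [0, 0, 1]; weighted -> [0, 0, 0]; mean -> 0.0
>>> print(net(logits, labels, weights))
0.0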