mindspore.nn.SampledSoftmaxLoss
- class mindspore.nn.SampledSoftmaxLoss(num_sampled, num_classes, num_true=1, sampled_values=None, remove_accidental_hits=True, seed=0, reduction="none")[source]
Computes the sampled softmax training loss. This operator can accelerate the training of a softmax classifier over a large number of classes by evaluating the softmax over only the label classes and a sampled subset of the remaining classes. It is generally an underestimate of the full softmax loss.
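To illustrate the idea, the following is a minimal NumPy sketch of sampled softmax for a single true label per example. It omits the operator's expected-count correction and accidental-hit removal, so it shows only the sampling idea, not the exact computation performed by this class; all names in it are hypothetical.

import numpy as np

def sampled_softmax_sketch(weights, biases, labels, inputs, num_sampled, num_classes, rng):
    # weights: (C, dim), biases: (C,), labels: (N,), inputs: (N, dim)
    sampled = rng.choice(num_classes, size=num_sampled, replace=False)   # shared negative candidates
    losses = np.empty(labels.shape[0])
    for i in range(labels.shape[0]):
        classes = np.concatenate(([labels[i]], sampled))                 # true class first, then samples
        scores = weights[classes] @ inputs[i] + biases[classes]          # logits restricted to these classes
        scores -= scores.max()                                           # numerical stability
        log_probs = scores - np.log(np.exp(scores).sum())                # softmax over the restricted set
        losses[i] = -log_probs[0]                                        # cross entropy w.r.t. the true class
    return losses

# e.g. sampled_softmax_sketch(np.ones((7, 10)), np.zeros(7), np.array([0, 1, 2]), np.ones((3, 10)), 4, 7, np.random.default_rng(1))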
- Parameters
num_sampled (int) – The number of classes to randomly sample per batch.
num_classes (int) – The number of possible classes.
num_true (int) – The number of label classes per training example. Default: 1.
sampled_values (Union[list, tuple]) – List or tuple of (sampled_candidates, true_expected_count, sampled_expected_count) returned by a *CandidateSampler function. Default: None, in which case UniformCandidateSampler is applied (see the sketch after this parameter list).
remove_accidental_hits (bool) – Whether to remove “accidental hits”, where a sampled class equals one of the label classes. Default: True.
seed (int) – Random seed for candidate sampling. Default: 0.
reduction (str) – Type of reduction to be applied to the loss. The optional values are “mean”, “sum”, and “none”. If “none”, no reduction is performed. Default: “none”.
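As a hedged sketch of supplying sampled_values explicitly, the snippet below assumes mindspore.ops.UniformCandidateSampler with the (num_true, num_sampled, unique, range_max, seed) interface and reuses the 7-class setup from the Examples; verify the sampler's signature against your MindSpore version.

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn, ops
>>> sampler = ops.UniformCandidateSampler(num_true=1, num_sampled=4, unique=True, range_max=7, seed=1)
>>> true_classes = Tensor(np.array([[0], [1], [2]], dtype=np.int64))
>>> sampled_values = sampler(true_classes)  # (sampled_candidates, true_expected_count, sampled_expected_count)
>>> loss = nn.SampledSoftmaxLoss(num_sampled=4, num_classes=7, num_true=1, sampled_values=sampled_values)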
- Inputs:
weights (Tensor) - Tensor of shape \((C, dim)\). The class weights.
bias (Tensor) - Tensor of shape \((C,)\). The class biases.
labels (Tensor) - Tensor of shape \((N, num\_true)\), of type int64 or int32. The label classes.
logits (Tensor) - Tensor of shape \((N, dim)\). The forward activations of the input network.
- Outputs:
Tensor or Scalar. If reduction is ‘none’, the output is a tensor of shape \((N,)\); otherwise, the output is a scalar (see the additional sketch after the Examples).
- Raises
TypeError – If sampled_values is not a list or tuple.
TypeError – If dtype of labels is neither int32 nor int64.
ValueError – If reduction is not one of ‘none’, ‘mean’, ‘sum’.
ValueError – If num_sampled or num_true is greater than num_classes.
ValueError – If length of sampled_values is not equal to 3.
- Supported Platforms:
GPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> mindspore.set_seed(1)
>>> loss = nn.SampledSoftmaxLoss(num_sampled=4, num_classes=7, num_true=1)
>>> weights = Tensor(np.random.randint(0, 9, [7, 10]), mindspore.float32)
>>> biases = Tensor(np.random.randint(0, 9, [7]), mindspore.float32)
>>> labels = Tensor([0, 1, 2])
>>> logits = Tensor(np.random.randint(0, 9, [3, 10]), mindspore.float32)
>>> output = loss(weights, biases, labels, logits)
>>> print(output)
[4.6051701e+01 1.4000047e+01 6.1989022e-06]
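The following additional sketch reuses weights, biases, labels and logits from the example above and shows that a non-‘none’ reduction collapses the per-example losses into a scalar; no output is printed here.

>>> loss_mean = nn.SampledSoftmaxLoss(num_sampled=4, num_classes=7, num_true=1, reduction="mean")
>>> output_mean = loss_mean(weights, biases, labels, logits)
>>> # output_mean is a scalar Tensor: the mean of the per-example losses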