mindspore.nn.CosineEmbeddingLoss
- class mindspore.nn.CosineEmbeddingLoss(margin=0.0, reduction='mean')
Creates a criterion that measures the similarity between two tensors using cosine distance.
Given two tensors x1 and x2, and a label Tensor y whose values are 1 or -1, the loss is defined as:
\[loss(x_1, x_2, y) = \begin{cases} 1 - \cos(x_1, x_2), & \text{if } y = 1 \\ \max(0, \cos(x_1, x_2) - margin), & \text{if } y = -1 \end{cases}\]
- Parameters
margin (float) - Should be in [-1.0, 1.0]. Default: 0.0.
reduction (str) - Specifies the reduction to be applied to the output. It must be one of 'none', 'mean' or 'sum'. Default: 'mean'.
- Inputs:
logits_x1 (Tensor) - Input tensor.
logits_x2 (Tensor) - Tensor with the same shape and data type as logits_x1.
labels (Tensor) - Contains value 1 or -1. Suppose the shape of logits_x1 is \((x_1, x_2, x_3,..., x_R)\), then the shape of labels must be \((x_1, x_3, x_4, ..., x_R)\).
- Outputs:
loss (Tensor) - If reduction is "none", its shape is the same as that of labels; otherwise, a scalar value is returned.
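As an illustration of this shape rule, a short sketch reusing the inputs from the Examples section below with reduction='none' (only documented API is used; the printed shape follows from the rule above):
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> logits_x1 = Tensor(np.array([[0.3, 0.8], [0.4, 0.3]]), mindspore.float32)
>>> logits_x2 = Tensor(np.array([[0.4, 1.2], [-0.4, -0.9]]), mindspore.float32)
>>> labels = Tensor(np.array([1, -1]), mindspore.int32)
>>> loss_none = nn.CosineEmbeddingLoss(reduction='none')
>>> print(loss_none(logits_x1, logits_x2, labels).shape)  # same shape as labels
(2,)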
- Raises
TypeError – If margin is not a float.
ValueError – If reduction is not one of ‘none’, ‘mean’, ‘sum’.
ValueError – If margin is not in range [-1, 1].
- Supported Platforms:
Ascend
GPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> logits_x1 = Tensor(np.array([[0.3, 0.8], [0.4, 0.3]]), mindspore.float32)
>>> logits_x2 = Tensor(np.array([[0.4, 1.2], [-0.4, -0.9]]), mindspore.float32)
>>> labels = Tensor(np.array([1, -1]), mindspore.int32)
>>> cosine_embedding_loss = nn.CosineEmbeddingLoss()
>>> output = cosine_embedding_loss(logits_x1, logits_x2, labels)
>>> print(output)
0.0003426075
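For reference, the printed value can be reproduced directly from the formula above. The following is a minimal NumPy sketch (reference_loss is a hypothetical helper, not part of the MindSpore API); it computes in float64, so it agrees with the float32 output above only up to rounding:
>>> import numpy as np
>>> def reference_loss(x1, x2, y, margin=0.0):
...     # Cosine similarity along the feature axis (axis 1)
...     cos = (x1 * x2).sum(axis=1) / (np.linalg.norm(x1, axis=1) * np.linalg.norm(x2, axis=1))
...     # Piecewise definition from the formula above, then reduction='mean'
...     return np.where(y == 1, 1 - cos, np.maximum(0.0, cos - margin)).mean()
...
>>> x1 = np.array([[0.3, 0.8], [0.4, 0.3]])
>>> x2 = np.array([[0.4, 1.2], [-0.4, -0.9]])
>>> y = np.array([1, -1])
>>> print(round(float(reference_loss(x1, x2, y)), 7))
0.0003426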