mindspore.ops.RNNTLoss
- class mindspore.ops.RNNTLoss(blank_label=0)[source]
Computes the RNNTLoss and its gradient with respect to the softmax outputs.
- Parameters
blank_label (int) – The index of the blank label. Default: 0.
- Inputs:
acts (Tensor) - Tensor of shape \((B, T, U, V)\), where \(B\) is the batch size, \(T\) is the maximum input (time) length, \(U\) is the maximum target length plus one, and \(V\) is the vocabulary size including the blank label. Data type must be float16 or float32.
labels (Tensor) - Tensor of shape \((B, U-1)\). Data type is int32.
input_lengths (Tensor) - Tensor of shape \((B,)\). Data type is int32.
label_lengths (Tensor) - Tensor of shape \((B,)\). Data type is int32.
- Outputs:
costs (Tensor) - Tensor of shape \((B,)\). Has the same data type as acts.
grads (Tensor) - Has the same shape and data type as acts.
- Raises
TypeError – If acts, labels, input_lengths or label_lengths is not a Tensor.
TypeError – If data type of acts is neither float16 nor float32.
TypeError – If data type of labels, input_lengths or label_lengths is not int32.
- Supported Platforms:
Ascend
- Examples
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> B, T, U, V = 1, 2, 3, 5
>>> blank = 0
>>> acts = np.random.random((B, T, U, V)).astype(np.float32)
>>> labels = np.array([[1, 2]]).astype(np.int32)
>>> input_length = np.array([T] * B).astype(np.int32)
>>> label_length = np.array([len(l) for l in labels]).astype(np.int32)
>>> rnnt_loss = ops.RNNTLoss(blank_label=blank)
>>> costs, grads = rnnt_loss(Tensor(acts), Tensor(labels), Tensor(input_length), Tensor(label_length))
>>> print(costs.shape)
(1,)
>>> print(grads.shape)
(1, 2, 3, 5)
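The example above also illustrates the shape contract: the \(U\) dimension of acts is one larger than the number of labels per sample, so labels of shape \((B, U-1)\) (here \((1, 2)\)) pair with acts of shape \((B, T, U, V)\). The sketch below builds consistently shaped dummy inputs with NumPy only; the helper make_rnnt_inputs is hypothetical and not part of the MindSpore API.

>>> import numpy as np
>>> def make_rnnt_inputs(B, T, U, V, seed=0):
...     # Hypothetical helper (not part of MindSpore): build dummy inputs whose
...     # shapes satisfy the RNNTLoss contract (each sample has U - 1 labels).
...     rng = np.random.default_rng(seed)
...     acts = rng.random((B, T, U, V)).astype(np.float32)             # (B, T, U, V)
...     labels = rng.integers(1, V, size=(B, U - 1)).astype(np.int32)  # (B, U-1), non-blank labels
...     input_lengths = np.full((B,), T, dtype=np.int32)               # (B,)
...     label_lengths = np.full((B,), U - 1, dtype=np.int32)           # (B,)
...     return acts, labels, input_lengths, label_lengths
>>> acts, labels, input_lengths, label_lengths = make_rnnt_inputs(B=2, T=4, U=3, V=5)
>>> print(labels.shape)
(2, 2)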