Differences with torch.nn.Hardshrink

torch.nn.Hardshrink

torch.nn.Hardshrink(lambd=0.5)(input) -> Tensor

For more information, see torch.nn.Hardshrink.

mindspore.nn.HShrink

mindspore.nn.HShrink(lambd=0.5)(input_x) -> Tensor

For more information, see mindspore.nn.HShrink.

Differences

PyTorch: An activation function that computes the output element-wise from the input tensor.

MindSpore: MindSpore API implements the same function as PyTorch; only the input parameter name differs.

| Categories | Subcategories | PyTorch | MindSpore | Difference |
| ---------- | ------------- | ------- | --------- | ---------- |
| Parameter  | Parameter 1   | lambd   | lambd     | -          |
| Input      | Single input  | input   | input_x   | Same function, different parameter names |
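Both APIs apply the same element-wise hard shrinkage rule: values whose absolute value exceeds `lambd` pass through unchanged, and all other values are set to zero. As a minimal sketch (a NumPy illustration of the formula, not either framework's actual implementation):

```python
import numpy as np

def hardshrink(x, lambd=0.5):
    """Element-wise hard shrinkage: keep x where |x| > lambd, else 0."""
    return np.where(np.abs(x) > lambd, x, 0.0)

x = np.array([[0.5, 1.0, 2.0], [0.0533, 0.0776, -2.1233]])
print(hardshrink(x))
```

Note that the threshold is strict: an element exactly equal to `lambd` (such as 0.5 above) is zeroed.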

Code Example

The two APIs implement the same function and are used in the same way.

# PyTorch
import torch
import torch.nn as nn

m = nn.Hardshrink()  # default lambd=0.5
input = torch.tensor([[0.5, 1, 2.0], [0.0533, 0.0776, -2.1233]], dtype=torch.float32)
output = m(input)
output = output.detach().numpy()
print(output)
# [[ 0.      1.      2.    ]
#  [ 0.      0.     -2.1233]]

# MindSpore
import mindspore
from mindspore import Tensor, nn
import numpy as np

input_x = Tensor(np.array([[0.5, 1, 2.0], [0.0533, 0.0776, -2.1233]]), mindspore.float32)
hshrink = nn.HShrink()  # default lambd=0.5
output = hshrink(input_x)
print(output)
# [[ 0.      1.      2.    ]
#  [ 0.      0.     -2.1233]]