Differences between torch.nn.Dropout and mindspore.nn.Dropout

torch.nn.Dropout

class torch.nn.Dropout(
    p=0.5,
    inplace=False)

For more information, see torch.nn.Dropout.

mindspore.nn.Dropout

class mindspore.nn.Dropout(
    keep_prob=0.5,
    dtype=mstype.float32
)

For more information, see mindspore.nn.Dropout.

Usage Pattern

PyTorch: p - probability of an element to be zeroed. Default: 0.5.

PyTorch: the parameter p is the probability of discarding (zeroing) an element.

MindSpore: keep_prob (float) - the keep rate, greater than 0 and less than or equal to 1. E.g. keep_prob=0.9 drops out 10% of the input units. Default: 0.5.

MindSpore: the parameter keep_prob is the probability of keeping an element, i.e. keep_prob = 1 - p.
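
In short, a PyTorch drop probability p maps to a MindSpore keep probability of 1 - p. The lines below are a minimal sketch of that conversion; the value drop_p = 0.2 is only an illustrative choice and does not come from the example on this page.

# Sketch: convert a PyTorch-style drop probability into a MindSpore-style keep probability.
drop_p = 0.2                  # would be passed as torch.nn.Dropout(p=drop_p)
keep_prob = 1.0 - drop_p      # would be passed as mindspore.nn.Dropout(keep_prob=keep_prob)
print(keep_prob)              # 0.8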

Code Example

# The following implements Dropout with PyTorch and MindSpore.
import torch.nn
import mindspore.nn
import numpy as np

# PyTorch: each element is zeroed with probability p=0.9; kept elements are scaled by 1/(1 - p).
m = torch.nn.Dropout(p=0.9)
input = torch.tensor(np.ones([5, 5]), dtype=torch.float32)
output = m(input)
print(output)

# out (random; kept elements are scaled to 1/(1 - p) = 10):
#   [[0 10 0 0 0]
#   [0 0 0 0 0]
#   [0 0 10 0 0]
#   [0 10 0 0 0]
#   [0 0 0 0 10]]

# MindSpore: each element is kept with probability keep_prob=0.1; kept elements are scaled by 1/keep_prob.
input = mindspore.Tensor(np.ones([5, 5]), mindspore.float32)
net = mindspore.nn.Dropout(keep_prob=0.1)
net.set_train()  # Dropout is only active in training mode.
output = net(input)
print(output)

# out (random; kept elements are scaled to 1/keep_prob = 10):
#   [[0 10 0 0 0]
#   [0 0 0 10 0]
#   [0 0 0 0 0]
#   [0 10 10 0 0]
#   [0 0 10 0 0]]
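
At inference time both operators become the identity: PyTorch disables dropout after m.eval(), and MindSpore disables it after net.set_train(False). A minimal sketch, assuming m, net and the imports from the example above are still in scope:

# Disable dropout for evaluation / inference (sketch; reuses m and net from above).
m.eval()                                   # PyTorch: evaluation mode, dropout is a no-op.
print(m(torch.ones(5, 5)))                 # all ones
net.set_train(False)                       # MindSpore: inference mode, dropout is a no-op.
print(net(mindspore.Tensor(np.ones([5, 5]), mindspore.float32)))  # all ones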