mindspore.mint.nn.Dropout

class mindspore.mint.nn.Dropout(p=0.5, inplace=False)

Dropout layer for the input.

Dropout is a regularization technique that reduces overfitting by preventing co-adaptation between neuron nodes. During training, the operator randomly sets the outputs of some neurons to zero with probability p (the dropout rate), and the surviving outputs are scaled by \(\frac{1}{1-p}\). During inference, this layer returns the same Tensor as the input x.
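
The following is a minimal NumPy sketch of this training-time behavior (illustrative only; it is not the operator's actual implementation):

>>> import numpy as np
>>> p = 0.5
>>> x = np.ones((2, 3), dtype=np.float32)
>>> # Keep each element with probability 1 - p, zero it otherwise.
>>> mask = (np.random.uniform(size=x.shape) >= p).astype(np.float32)
>>> # Surviving elements are scaled by 1 / (1 - p); dropped elements become 0.
>>> y = x * mask / (1 - p)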

This technique was proposed in the paper Dropout: A Simple Way to Prevent Neural Networks from Overfitting and has proven effective at reducing overfitting and preventing the co-adaptation of neurons. See Improving neural networks by preventing co-adaptation of feature detectors for more details.

Note

  • Each channel will be zeroed out independently on every construct call.

  • The parameter p is the probability of an element of the input tensor being zeroed.

Parameters
  • p (float) – The dropout rate of input neurons, e.g. p=0.9 drops out 90% of input neurons (illustrated in the sketch after this parameter list). Default: 0.5.

  • inplace (bool) – If set to True, the operation is performed in-place. Default: False.
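
A minimal usage sketch of the p parameter (hedged: which neurons are dropped is random, so only the shape is printed; the scaling factor follows from the formula above):

>>> import mindspore
>>> from mindspore import Tensor, mint
>>> import numpy as np
>>> x = Tensor(np.ones([10]), mindspore.float32)
>>> # With p=0.9, roughly 90% of elements are zeroed during training;
>>> # surviving elements are scaled by 1 / (1 - 0.9) = 10.0.
>>> net = mint.nn.Dropout(p=0.9)
>>> net.set_train()
>>> output = net(x)
>>> print(output.shape)
(10,)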

Inputs:
  • x (Tensor) - The input of Dropout.

Outputs:

Tensor, output tensor with the same shape as x.

Raises
  • TypeError – If the dtype of p is not float.

  • ValueError – If the length of the shape of x is less than 1.

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> from mindspore import Tensor, mint
>>> import numpy as np
>>> x = Tensor(np.ones([2, 2, 3]), mindspore.float32)
>>> net = mint.nn.Dropout(p=0.2)
>>> net.set_train()
>>> output = net(x)
>>> print(output.shape)
(2, 2, 3)
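
Continuing the example above, a brief sketch of inference-time behavior (hedged: set_train(False) is assumed to switch the Cell to inference mode, where the layer returns the input unchanged as described above):

>>> net.set_train(False)
>>> output = net(x)
>>> print(np.allclose(output.asnumpy(), x.asnumpy()))
True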