mindspore.mint.nn.functional.dropout

mindspore.mint.nn.functional.dropout(input, p=0.5, training=True, inplace=False)

During training, randomly zeroes some of the elements of the input tensor with probability p, using samples from a Bernoulli distribution. This reduces co-adaptation between neurons and helps avoid overfitting. The remaining elements are scaled by 1/(1-p) during training, so the expected value of each element is unchanged. During inference, this operation returns the input Tensor unchanged.
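
A minimal illustrative sketch of this scaling behavior (the zero pattern is random, so only the shape is printed; the surviving elements are always the input scaled by 1/(1-p)):

>>> import mindspore
>>> from mindspore import Tensor, mint
>>> x = Tensor([[1., 1.], [1., 1.]], mindspore.float32)
>>> out = mint.nn.functional.dropout(x, p=0.5, training=True)
>>> # Each element of out is either 0 or 1 / (1 - 0.5) = 2.0; which
>>> # elements are zeroed differs between runs.
>>> print(out.shape)
(2, 2)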

Parameters
  • input (Tensor) – The input Tensor of shape (*, N).

  • p (float, optional) – The dropout probability of input neurons, between 0 and 1; e.g. p = 0.1 means 10% of input neurons are dropped. Default: 0.5.

  • training (bool, optional) – If True, applies dropout; if False, the input is returned unchanged and p has no effect. Default: True.

  • inplace (bool, optional) – If True, performs the operation in place, writing the result back into input (see the sketch after this list). Default: False.
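
A hedged sketch of the inplace flag, assuming (as documented) that inplace=True writes the dropped-out result back into input; the zero pattern is random, so only the shape is printed:

>>> import mindspore
>>> from mindspore import Tensor, mint
>>> x = Tensor([[4., 4.], [4., 4.]], mindspore.float32)
>>> _ = mint.nn.functional.dropout(x, p=0.5, inplace=True)
>>> # x itself now holds the result: each element is either 0 or
>>> # 4 / (1 - 0.5) = 8.0, chosen at random.
>>> print(x.shape)
(2, 2)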

Returns

  • output (Tensor) – Tensor with randomly zeroed elements, with the same shape and data type as input.

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> from mindspore import Tensor, mint
>>> input = Tensor(((20, 16), (50, 50)), mindspore.float32)
>>> output = mint.nn.functional.dropout(input, p=0.5)
>>> print(output.shape)
(2, 2)
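
Continuing the session above, a short sketch of the inference path: with training=False the input is returned unchanged, so the output is deterministic.

>>> output = mint.nn.functional.dropout(input, p=0.5, training=False)
>>> print(output)
[[20. 16.]
 [50. 50.]]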