mindspore.mint.nn.functional.dropout
- mindspore.mint.nn.functional.dropout(input, p=0.5, training=True, inplace=False)[source]
During training, randomly zeroes some of the elements of the input tensor with probability p from a Bernoulli distribution. It plays the role of reducing neuron correlation and avoiding overfitting. The outputs are scaled by a factor of $\frac{1}{1-p}$ during training. During inference, this operation returns the same Tensor as the input.
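As an illustration of this scaling, a minimal sketch (not part of the official example): with p = 0.5 each surviving element is multiplied by 1/(1 - 0.5) = 2, and with training=False the input passes through unchanged.

>>> import mindspore
>>> from mindspore import Tensor, mint
>>> x = Tensor([[1., 1.], [1., 1.]], mindspore.float32)
>>> # Training: each element is kept with probability 1 - p and scaled by 1/(1 - p),
>>> # so with p = 0.5 every output element is either 0.0 or 2.0.
>>> y = mint.nn.functional.dropout(x, p=0.5, training=True)
>>> # Inference (training=False): the input is returned unchanged.
>>> z = mint.nn.functional.dropout(x, p=0.5, training=False)
>>> print(z)
[[1. 1.]
 [1. 1.]]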
- Parameters
  - input (Tensor) – The input Tensor.
  - p (float, optional) – The dropping rate of input neurons, between 0 and 1, e.g. p = 0.1 means dropping out 10% of input neurons. Default: 0.5.
  - training (bool, optional) – Apply dropout if it is True; if it is False, the input is returned directly and p has no effect. Default: True.
  - inplace (bool, optional) – If set to True, this operation is done in-place. Default: False.
- Returns
output (Tensor) - Zeroed tensor, with the same shape and data type as input.
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> from mindspore import Tensor, mint
>>> input = Tensor(((20, 16), (50, 50)), mindspore.float32)
>>> output = mint.nn.functional.dropout(input, p=0.5)
>>> print(output.shape)
(2, 2)
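A further usage sketch for the training and inplace flags, a hedged example assuming the semantics given in the parameter list above (training=False passes the input through; inplace=True writes the result back into input):

>>> import mindspore
>>> from mindspore import Tensor, mint
>>> input = Tensor(((20, 16), (50, 50)), mindspore.float32)
>>> # With training=False dropout is skipped and the input is returned directly.
>>> out = mint.nn.functional.dropout(input, p=0.9, training=False)
>>> print(out)
[[20. 16.]
 [50. 50.]]
>>> # With inplace=True the zeroing and scaling are applied to `input` itself.
>>> _ = mint.nn.functional.dropout(input, p=0.5, inplace=True)
>>> print(input.shape)
(2, 2)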