mindspore.ops.rrelu

mindspore.ops.rrelu(input, lower=1.0 / 8, upper=1.0 / 3)[source]

Randomized Leaky ReLU activation function.

The activation function is defined as:

\text{rrelu}(input_{ji}) =
    \begin{cases}
        input_{ji},                   & \text{if } input_{ji} \geq 0; \\
        \alpha_{ji} \cdot input_{ji}, & \text{otherwise,}
    \end{cases}

where \alpha_{ji} \sim U(l, u) and l \leq u.

Applies the rrelu function element-wise, as described in the paper: Empirical Evaluation of Rectified Activations in Convolutional Network.
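For illustration only, a minimal NumPy sketch of the element-wise computation defined above (this is not the MindSpore kernel; the helper name rrelu_reference is hypothetical, and each negative element gets its own slope drawn from U(lower, upper)):

>>> import numpy as np
>>> def rrelu_reference(x, lower=1.0 / 8, upper=1.0 / 3):
...     # Draw an independent slope alpha ~ U(lower, upper) for every element,
...     # then apply it only where the corresponding input is negative.
...     alpha = np.random.uniform(lower, upper, size=x.shape)
...     return np.where(x >= 0, x, alpha * x)
...
>>> out = rrelu_reference(np.array([[-1.0, 4.0], [2.0, 0.0]]))
>>> out.shape
(2, 2)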

Parameters
  • input (Tensor) – Input tensor of any dimension.

  • lower (Union[int, float]) – Lower bound of the uniform distribution from which the slope at x < 0 is sampled. Default: 1.0 / 8 .

  • upper (Union[int, float]) – Upper bound of the uniform distribution from which the slope at x < 0 is sampled. Default: 1.0 / 3 .

Returns

Tensor, the result of applying rrelu, with the same dtype and shape as input.

Raises
  • TypeError – If lower is not a float or an int.

  • TypeError – If upper is not a float or an int.

  • TypeError – If input is not a Tensor.

  • TypeError – If the dtype of input is neither mindspore.float16 nor mindspore.float32.

  • ValueError – If lower is greater than upper.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([[-1.0, 4.0], [2.0, 0]]), mindspore.float32)
>>> output = ops.rrelu(x)
>>> print(output)
[[-0.31465699  4.        ]
 [ 2.          0.        ]]
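
The bounds of the uniform distribution can also be passed explicitly through the lower and upper arguments shown in the signature above; a usage sketch that prints only the shape, since the sampled negative slope makes the exact values vary between runs:

>>> output = ops.rrelu(x, lower=0.1, upper=0.3)
>>> print(output.shape)
(2, 2)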