mindspore.nn.PReLU

class mindspore.nn.PReLU(channel=1, w=0.25)[source]

PReLU activation function.

Applies the PReLU function element-wise.

PReLU is defined as:

\[\text{PReLU}(x_i) = \max(0, x_i) + w \cdot \min(0, x_i),\]

where \(x_i\) is an element of a channel of the input.

Here \(w\) is a learnable parameter with a default initial value of 0.25. Parameter \(w\) has dimensionality equal to the argument channel; if constructed without the channel argument, a single parameter \(w\) is shared across all channels.
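
For example, with the default \(w = 0.25\), a negative input is scaled by the slope while a positive input passes through unchanged:

\[\text{PReLU}(-2) = \max(0, -2) + 0.25 \cdot \min(0, -2) = -0.5, \qquad \text{PReLU}(3) = 3.\]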

Parameters
  • channel (int) – The number of channels of the input, which determines the number of elements in parameter w. Default: 1.

  • w (Union[float, list, Tensor]) – The initial value of w. Default: 0.25.

Inputs:
  • input_data (Tensor) - The input of PReLU with data type of float16 or float32.

Outputs:

Tensor, with the same data type and shape as input_data.

Raises
  • TypeError – If channel is not an int.

  • TypeError – If w is not one of float, list, Tensor.

  • TypeError – If dtype of input_data is neither float16 nor float32.

  • ValueError – If channel is less than 1.

  • ValueError – If length of shape of input_data is equal to 1.

Supported Platforms:

Ascend

Examples

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> input_x = Tensor(np.array([[[[0.1, 0.6], [0.9, 0.9]]]]), mindspore.float32)
>>> prelu = nn.PReLU()
>>> output = prelu(input_x)
>>> print(output)
[[[[0.1 0.6]
   [0.9 0.9]]]]
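
The per-channel form can be sketched as follows; the input shape, channel count, and list of initial slopes are illustrative assumptions, and each slope is assumed to act along the channel dimension (dimension 1) of the input:

>>> input_x = Tensor(np.array([[[[-1.0, 2.0]], [[-3.0, 4.0]]]]), mindspore.float32)
>>> prelu = nn.PReLU(channel=2, w=[0.1, 0.5])
>>> output = prelu(input_x)
>>> # Negative entries are scaled by their channel's slope:
>>> # channel 0: -1.0 * 0.1 = -0.1, channel 1: -3.0 * 0.5 = -1.5;
>>> # positive entries pass through unchanged.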