mindspore.ops.PReLU
- class mindspore.ops.PReLU(*args, **kwargs)
Parametric Rectified Linear Unit activation function.
PReLU is described in the paper Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification and is defined as follows:
\[prelu(x_i) = \max(0, x_i) + \min(0, w * x_i),\]
where \(x_i\) is an element of a channel of the input and \(w\) is the weight of that channel.
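For intuition, here is a minimal NumPy sketch of the formula with an illustrative scalar weight (values chosen arbitrarily; this is not the operator itself):

>>> import numpy as np
>>> x = np.array([2.0, -2.0, 0.0], dtype=np.float32)
>>> w = 0.25
>>> out = np.maximum(0, x) + np.minimum(0, w * x)  # element-wise PReLU: [2.0, -0.5, 0.0]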
Note
1-dimensional input_x is not supported.
- Inputs:
input_x (Tensor) - Float tensor representing the output of the previous layer, with data type of float16 or float32.
weight (Tensor) - Float tensor with data type of float16 or float32, where every element of w must be greater than 0. Only two shapes are legitimate: 1, or the number of channels of the input.
- Outputs:
Tensor, with the same type as input_x.
For detailed information, please refer to nn.PReLU.
- Raises:
TypeError – If dtype of input_x or weight is neither float16 nor float32.
TypeError – If input_x or weight is not a Tensor.
ValueError – If length of shape of input_x is equal to 1.
ValueError – If length of shape of weight is not equal to 1.
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> import mindspore.nn as nn
>>> import numpy as np
>>> from mindspore import Tensor
>>> from mindspore.ops import operations as ops
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self.prelu = ops.PReLU()
...     def construct(self, input_x, weight):
...         result = self.prelu(input_x, weight)
...         return result
...
>>> input_x = Tensor(np.random.randint(-3, 3, (2, 3, 2)), mindspore.float32)
>>> weight = Tensor(np.array([0.1, 0.6, -0.3]), mindspore.float32)
>>> net = Net()
>>> output = net(input_x, weight)
>>> print(output.shape)
(2, 3, 2)
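As a follow-up sketch (input and weight values chosen arbitrarily; assuming PyNative mode, where the primitive can also be called directly), the result can be checked against a NumPy reference built from the formula above:

>>> import numpy as np
>>> from mindspore import Tensor
>>> from mindspore.ops import operations as ops
>>> x = np.array([[[-1.0, 2.0], [-3.0, 4.0], [-5.0, 6.0]]], dtype=np.float32)
>>> w = np.array([0.25, 0.5, 0.75], dtype=np.float32)
>>> output = ops.PReLU()(Tensor(x), Tensor(w))
>>> # NumPy reference: the per-channel weight scales only the negative elements.
>>> reference = np.maximum(0, x) + np.minimum(0, w.reshape(1, 3, 1) * x)
>>> print(np.allclose(output.asnumpy(), reference))
True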