mindspore.ops.PReLU
- class mindspore.ops.PReLU[source]
Parametric Rectified Linear Unit activation function.
PReLU is described in the paper Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification. It is defined as follows:
\[prelu(x_i) = \max(0, x_i) + \min(0, w * x_i),\]where \(x_i\) is an element of a channel of the input, and \(w\) is the weight of that channel.
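To make the per-channel broadcast concrete, here is a minimal NumPy sketch of the same formula (prelu_ref is a hypothetical reference helper assuming an input of shape \((N, C, *)\); it is not part of the MindSpore API):
>>> import numpy as np
>>> def prelu_ref(x, w):
...     # max(0, x) + min(0, w * x), with w broadcast along the
...     # channel axis (axis 1) of an (N, C, *) input.
...     shape = [1] * x.ndim
...     shape[1] = w.size
...     return np.maximum(0, x) + np.minimum(0, w.reshape(shape) * x)
...
>>> x = np.arange(-6, 6).reshape(2, 3, 2).astype(np.float32)
>>> w = np.array([0.1, 0.6, -0.3], dtype=np.float32)
>>> y = prelu_ref(x, w)  # matches the output of the Examples section below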
Note
A 0-D or 1-D x is not supported on Ascend.
- Inputs:
x (Tensor) - The first input tensor, representing the output of the previous layer, with data type of float16 or float32. The shape is \((N, C, *)\), where \(*\) means any number of additional dimensions.
weight (Tensor) - The second input tensor. The data type is float16 or float32. Only two shapes are legitimate: 1, in which case the single weight is shared across all channels (see the sketch after Outputs below), or the number of channels of x. The channel dimension is the second dimension of the input. When the input is a 0-D or 1-D tensor, the number of channels is 1.
- Outputs:
Tensor, with the same type as x.
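A minimal sketch of the shared-weight case (a weight of shape \((1,)\) applied to every channel; the values here are illustrative):
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([[[-1.0], [2.0]], [[-3.0], [4.0]]]), mindspore.float32)
>>> shared_w = Tensor(np.array([0.25]), mindspore.float32)
>>> output = ops.PReLU()(x, shared_w)  # negative elements scaled by 0.25: [[[-0.25], [2.0]], [[-0.75], [4.0]]]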
For detailed information, please refer to mindspore.nn.PReLU.
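As a brief cell-based alternative, nn.PReLU manages the weight as a learnable parameter instead of taking it as an input (a sketch, assuming the nn.PReLU signature nn.PReLU(channel=1, w=0.25) from that page):
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn
>>> prelu = nn.PReLU(channel=3, w=0.25)  # assumed signature: one learnable slope per channel, initialized to 0.25
>>> x = Tensor(np.arange(-6, 6).reshape((2, 3, 2)), mindspore.float32)
>>> output = prelu(x)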
- Raises:
TypeError – If dtype of x or weight is neither float16 nor float32.
TypeError – If x or weight is not a Tensor.
ValueError – If x is a 0-D or 1-D Tensor on Ascend.
ValueError – If weight is not a 1-D Tensor.
- Supported Platforms:
Ascend GPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, nn, ops
>>> class Net(nn.Cell):
...     def __init__(self):
...         super(Net, self).__init__()
...         self.prelu = ops.PReLU()
...     def construct(self, x, weight):
...         result = self.prelu(x, weight)
...         return result
...
>>> x = Tensor(np.arange(-6, 6).reshape((2, 3, 2)), mindspore.float32)
>>> weight = Tensor(np.array([0.1, 0.6, -0.3]), mindspore.float32)
>>> net = Net()
>>> output = net(x, weight)
>>> print(output)
[[[-0.60 -0.50]
  [-2.40 -1.80]
  [ 0.60  0.30]]
 [[ 0.00  1.00]
  [ 2.00  3.00]
  [ 4.00  5.00]]]