mindspore.nn.LPPool1d

class mindspore.nn.LPPool1d(norm_type, kernel_size, stride=None, ceil_mode=False)[source]

Applies a 1D LP pooling operation over an input Tensor, which can be regarded as forming a 1D input plane.

Typically the input is of shape (N_in, C_in, L_in) or (C_in, L_in), and the output is of shape (N_out, C_out, L_out) or (C_out, L_out), with N_out = N_in and C_out = C_in. The operation is as follows:

f(X) = \left( \sum_{x \in X} x^{p} \right)^{1/p}

Note

This interface currently does not support Atlas A2 training series products.

Parameters
  • norm_type (Union[int, float]) –

    Type of normalization, represents p in the formula; cannot be 0.

    • if p = 1, the result is the sum of the elements within the pooling kernel (proportional to average pooling).

    • if p = ∞, the result is that of maximum pooling.

  • kernel_size (int) – The size of kernel window.

  • stride (int) – The distance the kernel moves at each step, an int representing the width of the movement. If None, the value of kernel_size is used. Default: None .

  • ceil_mode (bool) – If True, use ceil to calculate the output shape; if False, use floor. Default: False .
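To make the role of norm_type concrete, here is a minimal NumPy sketch on a single pooling window (the lp helper and the sample values are hypothetical, not part of the MindSpore API), showing that p = 1 reduces to a window sum while a large p approaches max pooling:

```python
import numpy as np

# Hypothetical single pooling window for illustration.
window = np.array([2.0, 5.0, 3.0])

def lp(window, p):
    # The LP pooling formula applied to one window: (sum x^p)^(1/p).
    return np.sum(window ** p) ** (1.0 / p)

print(lp(window, 1))    # p = 1: plain sum of the window -> 10.0
print(lp(window, 100))  # large p: approaches the window maximum, 5.0
```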

Inputs:
  • x (Tensor) - Tensor of shape (N_in, C_in, L_in) or (C_in, L_in).

Outputs:
  • output (Tensor) - LPPool1d result, with shape (N_out, C_out, L_out) or (C_out, L_out). It has the same data type as x, where

L_{out} = \left\lfloor \frac{L_{in} - \text{kernel\_size}}{\text{stride}} \right\rfloor + 1
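As a sketch of this shape arithmetic, the output length can be computed in plain Python (the lppool1d_out_len helper below is hypothetical; it assumes the formula above, with ceil applied when ceil_mode is True):

```python
import math

def lppool1d_out_len(l_in, kernel_size, stride=None, ceil_mode=False):
    # stride defaults to kernel_size, matching the constructor's behavior.
    stride = kernel_size if stride is None else stride
    ratio = (l_in - kernel_size) / stride
    return int(math.ceil(ratio) if ceil_mode else math.floor(ratio)) + 1

print(lppool1d_out_len(4, kernel_size=3, stride=1))  # -> 2
print(lppool1d_out_len(10, kernel_size=3))           # -> 3 (stride defaults to 3)
```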
Raises
  • TypeError – If x is not a Tensor.

  • TypeError – If kernel_size or stride is not an int.

  • TypeError – If ceil_mode is not a bool.

  • TypeError – If norm_type is neither float nor int.

  • ValueError – If norm_type is equal to 0.

  • ValueError – If kernel_size or stride is less than 1.

  • ValueError – If length of shape of x is not equal to 2 or 3.

Supported Platforms:

Ascend GPU CPU

Examples

>>> import mindspore as ms
>>> import numpy as np
>>> a = ms.Tensor(np.arange(2 * 3 * 4).reshape((2, 3, 4)), dtype=ms.float32)
>>> net = ms.nn.LPPool1d(norm_type=1, kernel_size=3, stride=1)
>>> out = net(a)
>>> print(out)
[[[ 3.  6.]
  [15. 18.]
  [27. 30.]]
 [[39. 42.]
  [51. 54.]
  [63. 66.]]]
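For reference, the example above can be reproduced with a plain NumPy sketch (the lppool1d_ref helper is hypothetical, assumes ceil_mode=False, and is not the MindSpore implementation):

```python
import numpy as np

def lppool1d_ref(x, norm_type, kernel_size, stride=None):
    # Reference sketch of 1D LP pooling: gather sliding windows along the
    # last axis, then apply (sum x^p)^(1/p) to each window.
    stride = kernel_size if stride is None else stride
    l_out = (x.shape[-1] - kernel_size) // stride + 1
    windows = np.stack(
        [x[..., i * stride : i * stride + kernel_size] for i in range(l_out)],
        axis=-2,
    )
    return np.power(np.sum(np.power(windows, norm_type), axis=-1), 1.0 / norm_type)

a = np.arange(2 * 3 * 4).reshape(2, 3, 4).astype(np.float32)
print(lppool1d_ref(a, norm_type=1, kernel_size=3, stride=1))
```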