mindspore.nn.InstanceNorm1d

class mindspore.nn.InstanceNorm1d(num_features, eps=1e-05, momentum=0.1, affine=True, gamma_init='ones', beta_init='zeros')[source]

This layer applies Instance Normalization over a 3D input (a mini-batch of 1D inputs with an additional channel dimension), as described in the paper Instance Normalization: The Missing Ingredient for Fast Stylization. It rescales and recenters the feature using instance statistics computed from the input data and the learned parameters, as described by the following formula.

\[y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta\]

The size of \(\gamma\) and \(\beta\), the learnable parameter vectors, is num_features if affine is True. The standard deviation is calculated via the biased estimator.

This layer uses instance statistics computed from input data in both training and evaluation modes.

InstanceNorm1d and BatchNorm1d are very similar, but differ in where the normalization statistics come from: InstanceNorm1d normalizes each channel of each sample independently, whereas BatchNorm1d normalizes each channel across the whole batch.
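Concretely, for an input of shape \((N, C, L)\), the mean and variance in the formula above are reduced over the length dimension L separately for each sample and channel. The following NumPy sketch illustrates the formula only (it is not MindSpore's implementation), with gamma and beta taken as scalars for simplicity:

>>> import numpy as np
>>> x = np.random.randn(2, 3, 5).astype(np.float32)   # (N, C, L)
>>> mean = x.mean(axis=2, keepdims=True)               # E[x] per sample and channel
>>> var = x.var(axis=2, keepdims=True)                  # biased Var[x] per sample and channel
>>> gamma, beta = 1.0, 0.0                              # affine parameters, scalars here for illustration
>>> y = (x - mean) / np.sqrt(var + 1e-5) * gamma + beta
>>> print(y.shape)
(2, 3, 5)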

Note

The formula for updating the running_mean and running_var is \(\hat{x}_\text{new} = (1 - \text{momentum}) \times x_t + \text{momentum} \times \hat{x}\), where \(\hat{x}\) is the estimated statistic and \(x_t\) is the new observed value.
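For instance, with the default momentum of 0.1, a single update of running_mean follows the formula above (the values below are purely illustrative):

>>> momentum = 0.1
>>> running_mean = 0.0   # current estimate of the statistic
>>> x_t = 2.0            # statistic observed on the current input
>>> running_mean = (1 - momentum) * x_t + momentum * running_mean
>>> print(running_mean)
1.8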

Parameters
  • num_features (int) – C from an expected input of size \((N, C, L)\).

  • eps (float) – A value added to the denominator for numerical stability. Default: 1e-5.

  • momentum (float) – A floating-point hyperparameter for the momentum used in the running_mean and running_var computation. Default: 0.1.

  • affine (bool) – A bool value. When set to True, gamma and beta can be learned. Default: True.

  • gamma_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the gamma weight. The values of str refer to the function initializer, including 'zeros', 'ones', etc. When initialized with a Tensor, the shape should be \((C)\). Default: 'ones'.

  • beta_init (Union[Tensor, str, Initializer, numbers.Number]) – Initializer for the beta weight. The values of str refer to the function initializer, including 'zeros', 'ones', etc. When initialized with a Tensor, the shape should be \((C)\), as shown in the sketch after this list. Default: 'zeros'.
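As an illustration of these parameters, gamma_init and beta_init can be given either as initializer names or as explicit tensors of shape \((C)\); the values below are chosen only for demonstration:

>>> import mindspore as ms
>>> import numpy as np
>>> # initialize the affine parameters by initializer name ...
>>> net1 = ms.nn.InstanceNorm1d(3, gamma_init='ones', beta_init='zeros')
>>> # ... or with explicit float32 tensors of shape (C,) = (3,)
>>> gamma = ms.Tensor(np.ones([3]), ms.float32)
>>> beta = ms.Tensor(np.zeros([3]), ms.float32)
>>> net2 = ms.nn.InstanceNorm1d(3, gamma_init=gamma, beta_init=beta)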

Inputs:
  • x (Tensor) - Tensor of shape \((N, C, L)\). Data type: float16 or float32.

Outputs:

Tensor, the normalized, scaled, and offset tensor of shape \((N, C, L)\), with the same data type and shape as x.

Raises
  • TypeError – If the type of num_features is not int.

  • TypeError – If the type of eps is not float.

  • TypeError – If the type of momentum is not float.

  • TypeError – If the type of affine is not bool.

  • TypeError – If the type of gamma_init/beta_init is not one of Tensor, str, Initializer or numbers.Number, or if the initialized element type is not float32.

  • ValueError – If num_features is less than 1.

  • ValueError – If momentum is not in range [0, 1].

  • ValueError – If the shape of gamma_init / beta_init is not \((C)\).

  • KeyError – If any of gamma_init/beta_init is a str and the class of the same name inheriting from Initializer does not exist.

Supported Platforms:

GPU

Examples

>>> import mindspore as ms
>>> import numpy as np
>>> net = ms.nn.InstanceNorm1d(3)
>>> x = ms.Tensor(np.ones([2, 3, 5]), ms.float32)
>>> output = net(x)
>>> print(output.shape)
(2, 3, 5)