mindspore.mint.nn.functional.layer_norm

mindspore.mint.nn.functional.layer_norm(input, normalized_shape, weight=None, bias=None, eps=1e-05)[source]

Applies Layer Normalization over a mini-batch of inputs.

Layer normalization is widely used in recurrent neural networks. It normalizes each single training case over its feature dimensions, independently of the other samples in the mini-batch. LayerNorm is described in the paper Layer Normalization.

Unlike batch normalization, layer normalization performs exactly the same computation at training and test time. It is applied over all channels and pixels of each sample, and works even when batch_size=1. The formula is as follows:

\[y = \frac{x - \mathrm{E}[x]}{\sqrt{\mathrm{Var}[x] + \epsilon}} * \gamma + \beta\]

where \(\gamma\) is the weight learned through training, \(\beta\) is the bias learned through training, and \(\mathrm{E}[x]\) and \(\mathrm{Var}[x]\) are the mean and (biased) variance computed over the dimensions given by normalized_shape.
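
For intuition, the same computation can be reproduced in plain NumPy. The following is an illustrative sketch (not part of the MindSpore API) that normalizes over the last axis with \(\gamma = 1\), \(\beta = 0\), and the default eps:

>>> import numpy as np
>>> x = np.array([[1., 2., 3.], [1., 2., 3.]], dtype=np.float32)
>>> mean = x.mean(axis=-1, keepdims=True)   # E[x] over the normalized axis
>>> var = x.var(axis=-1, keepdims=True)     # Var[x], biased (divides by N)
>>> y = (x - mean) / np.sqrt(var + 1e-5)    # gamma = 1, beta = 0
>>> np.allclose(y, [[-1.2247, 0., 1.2247], [-1.2247, 0., 1.2247]], atol=1e-3)
True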

Parameters
  • input (Tensor) – The input tensor of shape (N, *), where * represents any number of additional dimensions.

  • normalized_shape (Union(int, tuple[int], list[int])) – The shape over which input is normalized for LayerNorm. normalized_shape is equal to input_shape[begin_norm_axis:], where begin_norm_axis is the axis at which normalization begins (see the sketch after this parameter list).

  • weight (Tensor, optional) – Learnable parameter \(\gamma\). A Tensor of shape normalized_shape with the same data type as input. Default: None, in which case \(\gamma\) is initialized to ones.

  • bias (Tensor, optional) – Learnable parameter \(\beta\). A Tensor of shape normalized_shape with the same data type as input. Default: None, in which case \(\beta\) is initialized to zeros.

  • eps (float, optional) – A value added to the denominator for numerical stability (\(\epsilon\)). Default: 1e-5.
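
As referenced in normalized_shape above, the following sketch shows how normalized_shape selects the trailing axes; the (2, 4, 5) input shape is an arbitrary choice for illustration:

>>> import numpy as np
>>> from mindspore import Tensor, mint
>>> x = Tensor(np.random.randn(2, 4, 5).astype(np.float32))
>>> # normalized_shape == input_shape[1:], so statistics are computed
>>> # jointly over the last two axes of each of the 2 samples
>>> out = mint.nn.functional.layer_norm(x, (4, 5))
>>> print(out.shape)
(2, 4, 5)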

Returns

Tensor. The normalized tensor, with the same data type and shape as input.

Raises
  • TypeError – If input is not a Tensor.

  • TypeError – If normalized_shape is not an integer, a list or a tuple.

  • TypeError – If eps is not a float.

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, mint
>>> input_x = Tensor(np.array([[1, 2, 3], [1, 2, 3]]), mindspore.float32)
>>> normalized_shape = (3,)
>>> gamma = Tensor(np.ones(normalized_shape), mindspore.float32)
>>> beta = Tensor(np.zeros(normalized_shape), mindspore.float32)
>>> eps = 1e-7
>>> output = mint.nn.functional.layer_norm(input_x, normalized_shape, gamma, beta, eps)
>>> print(output)
[[-1.2247448  0.         1.2247448]
 [-1.2247448  0.         1.2247448]]
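
Since weight and bias default to None (treated as ones and zeros, respectively), gamma and beta can be omitted; a brief usage note reusing the tensors from the example above (with the default eps, the values differ only negligibly):

>>> output = mint.nn.functional.layer_norm(input_x, normalized_shape)
>>> print(output.shape)
(2, 3)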