Document feedback

Question document fragment

When a question document fragment contains a formula, the formula is displayed as a space.

Submission type
issue

Problem type
Specifications and Common Mistakes

- Specifications and Common Mistakes:

- Misspellings or punctuation mistakes, incorrect formulas, or abnormal display.

- Incorrect links, empty cells, or wrong formats.

- Chinese characters in English context.

- Minor inconsistencies between the UI and descriptions.

- Low writing fluency that does not affect understanding.

- Incorrect version numbers, including software package names and version numbers on the UI.

- Usability:

- Incorrect or missing key steps.

- Missing main function descriptions, keyword explanations, necessary prerequisites, or precautions.

- Ambiguous descriptions, unclear references, or contradictory context.

- Unclear logic, such as missing classifications, items, and steps.

- Correctness:

- Technical principles, function descriptions, supported platforms, parameter types, or exceptions that are inconsistent with the software implementation.

- Incorrect schematic or architecture diagrams.

- Incorrect commands or command parameters.

- Incorrect code.

- Commands inconsistent with the functions.

- Wrong screenshots.

- Sample code fails to run, or its results are inconsistent with the expected output.

- Risk Warnings:

- Lack of risk warnings for operations that may damage the system or important data.

- Content Compliance:

- Content that may violate applicable laws and regulations, or geo-culturally sensitive words and expressions.

- Copyright infringement.

Problem description

Describe the bug so that we can quickly locate the problem.

mindspore.mint.nn.functional.layer_norm

mindspore.mint.nn.functional.layer_norm(input, normalized_shape, weight=None, bias=None, eps=1e-5)

Applies Layer Normalization to the mini-batch input.

Layer normalization is widely used in recurrent neural networks. It applies normalization to the mini-batch input of a single training case, as described in the paper Layer Normalization.

Unlike batch normalization, layer normalization performs exactly the same calculations at training and test time. It applies to all channels and pixels, even when batch_size=1. The formula is as follows:

y = (x − E[x]) / √(Var[x] + ϵ) * γ + β

where γ is the weight value learned through training and β is the bias value learned through training.
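
The formula can be checked with a short NumPy sketch (for illustration only; x, gamma, beta, and eps are ad hoc names here, not the API parameters, and the values mirror the Examples section below):

>>> import numpy as np
>>> x = np.array([[1., 2., 3.], [1., 2., 3.]], dtype=np.float32)
>>> gamma, beta, eps = 1.0, 0.0, 1e-7
>>> mean = x.mean(axis=-1, keepdims=True)    # E[x] over the normalized dimensions
>>> var = x.var(axis=-1, keepdims=True)      # Var[x] over the normalized dimensions
>>> y = (x - mean) / np.sqrt(var + eps) * gamma + beta
>>> # each row of y is approximately [-1.2247448, 0., 1.2247448], matching the Examples below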

Parameters
  • input (Tensor) – The shape of input is (N, *), where * means any number of additional dimensions.

  • normalized_shape (Union(int, tuple[int], list[int])) – The normalized shape of input for LayerNorm. normalized_shape is equal to input_shape[begin_norm_axis:], where begin_norm_axis represents the axis where normalization begins (see the sketch after this list).

  • weight (Tensor, optional) – Learnable parameter γ. Tensor of shape normalized_shape. Default: None. When None, it is initialized to 1 and has the same data type as input.

  • bias (Tensor, optional) – Learnable parameter β. Tensor of shape normalized_shape. Default: None. When None, it is initialized to 0 and has the same data type as input.

  • eps (float, optional) – A value added to the denominator for numerical stability (ϵ). Default: 1e-5.
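
As a sketch of the normalized_shape convention (the input shape (2, 3, 4) is chosen arbitrarily, and the call assumes an Ascend environment): passing normalized_shape = (3, 4) corresponds to begin_norm_axis = 1, so normalization runs over the last two axes.

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, mint
>>> x = Tensor(np.ones((2, 3, 4)), mindspore.float32)      # input_shape = (2, 3, 4)
>>> out = mint.nn.functional.layer_norm(x, (3, 4))         # normalized_shape = input_shape[1:]
>>> print(out.shape)
(2, 3, 4)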

Returns

Tensor. The normalized tensor, with the same type and shape as the input.

Raises
  • TypeError – If input is not a Tensor.

  • TypeError – If normalized_shape is not an integer, a list or a tuple.

  • TypeError – If eps is not a float.

Supported Platforms:

Ascend

Examples

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, mint
>>> input_x = Tensor(np.array([[1, 2, 3], [1, 2, 3]]), mindspore.float32)
>>> normalized_shape = (3,)
>>> gamma = Tensor(np.ones(normalized_shape), mindspore.float32)
>>> beta = Tensor(np.zeros(normalized_shape), mindspore.float32)
>>> eps = 1e-7
>>> output = mint.nn.functional.layer_norm(input_x, normalized_shape, gamma, beta, eps)
>>> print(output)
[[-1.2247448 0. 1.2247448]
 [-1.2247448 0. 1.2247448]]
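
Following the same example, weight and bias may also be omitted; per the parameter descriptions above, None falls back to γ = 1 and β = 0, so the result should match output (a usage sketch, not an additional documented example):

>>> output_default = mint.nn.functional.layer_norm(input_x, normalized_shape, eps=eps)
>>> # expected to equal `output` above, since weight=None defaults to 1 and bias=None defaults to 0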