mindspore.ops.layer_norm
- mindspore.ops.layer_norm(input, normalized_shape, weight=None, bias=None, eps=1e-05)
Applies Layer Normalization to the input tensor.
This operator normalizes the input tensor over the axes given by normalized_shape. LayerNorm is described in the paper Layer Normalization.
\[y = \frac{x - \mathrm{mean}(x)}{\sqrt{\mathrm{var}(x) + \epsilon}} * \gamma + \beta\]
where \(\gamma\) is weight, \(\beta\) is bias, and \(\epsilon\) is eps.
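As a quick sanity check of this formula, the same standardization can be reproduced with plain NumPy (a minimal sketch, not MindSpore's implementation; it assumes the statistics are taken over the last axis only, matching normalized_shape=(3,) in the example further below, and agrees with the documented output up to floating-point rounding):

>>> import numpy as np
>>> x = np.array([[1, 2, 3], [1, 2, 3]], dtype=np.float32)
>>> mean = x.mean(axis=-1, keepdims=True)  # per-row mean over the normalized axis
>>> var = x.var(axis=-1, keepdims=True)    # per-row (biased) variance
>>> print((x - mean) / np.sqrt(var + 1e-7))  # gamma=1, beta=0, i.e. the defaults
[[-1.2247448  0.         1.2247448]
 [-1.2247448  0.         1.2247448]]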
- Parameters
input (Tensor) – Tensor of shape \((N, \ldots)\). The input of LayerNorm.
normalized_shape (Union(int, tuple[int], list[int])) – The normalized shape of input for LayerNorm. normalized_shape is equal to input_shape[begin_norm_axis:], where begin_norm_axis represents the axis where normalization begins (see the sketch after this parameter list).
weight (Tensor, optional) – Learnable parameter \(\gamma\). Tensor of shape normalized_shape, with the same data type as input. Default: None, in which case it is initialized to 1.
bias (Tensor, optional) – Learnable parameter \(\beta\). Tensor of shape normalized_shape, with the same data type as input. Default: None, in which case it is initialized to 0.
eps (float, optional) – A value added to the denominator for numerical stability (\(\epsilon\)). Default: 1e-5.
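To make the normalized_shape convention concrete, here is a short sketch with hypothetical shapes; it relies only on the signature documented above, leaving weight and bias at their None defaults:

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.arange(24).reshape(2, 4, 3), mindspore.float32)
>>> # normalized_shape must match a trailing slice of x.shape:
>>> # (3,) is input_shape[2:] (begin_norm_axis=2); (4, 3) is input_shape[1:] (begin_norm_axis=1)
>>> out_last_axis = ops.layer_norm(x, (3,))
>>> out_last_two = ops.layer_norm(x, (4, 3))
>>> print(out_last_axis.shape, out_last_two.shape)
(2, 4, 3) (2, 4, 3)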
- Returns
output (Tensor) – The normalized input, with the same data type and shape as input.
- Supported Platforms:
Ascend
Examples
>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[1, 2, 3], [1, 2, 3]]), mindspore.float32)
>>> normalized_shape = (3,)
>>> gamma = Tensor(np.ones(normalized_shape), mindspore.float32)
>>> beta = Tensor(np.zeros(normalized_shape), mindspore.float32)
>>> eps = 1e-7
>>> output = ops.layer_norm(input_x, normalized_shape, gamma, beta, eps)
>>> print(output)
[[-1.2247448  0.         1.2247448]
 [-1.2247448  0.         1.2247448]]
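Since weight and bias default to None (initialized to 1 and 0 respectively), the call above can be shortened; a sketch under those documented defaults, reusing input_x, normalized_shape, and eps from the example:

>>> output_default = ops.layer_norm(input_x, normalized_shape, eps=eps)
>>> print(output_default)
[[-1.2247448  0.         1.2247448]
 [-1.2247448  0.         1.2247448]]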