mindspore.ops.LayerNorm
- class mindspore.ops.LayerNorm(begin_norm_axis=1, begin_params_axis=1, epsilon=1e-07)[source]
Applies Layer Normalization to the input tensor.
This operator normalizes the input tensor over the given axis. LayerNorm is described in the paper Layer Normalization.
\[y = \frac{x - mean}{\sqrt{variance + \epsilon}} * \gamma + \beta\]
where \(\gamma\) is the scale, \(\beta\) is the bias, and \(\epsilon\) is epsilon.
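For illustration only, the definition can be checked with a small NumPy sketch that normalizes over the last axis (layer_norm_ref is a hypothetical helper written for this page, not part of the operator):

>>> import numpy as np
>>> def layer_norm_ref(x, gamma, beta, epsilon=1e-7):
...     # y = (x - mean) / sqrt(variance + epsilon) * gamma + beta, over the last axis
...     mean = x.mean(axis=-1, keepdims=True)
...     variance = x.var(axis=-1, keepdims=True)
...     return (x - mean) / np.sqrt(variance + epsilon) * gamma + beta
>>> x = np.array([[1., 2., 3.], [1., 2., 3.]], dtype=np.float32)
>>> y = layer_norm_ref(x, np.ones(3, np.float32), np.ones(3, np.float32))
>>> # y is approximately [[-0.2247, 1.0, 2.2247], [-0.2247, 1.0, 2.2247]],
>>> # matching the Examples section below.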
- Parameters
begin_norm_axis (int) – The first axis of input_x along which LayerNorm is applied; the value must be in [-1, rank(input_x)). Default: 1.
begin_params_axis (int) – The first axis of the parameters gamma and beta to which LayerNorm applies; the value must be in [-1, rank(input_x)). Default: 1.
epsilon (float) – A value added to the denominator for numerical stability (\(\epsilon\)). Default: 1e-7.
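As a sketch of how the two axis arguments determine shapes (a 3-D input chosen purely for illustration, with shapes following the rules stated on this page): with input_x of shape (2, 3, 4) and begin_params_axis=1, gamma and beta must have shape input_x_shape[1:] = (3, 4); with begin_norm_axis=1, the statistics are computed over the trailing axes, so mean and rstd keep the first dimension and reduce the rest to 1.

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.arange(24).reshape(2, 3, 4), mindspore.float32)
>>> # gamma/beta shapes follow input_x_shape[begin_params_axis:] = (3, 4)
>>> gamma = Tensor(np.ones([3, 4]), mindspore.float32)
>>> beta = Tensor(np.zeros([3, 4]), mindspore.float32)
>>> layer_norm = ops.LayerNorm(begin_norm_axis=1, begin_params_axis=1)
>>> output, mean, rstd = layer_norm(x, gamma, beta)
>>> print(output.shape, mean.shape, rstd.shape)
(2, 3, 4) (2, 1, 1) (2, 1, 1)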
- Inputs:
input_x (Tensor) - Tensor of shape \((N, \ldots)\). The input of LayerNorm. Supported dtypes: float16, float32, float64.
gamma (Tensor) - Learnable parameter \(\gamma\) . Tensor of shape input_x_shape[begin_params_axis:]. Supported dtypes: float16, float32, float64.
beta (Tensor) - Learnable parameter \(\beta\) . Tensor of shape input_x_shape[begin_params_axis:]. Supported dtypes: float16, float32, float64.
- Outputs:
tuple[Tensor], tuple of 3 tensors: the normalized input, the mean, and the reciprocal of the standard deviation.
output_x (Tensor) - The normalized input, has the same type and shape as the input_x.
mean (Tensor) - The first begin_norm_axis dimensions of the mean's shape are the same as those of input_x, and the remaining dimensions are 1. Suppose the shape of input_x is \((x_1, x_2, \ldots, x_R)\); then the shape of mean is \((x_1, \ldots, x_{begin\_norm\_axis}, 1, \ldots, 1)\) (when begin_norm_axis=0, the shape of mean is \((1, \ldots, 1)\)).
rstd (Tensor) - The reciprocal of the standard deviation of the input. Its shape is the same as that of mean.
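For illustration, the three outputs are tied together by the formula above: after broadcasting mean and rstd over the normalized axes, output_x equals (input_x - mean) * rstd * gamma + beta. A small sketch of that check (variable names are illustrative only):

>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.array([[1., 2., 3.], [1., 2., 3.]]), mindspore.float32)
>>> gamma = Tensor(np.ones([3]), mindspore.float32)
>>> beta = Tensor(np.ones([3]), mindspore.float32)
>>> output, mean, rstd = ops.LayerNorm()(x, gamma, beta)
>>> recon = (x - mean) * rstd * gamma + beta  # mean/rstd broadcast over the last axis
>>> print(np.allclose(output.asnumpy(), recon.asnumpy(), atol=1e-5))
True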
- Raises
TypeError – If begin_norm_axis or begin_params_axis is not an int.
TypeError – If epsilon is not a float.
TypeError – If input_x, gamma or beta is not a Tensor.
- Supported Platforms:
Ascend
GPU
CPU
Examples
>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[1, 2, 3], [1, 2, 3]]), mindspore.float32)
>>> gamma = Tensor(np.ones([3]), mindspore.float32)
>>> beta = Tensor(np.ones([3]), mindspore.float32)
>>> layer_norm = ops.LayerNorm()
>>> output, _, _ = layer_norm(input_x, gamma, beta)
>>> print(output)
[[-0.2247448 1. 2.2247448]
 [-0.2247448 1. 2.2247448]]