mindspore.ops.LayerNorm
- class mindspore.ops.LayerNorm(begin_norm_axis=1, begin_params_axis=1, epsilon=1e-7)
Applies Layer Normalization to the input tensor.
This operator normalizes the input tensor over the given axis. LayerNorm is described in the paper Layer Normalization.

\[y = \frac{x - \mathrm{mean}}{\sqrt{\mathrm{variance} + \epsilon}} * \gamma + \beta\]

where \(\gamma\) is the scale, \(\beta\) is the bias, and \(\epsilon\) is epsilon.
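As a rough reference, the computation can be sketched in plain NumPy (an illustrative sketch only; layer_norm_sketch is not part of the MindSpore API, and normalization is shown over the trailing axis):

>>> import numpy as np
>>> def layer_norm_sketch(x, gamma, beta, eps=1e-7):
...     # Statistics over the normalized (here: trailing) axis.
...     mean = x.mean(axis=-1, keepdims=True)
...     variance = x.var(axis=-1, keepdims=True)
...     return (x - mean) / np.sqrt(variance + eps) * gamma + beta
>>> x = np.array([[1., 2., 3.], [1., 2., 3.]], dtype=np.float32)
>>> y = layer_norm_sketch(x, np.ones(3), np.ones(3))  # reproduces the Examples below
>>> y.shape
(2, 3)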
- Parameters
begin_norm_axis (int) – The first axis of input_x on which to apply LayerNorm; the value must be in [-1, rank(input_x)). Default: 1.
begin_params_axis (int) – The first axis on which the parameters (gamma, beta) apply; the value must be in [-1, rank(input_x)). Default: 1. Both axes are illustrated in the shape sketch after this list.
epsilon (float) – A value added to the denominator for numerical stability. Default: 1e-7.
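For example (a hedged shape illustration; the (2, 4, 8) input here is assumed for demonstration): with begin_norm_axis=1, statistics are computed over axes 1 and 2, and with begin_params_axis=1, gamma and beta must match the input shape from axis 1 onward.

>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> x = Tensor(np.ones([2, 4, 8]), mindspore.float32)
>>> gamma = Tensor(np.ones([4, 8]), mindspore.float32)
>>> beta = Tensor(np.zeros([4, 8]), mindspore.float32)
>>> layer_norm = ops.LayerNorm(begin_norm_axis=1, begin_params_axis=1)
>>> output, mean, variance = layer_norm(x, gamma, beta)
>>> print(output.shape)
(2, 4, 8)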
- Inputs:
input_x (Tensor) - Tensor of shape \((N, \ldots)\). The input of LayerNorm.
gamma (Tensor) - Tensor of shape \((P_0, \ldots, P_\text{begin_params_axis})\). The learnable parameter gamma as the scale on norm.
beta (Tensor) - Tensor of shape \((P_0, \ldots, P_\text{begin_params_axis})\). The learnable parameter beta as the offset on norm.
- Outputs:
tuple[Tensor], a tuple of three tensors: the normalized input and the computed statistics.
output_x (Tensor) - The normalized input, with the same type and shape as input_x.
mean (Tensor) - The mean of input_x. For input_x of shape \((x_1, x_2, \ldots, x_R)\), mean has shape \((x_1, \ldots, x_{\text{begin\_norm\_axis}}, 1, \ldots, 1)\): the normalized dimensions are reduced to 1, giving \((2, 1)\) in the Examples below.
variance (Tensor) - The variance of input_x, with the same shape as mean.
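The reduced statistic shapes can be checked with plain NumPy (illustrative only; this mirrors the (2, 3) input of the Examples below with begin_norm_axis=1):

>>> import numpy as np
>>> x = np.array([[1., 2., 3.], [1., 2., 3.]], dtype=np.float32)
>>> mean = x.mean(axis=1, keepdims=True)       # one mean per row
>>> variance = x.var(axis=1, keepdims=True)    # one variance per row
>>> mean.shape, variance.shape
((2, 1), (2, 1))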
- Supported Platforms:
Ascend GPU CPU
- Examples
>>> import mindspore
>>> import numpy as np
>>> from mindspore import Tensor, ops
>>> input_x = Tensor(np.array([[1, 2, 3], [1, 2, 3]]), mindspore.float32)
>>> gamma = Tensor(np.ones([3]), mindspore.float32)
>>> beta = Tensor(np.ones([3]), mindspore.float32)
>>> layer_norm = ops.LayerNorm()
>>> output, mean, variance = layer_norm(input_x, gamma, beta)
>>> print(output)
[[-0.2247448  1.         2.2247448]
 [-0.2247448  1.         2.2247448]]
>>> print(mean)
[[2.]
 [2.]]
>>> print(variance)
[[0.6666667]
 [0.6666667]]