mindspore.common.initializer
Initializer for cell parameters.
- class mindspore.common.initializer.Constant(value)[source]
Initialize a constant.
- Parameters
value (Union[int, numpy.ndarray]) – The value to initialize.
Examples
>>> import mindspore
>>> from mindspore.common.initializer import initializer
>>> tensor1 = initializer(0, [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer(5, [1, 2, 3], mindspore.float32)
- class mindspore.common.initializer.HeNormal(negative_slope=0, mode="fan_in", nonlinearity="leaky_relu")[source]
Initialize the array with the He (Kaiming) normal algorithm, drawing samples from the normal distribution \({N}(0, \text{sigma}^2)\), where
\[sigma = \frac{gain}{\sqrt{mode}}\]
where \(gain\) is an optional scaling factor and \(mode\) is the number of input units or output units in the weight tensor, depending on whether ‘fan_in’ or ‘fan_out’ is chosen.
For details of the HeNormal algorithm, please check https://arxiv.org/abs/1502.01852.
- Parameters
negative_slope (int, float, bool) – The negative slope of the rectifier used after this layer (only used when nonlinearity is ‘leaky_relu’). Default: 0.
mode (str) – Either ‘fan_in’ or ‘fan_out’. Choosing ‘fan_in’ preserves the magnitude of the variance of the weights in the forward pass; choosing ‘fan_out’ preserves the magnitudes in the backward pass. Default: ‘fan_in’.
nonlinearity (str) – The non-linear function, recommended for use only with ‘relu’ or ‘leaky_relu’. Default: ‘leaky_relu’.
Examples
>>> import mindspore
>>> from mindspore.common.initializer import initializer, HeNormal
>>> tensor1 = initializer(HeNormal(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('he_normal', [1, 2, 3], mindspore.float32)
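The sigma formula above can be illustrated with a minimal NumPy sketch (an illustration of the math only, not MindSpore's implementation; the gain formula for leaky_relu and the fan computation for 2D weights are assumptions):

```python
import numpy as np

def he_normal(shape, negative_slope=0.0, mode="fan_in", rng=None):
    # Gain for leaky_relu: sqrt(2 / (1 + negative_slope**2)); sqrt(2) when the slope is 0.
    gain = np.sqrt(2.0 / (1.0 + negative_slope ** 2))
    # For a 2D weight of shape (n_out, n_in): fan_in = n_in, fan_out = n_out.
    fan = shape[1] if mode == "fan_in" else shape[0]
    sigma = gain / np.sqrt(fan)
    rng = rng or np.random.default_rng(0)
    return rng.normal(0.0, sigma, size=shape)

w = he_normal((256, 128))  # sigma = sqrt(2 / 128), roughly 0.125
```

With enough samples, the empirical standard deviation of `w` approaches `gain / sqrt(fan_in)`.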
- class mindspore.common.initializer.HeUniform(negative_slope=0, mode="fan_in", nonlinearity="leaky_relu")[source]
Initialize the array with the He (Kaiming) uniform algorithm, drawing samples from the uniform distribution \({U}(-\text{boundary}, \text{boundary})\), where
\[boundary = \sqrt{\frac{6}{(1 + a^2) \times \text{fan\_in}}}\]
where \(-boundary\) is the lower bound and \(boundary\) is the upper bound of the HeUniform distribution, and \(a\) is the negative slope of the rectifier.
For details of the HeUniform algorithm, please check https://arxiv.org/abs/1502.01852.
- Parameters
negative_slope (int, float, bool) – The negative slope of the rectifier used after this layer (only used when nonlinearity is ‘leaky_relu’). Default: 0.
mode (str) – Either ‘fan_in’ or ‘fan_out’. Choosing ‘fan_in’ preserves the magnitude of the variance of the weights in the forward pass; choosing ‘fan_out’ preserves the magnitudes in the backward pass. Default: ‘fan_in’.
nonlinearity (str) – The non-linear function, recommended for use only with ‘relu’ or ‘leaky_relu’. Default: ‘leaky_relu’.
Examples
>>> import mindspore
>>> from mindspore.common.initializer import initializer, HeUniform
>>> tensor1 = initializer(HeUniform(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('he_uniform', [1, 2, 3], mindspore.float32)
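The boundary formula can likewise be sketched in NumPy (an illustration only, not MindSpore's implementation; the 2D fan_in convention is an assumption):

```python
import numpy as np

def he_uniform(shape, negative_slope=0.0, rng=None):
    # boundary = sqrt(6 / ((1 + a^2) * fan_in)); for a 2D weight (n_out, n_in), fan_in = n_in.
    fan_in = shape[1]
    boundary = np.sqrt(6.0 / ((1.0 + negative_slope ** 2) * fan_in))
    rng = rng or np.random.default_rng(0)
    return rng.uniform(-boundary, boundary, size=shape)

w = he_uniform((256, 128))
```

Note that the variance of U(-boundary, boundary) is boundary^2 / 3 = 2 / fan_in (for zero slope), matching the variance of the He normal initializer.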
- class mindspore.common.initializer.Initializer(**kwargs)[source]
The base class of initializers. It handles initialization of basic tensor attributes and model weight values.
- Parameters
kwargs (dict) – Keyword arguments for Initializer.
- class mindspore.common.initializer.Normal(sigma=0.01, mean=0.0)[source]
Initialize a normal array, drawing values from the normal distribution \({N}(\text{mean}, \text{sigma}^2)\) to fill the input tensor. The density is
\[f(x) = \frac{1}{\sqrt{2\pi}\,\text{sigma}} \exp\left(-\frac{(x - \text{mean})^2}{2\,\text{sigma}^2}\right)\]
- Parameters
sigma (float) – The standard deviation of the distribution. Default: 0.01.
mean (float) – The mean of the distribution. Default: 0.0.
Examples
>>> import mindspore
>>> from mindspore.common.initializer import initializer, Normal
>>> tensor1 = initializer(Normal(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('normal', [1, 2, 3], mindspore.float32)
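The density formula can be checked numerically with a small NumPy sketch (an illustration only, not MindSpore code):

```python
import numpy as np

def normal_pdf(x, mean=0.0, sigma=0.01):
    # f(x) = 1 / (sqrt(2*pi) * sigma) * exp(-(x - mean)**2 / (2 * sigma**2))
    return np.exp(-((x - mean) ** 2) / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

# Samples drawn with the default sigma=0.01, mean=0.0 have an empirical
# standard deviation close to 0.01.
samples = np.random.default_rng(0).normal(0.0, 0.01, size=100_000)
```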
- class mindspore.common.initializer.One(**kwargs)[source]
Fills the input array with ones.
- Parameters
arr (Array) – The array to be assigned.
Examples
>>> import mindspore
>>> from mindspore.common.initializer import initializer, One
>>> tensor1 = initializer(One(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('ones', [1, 2, 3], mindspore.float32)
- class mindspore.common.initializer.TruncatedNormal(sigma=0.01)[source]
Initialize a truncated normal distribution, which is a normal distribution whose values are bounded within \({N}(\text{low}, \text{high})\).
- Parameters
sigma (float) – The sigma of the array. Default: 0.01.
Examples
>>> import mindspore
>>> from mindspore.common.initializer import initializer, TruncatedNormal
>>> tensor1 = initializer(TruncatedNormal(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('truncatedNormal', [1, 2, 3], mindspore.float32)
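A common way to realize a truncated normal is to resample any draw that falls outside the bounds. The sketch below assumes bounds of two standard deviations (a typical convention, not confirmed by this document) and uses NumPy rather than MindSpore:

```python
import numpy as np

def truncated_normal(shape, sigma=0.01, rng=None):
    # Resample any draw outside [-2*sigma, 2*sigma] until every value lies inside.
    rng = rng or np.random.default_rng(0)
    out = rng.normal(0.0, sigma, size=shape)
    mask = np.abs(out) > 2 * sigma
    while mask.any():
        out[mask] = rng.normal(0.0, sigma, size=int(mask.sum()))
        mask = np.abs(out) > 2 * sigma
    return out

t = truncated_normal((1, 2, 3))
```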
- class mindspore.common.initializer.Uniform(scale=0.07)[source]
Initialize a uniform array, and obtain values \({U}(-\text{scale}, \text{scale})\) from the uniform distribution to fill the input tensor.
- Parameters
scale (float) – The scale of the array. Default: 0.07.
Examples
>>> import mindspore
>>> from mindspore.common.initializer import initializer, Uniform
>>> tensor1 = initializer(Uniform(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('uniform', [1, 2, 3], mindspore.float32)
- class mindspore.common.initializer.XavierUniform(gain=1)[source]
Initialize the array with the Xavier uniform algorithm, drawing samples from the uniform distribution \({U}(-\text{boundary}, \text{boundary})\), where
\[boundary = gain * \sqrt{\frac{6}{n_{in} + n_{out}}}\]
where \(gain\) is an optional scaling factor, \(n_{in}\) is the number of input units in the weight tensor, and \(n_{out}\) is the number of output units in the weight tensor.
For details of the XavierUniform algorithm, please check http://proceedings.mlr.press/v9/glorot10a.html.
- Parameters
gain (float) – An optional scaling factor. Default: 1.
Examples
>>> import mindspore
>>> from mindspore.common.initializer import initializer, XavierUniform
>>> tensor1 = initializer(XavierUniform(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('xavier_uniform', [1, 2, 3], mindspore.float32)
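The Xavier boundary formula can be sketched in NumPy as well (an illustration only, not MindSpore's implementation; the 2D weight layout is an assumption):

```python
import numpy as np

def xavier_uniform(shape, gain=1.0, rng=None):
    # boundary = gain * sqrt(6 / (n_in + n_out)); for a 2D weight of shape (n_out, n_in).
    n_out, n_in = shape
    boundary = gain * np.sqrt(6.0 / (n_in + n_out))
    rng = rng or np.random.default_rng(0)
    return rng.uniform(-boundary, boundary, size=shape)

w = xavier_uniform((256, 128))
```

Unlike the He initializers, the boundary here depends on both fan_in and fan_out, which keeps variance balanced in both the forward and backward passes.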
- class mindspore.common.initializer.Zero(**kwargs)[source]
Fills the input array with zeros.
- Parameters
arr (Array) – The array to be assigned.
Examples
>>> import mindspore
>>> from mindspore.common.initializer import initializer, Zero
>>> tensor1 = initializer(Zero(), [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer('zeros', [1, 2, 3], mindspore.float32)
- mindspore.common.initializer.initializer(init, shape=None, dtype=mstype.float32)[source]
Create and initialize a tensor.
- Parameters
init (Union[Tensor, str, Initializer, numbers.Number]) – The initialization value.
str: init should be the alias of a class inheriting from Initializer, and the corresponding class will be called. The value of init can be ‘normal’, ‘ones’, ‘zeros’, etc.
Initializer: init should be a class inheriting from Initializer, used to initialize the tensor.
numbers.Number: Constant will be called to initialize the tensor.
shape (Union[tuple, list, int]) – A list of integers, a tuple of integers or an integer as the shape of output. Default: None.
dtype (mindspore.dtype) – The type of data in the initialized tensor. Default: mindspore.float32.
- Returns
Tensor, the initialized tensor with the given shape and dtype.
Examples
>>> import mindspore
>>> from mindspore.common.initializer import initializer, One
>>> tensor1 = initializer('ones', [1, 2, 3], mindspore.float32)
>>> tensor2 = initializer(One(), [1, 2, 3], mindspore.float32)
>>> tensor3 = initializer(0, [1, 2, 3], mindspore.float32)
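The three dispatch cases for init can be mimicked in a small NumPy sketch (the names `_ALIASES` and `make_array` are hypothetical, for illustration only; MindSpore's actual dispatch is richer):

```python
import numbers
import numpy as np

# Hypothetical alias table mapping initializer names to fill functions.
_ALIASES = {
    "ones": np.ones,
    "zeros": np.zeros,
}

def make_array(init, shape):
    if isinstance(init, numbers.Number):
        # A plain number behaves like Constant: fill every element with it.
        return np.full(shape, init, dtype=np.float32)
    if isinstance(init, str):
        # A string is looked up as the alias of an initializer.
        return _ALIASES[init](shape).astype(np.float32)
    raise TypeError("unsupported init type")
```

For example, `make_array(0, (1, 2, 3))` fills a (1, 2, 3) array with zeros, mirroring `initializer(0, [1, 2, 3], mindspore.float32)` above.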