mindspore.mint

mindspore.mint provides a large number of functional, nn, and optimizer interfaces. Their API usage and behavior are consistent with mainstream industry practice, making them easy to pick up. The mint interfaces are currently experimental and perform better than ops in graph mode at the O0 level and in PyNative mode. Graph sinking mode and the CPU/GPU backends are not yet supported; support will be added gradually.

The module import method is as follows:

from mindspore import mint

For the additions, deletions, and supported-platform changes of mindspore.mint operators compared with the previous version, please refer to mindspore.mint API Interface Change.

Tensor

Creation Operations

API Name

Description

Supported Platforms

Warning

mindspore.mint.arange

Creates a sequence of numbers that begins at start and extends by increments of step up to but not including end.

Ascend

None

mindspore.mint.eye

Creates a tensor with ones on the diagonal and zeros in the rest.

Ascend

None

mindspore.mint.full

Creates a Tensor of the specified shape and fills it with the specified value.

Ascend

None

mindspore.mint.linspace

Returns a Tensor of steps values evenly spaced over the interval from start to end (both endpoints included); the length of the output Tensor is steps.

Ascend

Atlas training series does not support int16 dtype currently.

mindspore.mint.ones

Creates a tensor filled with value ones.

Ascend

None

mindspore.mint.ones_like

Creates a tensor filled with 1, with the same shape as input, and its data type is determined by the given dtype.

Ascend

None

mindspore.mint.zeros

Creates a tensor of the shape described by size, filled with the value 0, with data type dtype.

Ascend

None

mindspore.mint.zeros_like

Creates a tensor filled with 0, with the same size as input.

Ascend

None
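
The following minimal sketch illustrates how a few of the creation interfaces above might be combined. It assumes an Ascend environment where mint is available; the argument patterns follow the PyTorch-style signatures these interfaces mirror, so exact keyword names may vary between versions.

import mindspore as ms
from mindspore import mint

x = mint.arange(0, 10, 2)                 # sequence 0, 2, 4, 6, 8
e = mint.eye(3, dtype=ms.float32)         # 3x3 identity matrix
f = mint.full((2, 3), 7.0)                # 2x3 tensor filled with 7.0
pts = mint.linspace(0.0, 1.0, 5)          # 5 evenly spaced values from 0 to 1
o = mint.ones_like(f)                     # tensor of ones with the shape of f
z = mint.zeros((2, 3), dtype=ms.float32)  # 2x3 tensor of zeros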

Indexing, Slicing, Joining, Mutating Operations

API Name

Description

Supported Platforms

Warning

mindspore.mint.cat

Concatenates the input tensors along the given dimension.

Ascend

None

mindspore.mint.gather

Gather data from a tensor by indices.

Ascend

On Ascend, the behavior is unpredictable in the following cases: the value of index is not in the range [-input.shape[dim], input.shape[dim]) in forward; the value of index is not in the range [0, input.shape[dim]) in backward.

mindspore.mint.index_select

Generates a new Tensor that accesses the values of input along the specified dim dimension using the indices specified in index.

Ascend

None

mindspore.mint.masked_select

Returns a new 1-D Tensor which indexes the input tensor according to the boolean mask.

Ascend

None

mindspore.mint.permute

Permutes the dimensions of the input tensor according to the given dims.

Ascend

None

mindspore.mint.scatter

Updates input with the values from src at the positions given by index.

Ascend

None

mindspore.mint.scatter_add

Adds all elements in src to input at the indices specified by index, along the dimension specified by dim.

Ascend

None

mindspore.mint.split

Splits the Tensor into chunks along the given dim.

Ascend

None

mindspore.mint.narrow

Returns a narrowed tensor from the input tensor, keeping the elements from start to start + length along the given dimension.

Ascend

None

mindspore.mint.nonzero

Return the positions of all non-zero values.

Ascend

None

mindspore.mint.tile

Creates a new tensor by replicating input dims times.

Ascend

None

mindspore.mint.tril

Returns the lower triangle part of input (elements that contain the diagonal and below), and sets the other elements to zero.

Ascend

None

mindspore.mint.stack

Stacks a list of tensors along the specified dim.

Ascend

None

mindspore.mint.where

Selects elements from input or other based on condition and returns a tensor.

Ascend

None
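
A brief, illustrative sketch of the joining and indexing interfaces above, again assuming an Ascend environment; the tensor values are arbitrary examples.

import mindspore as ms
from mindspore import mint, Tensor

a = Tensor([[1, 2], [3, 4]], ms.float32)
b = Tensor([[5, 6], [7, 8]], ms.float32)

c = mint.cat((a, b), dim=0)                           # concatenate along dim 0 -> shape (4, 2)
s = mint.stack((a, b), dim=0)                         # stack into a new leading dim -> shape (2, 2, 2)
row = mint.index_select(a, 0, Tensor([1], ms.int64))  # select row 1 of a
t = mint.permute(a, (1, 0))                           # swap the two dimensions
w = mint.where(a > 2, a, b)                           # element-wise selection by condition
pos = mint.nonzero(a > 2)                             # positions where the condition holds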

Random Sampling

API Name

Description

Supported Platforms

Warning

mindspore.mint.multinomial

Returns a tensor sampled from the multinomial probability distribution located in the corresponding row of the input tensor.

Ascend

This is an experimental API that is subject to change or deletion.

mindspore.mint.normal

Generates random numbers according to the standard Normal (or Gaussian) random number distribution.

Ascend

None

mindspore.mint.rand_like

Returns a new tensor filled with numbers drawn from the uniform distribution on the interval \([0, 1)\), with the shape of the input tensor and the given dtype.

Ascend

None

mindspore.mint.rand

Returns a new tensor filled with numbers drawn from the uniform distribution on the interval \([0, 1)\), with the given shape and dtype.

Ascend

None
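
A small sketch of the sampling interfaces above; shapes and probabilities are illustrative only, and the keyword names follow the PyTorch-style signatures that mint mirrors.

import mindspore as ms
from mindspore import mint, Tensor

u = mint.rand(2, 3)                                   # uniform samples in [0, 1), shape (2, 3)
v = mint.rand_like(u)                                 # uniform samples with the same shape as u
probs = Tensor([[0.2, 0.3, 0.5]], ms.float32)
draws = mint.multinomial(probs, 4, replacement=True)  # 4 category indices drawn per row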

Math Operations

Pointwise Operations

API Name

Description

Supported Platforms

Warning

mindspore.mint.abs

Returns absolute value of a tensor element-wise.

Ascend

None

mindspore.mint.add

Adds scaled other value to input Tensor.

Ascend

None

mindspore.mint.acos

Computes arccosine of input tensors element-wise.

Ascend

None

mindspore.mint.acosh

Computes inverse hyperbolic cosine of the inputs element-wise.

Ascend

None

mindspore.mint.arccos

Alias for mindspore.mint.acos().

Ascend

None

mindspore.mint.arccosh

Alias for mindspore.mint.acosh().

Ascend

None

mindspore.mint.arcsin

Alias for mindspore.mint.asin().

Ascend

None

mindspore.mint.arcsinh

Alias for mindspore.mint.asinh().

Ascend

None

mindspore.mint.arctan

Alias for mindspore.mint.atan().

Ascend

None

mindspore.mint.arctan2

Alias for mindspore.mint.atan2().

Ascend

None

mindspore.mint.arctanh

Alias for mindspore.mint.atanh().

Ascend

None

mindspore.mint.asin

Computes arcsine of input tensors element-wise.

Ascend

None

mindspore.mint.asinh

Computes inverse hyperbolic sine of the input element-wise.

Ascend

None

mindspore.mint.atan

Computes the trigonometric inverse tangent of the input element-wise.

Ascend

None

mindspore.mint.atan2

Returns arctangent of input/other element-wise.

Ascend

None

mindspore.mint.atanh

Computes inverse hyperbolic tangent of the input element-wise.

Ascend

None

mindspore.mint.bitwise_and

Returns bitwise and of two tensors element-wise.

Ascend

None

mindspore.mint.bitwise_or

Returns bitwise or of two tensors element-wise.

Ascend

None

mindspore.mint.bitwise_xor

Returns bitwise xor of two tensors element-wise.

Ascend

None

mindspore.mint.ceil

Rounds a tensor up to the closest integer element-wise.

Ascend

None

mindspore.mint.clamp

Clamps all elements of the input Tensor into the range [min, max].

Ascend

None

mindspore.mint.cos

Computes cosine of input element-wise.

Ascend

Using float64 may cause a loss of precision.

mindspore.mint.cosh

Computes hyperbolic cosine of input element-wise.

Ascend

None

mindspore.mint.cross

Computes the cross product of input and other in dimension dim.

Ascend

None

mindspore.mint.div

Divides the first input tensor by the second input tensor in floating-point type element-wise.

Ascend

None

mindspore.mint.divide

Alias for mindspore.mint.div().

Ascend

None

mindspore.mint.erf

Computes the Gauss error function of input element-wise.

Ascend

None

mindspore.mint.erfc

Computes the complementary error function of input element-wise.

Ascend

None

mindspore.mint.erfinv

Computes the inverse error function of input element-wise.

Ascend

None

mindspore.mint.exp

Returns exponential of a tensor element-wise.

Ascend

None

mindspore.mint.expm1

Returns exponential then minus 1 of a tensor element-wise.

Ascend

None

mindspore.mint.fix

Alias for mindspore.mint.trunc().

Ascend

None

mindspore.mint.floor

Rounds a tensor down to the closest integer element-wise.

Ascend

None

mindspore.mint.log

Returns the natural logarithm of a tensor element-wise.

Ascend

If the input value of operator Log is within the range (0, 0.01] or [0.95, 1.05], the output accuracy may be affected.

mindspore.mint.log1p

Returns the natural logarithm of one plus the input tensor element-wise.

Ascend

None

mindspore.mint.logical_and

Computes the "logical AND" of two tensors element-wise.

Ascend

None

mindspore.mint.logical_not

Computes the "logical NOT" of a tensor element-wise.

Ascend

None

mindspore.mint.logical_or

Computes the "logical OR" of two tensors element-wise.

Ascend

None

mindspore.mint.logical_xor

Computes the "logical XOR" of two tensors element-wise.

Ascend

None

mindspore.mint.mul

Multiplies two tensors element-wise.

Ascend

None

mindspore.mint.nan_to_num

Replace the NaN, positive infinity and negative infinity values in input with the specified values in nan, posinf and neginf respectively.

Ascend

For Ascend, it is only supported on Atlas A2 Training Series Products. This is an experimental API that is subject to change or deletion.

mindspore.mint.neg

Returns a tensor with negative values of the input tensor element-wise.

Ascend

None

mindspore.mint.negative

Alias for mindspore.mint.neg().

Ascend

None

mindspore.mint.pow

Raises each element of input to the power of exponent.

Ascend

This is an experimental API that is subject to change or deletion.

mindspore.mint.reciprocal

Returns reciprocal of a tensor element-wise.

Ascend

None

mindspore.mint.remainder

Computes the remainder of input divided by other element-wise.

Ascend

None

mindspore.mint.roll

Rolls the elements of a tensor along an axis.

Ascend

None

mindspore.mint.round

Rounds the elements of a tensor to the nearest integer element-wise, with ties rounded to the nearest even value (round half to even).

Ascend

None

mindspore.mint.rsqrt

Computes reciprocal of square root of input tensor element-wise.

Ascend

None

mindspore.mint.sigmoid

Computes Sigmoid of input element-wise.

Ascend

None

mindspore.mint.sign

Returns an element-wise indication of the sign of a number.

Ascend

None

mindspore.mint.sin

Computes sine of the input element-wise.

Ascend

None

mindspore.mint.sinc

Computes the normalized sinc of input.

Ascend

None

mindspore.mint.sinh

Computes hyperbolic sine of the input element-wise.

Ascend

None

mindspore.mint.sqrt

Returns sqrt of a tensor element-wise.

Ascend

None

mindspore.mint.square

Returns square of a tensor element-wise.

Ascend

None

mindspore.mint.sub

Subtracts scaled other value from input Tensor.

Ascend

None

mindspore.mint.tan

Computes tangent of input element-wise.

Ascend

None

mindspore.mint.tanh

Computes hyperbolic tangent of input element-wise.

Ascend

None

mindspore.mint.trunc

Returns a new tensor with the truncated integer values of the elements of the input tensor.

Ascend

None

mindspore.mint.xlogy

Computes the first input multiplied by the logarithm of second input element-wise.

Ascend

None
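
To make the pointwise style concrete, here is a minimal sketch using a few of the element-wise interfaces above (Ascend assumed; the values are illustrative).

import mindspore as ms
from mindspore import mint, Tensor

x = Tensor([-1.5, 0.0, 2.5], ms.float32)
y = Tensor([2.0, 3.0, 4.0], ms.float32)

a = mint.abs(x)               # element-wise absolute value
s = mint.add(x, y)            # element-wise addition
c = mint.clamp(x, -1.0, 1.0)  # clamp values into [-1, 1]
e = mint.exp(x)               # element-wise exponential
p = mint.pow(y, 2)            # element-wise square
g = mint.sigmoid(x)           # element-wise sigmoid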

Reduction Operations

API Name

Description

Supported Platforms

Warning

mindspore.mint.argmax

Return the indices of the maximum values of a tensor across a dimension.

Ascend

None

mindspore.mint.argmin

Return the indices of the minimum values of a tensor across a dimension.

Ascend

None

mindspore.mint.all

Reduces a dimension of input by the "logical AND" of all elements in the dimension, by default.

Ascend

None

mindspore.mint.any

Reduces a dimension of input by the "logical OR" of all elements in the dimension, by default.

Ascend

None

mindspore.mint.max

Calculates the maximum value along with the given dimension for the input tensor.

Ascend

None

mindspore.mint.mean

Reduces all dimensions of a tensor by averaging all elements, by default.

Ascend

None

mindspore.mint.median

Outputs the median along the specified dimension dim together with its corresponding index.

Ascend

None

mindspore.mint.min

Calculates the minimum value along with the given dimension for the input tensor.

Ascend

None

mindspore.mint.prod

Reduces a dimension of a tensor by multiplying all elements in the dimension, by default.

Ascend

None

mindspore.mint.sum

Calculates the sum of Tensor elements over a given dim.

Ascend

None

mindspore.mint.unique

Returns the unique elements of input tensor.

Ascend

None
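
A short sketch of the reduction interfaces above. Note that the (values, indices) pair returned by max when a dim argument is given follows the PyTorch-style convention these interfaces mirror; treat that as an illustrative assumption rather than the exact signature.

import mindspore as ms
from mindspore import mint, Tensor

x = Tensor([[1.0, 5.0, 3.0],
            [4.0, 2.0, 6.0]], ms.float32)

total = mint.sum(x)                                    # sum over all elements
col_mean = mint.mean(x, dim=0)                         # mean over dim 0
vals, idx = mint.max(x, dim=1)                         # per-row maximum values and their indices
arg = mint.argmax(x, dim=1)                            # index of the per-row maximum
uniq = mint.unique(Tensor([1, 1, 2, 3, 3], ms.int32))  # unique elements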

Comparison Operations

API Name

Description

Supported Platforms

Warning

mindspore.mint.eq

Computes the equivalence between two tensors element-wise.

Ascend

None

mindspore.mint.greater

Compares the input parameters element-wise as \(input > other\); the result is a bool Tensor.

Ascend

None

mindspore.mint.greater_equal

Given two Tensors, compares them element-wise to check if each element in the first Tensor is greater than or equal to the corresponding element in the second Tensor.

Ascend

None

mindspore.mint.gt

Compares the input parameters element-wise as \(input > other\); the result is a bool Tensor.

Ascend

None

mindspore.mint.isclose

Returns a new Tensor with boolean elements representing if each element of input is “close” to the corresponding element of other.

Ascend

None

mindspore.mint.isfinite

Determine which elements are finite for each position.

Ascend

None

mindspore.mint.le

Computes the boolean value of \(input <= other\) element-wise.

Ascend

None

mindspore.mint.less

Computes the boolean value of \(input < other\) element-wise.

Ascend

None

mindspore.mint.less_equal

Computes the boolean value of \(input <= other\) element-wise.

Ascend

None

mindspore.mint.lt

Alias for mindspore.mint.less().

Ascend

None

mindspore.mint.maximum

Computes the maximum of input tensors element-wise.

Ascend

If all inputs are integer scalars, the output will be a Tensor of int32 in GRAPH mode and a Tensor of int64 in PYNATIVE mode.

mindspore.mint.minimum

Computes the minimum of input tensors element-wise.

Ascend

None

mindspore.mint.ne

Computes the non-equivalence of two tensors element-wise.

Ascend

None

mindspore.mint.topk

Finds values and indices of the k largest or smallest entries along a given dimension.

Ascend

If sorted is set to False, the order of the returned results may vary across platforms because of differences in memory layout and traversal methods.

mindspore.mint.sort

Sorts the elements of the input tensor along the given dimension in the specified order.

Ascend

Currently, the data types float16, uint8, int8, int16, int32, and int64 are well supported. Using float32 may cause a loss of accuracy.
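
A compact sketch of the comparison interfaces above (Ascend assumed; topk and sort are shown returning (values, indices) pairs, following the PyTorch-style convention these interfaces mirror).

import mindspore as ms
from mindspore import mint, Tensor

x = Tensor([1.0, 2.0, 3.0], ms.float32)
y = Tensor([1.0, 2.5, 2.0], ms.float32)

eq = mint.eq(x, y)                    # element-wise equality -> bool tensor
close = mint.isclose(x, y, rtol=0.3)  # element-wise closeness within tolerance
le = mint.le(x, y)                    # element-wise x <= y
vals, idx = mint.topk(x, 2)           # the 2 largest values and their indices
sorted_y, order = mint.sort(y)        # ascending sort along the last dim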

BLAS and LAPACK Operations

API Name

Description

Supported Platforms

Warning

mindspore.mint.baddbmm

The result is the sum of the input and a batch matrix-matrix product of matrices in batch1 and batch2.

Ascend

None

mindspore.mint.bmm

Performs batch matrix-matrix multiplication of two three-dimensional tensors.

Ascend

None

mindspore.mint.inverse

Compute the inverse of the input matrix.

Ascend

None

mindspore.mint.matmul

Returns the matrix product of two tensors.

Ascend

None

mindspore.mint.trace

Returns a new tensor containing the sum of the elements on the main diagonal of the input (the trace).

Ascend

None
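
The following sketch exercises the matrix interfaces above on small random matrices; it assumes an Ascend environment and that the matrix passed to inverse happens to be non-singular.

from mindspore import mint

a = mint.rand(2, 3, 4)               # batch of two 3x4 matrices
b = mint.rand(2, 4, 5)               # batch of two 4x5 matrices
c = mint.bmm(a, b)                   # batch matrix product, shape (2, 3, 5)

m = mint.rand(3, 3)
mm = mint.matmul(m, m)               # ordinary matrix product
inv = mint.inverse(m + mint.eye(3))  # inverse of a diagonally boosted matrix
tr = mint.trace(m)                   # sum of the main diagonal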

Other Operations

API Name

Description

Supported Platforms

Warning

mindspore.mint.broadcast_to

Broadcasts input tensor to a given shape.

Ascend

None

mindspore.mint.cummax

Returns a tuple (values, indices) where values is the cumulative maximum value of input Tensor input along the dimension dim, and indices is the index location of each maximum value.

Ascend

None

mindspore.mint.cummin

Returns a tuple (values, indices) where values is the cumulative minimum value of input Tensor input along the dimension dim, and indices is the index location of each minimum value.

Ascend

None

mindspore.mint.cumsum

Computes the cumulative sum of input Tensor along dim.

Ascend

None

mindspore.mint.flatten

Flatten a tensor along dimensions from start_dim to end_dim.

Ascend

None

mindspore.mint.flip

Reverses the order of elements in a tensor along the given axis.

Ascend

None

mindspore.mint.repeat_interleave

Repeat elements of a tensor along an axis, like numpy.repeat.

Ascend

Only supported on Atlas A2 training series.

mindspore.mint.searchsorted

Returns the position indices such that, after inserting values into sorted_sequence, the order of the innermost dimension of sorted_sequence remains unchanged.

Ascend

None

mindspore.mint.tril

Returns the lower triangle part of input (elements that contain the diagonal and below), and set the other elements to zeros.

Ascend

None
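
A brief sketch of some of the interfaces above (Ascend assumed; note the repeat_interleave warning about Atlas A2 support).

import mindspore as ms
from mindspore import mint, Tensor

x = Tensor([[1.0, 2.0, 3.0],
            [4.0, 5.0, 6.0]], ms.float32)

b = mint.broadcast_to(Tensor([1.0, 2.0, 3.0], ms.float32), (2, 3))  # broadcast to shape (2, 3)
cs = mint.cumsum(x, dim=1)                 # cumulative sum along dim 1
flat = mint.flatten(x)                     # flatten to a 1-D tensor
rev = mint.flip(x, dims=(1,))              # reverse along dim 1
rep = mint.repeat_interleave(x, 2, dim=0)  # repeat each row twice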

mindspore.mint.nn

Loss Functions

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.L1Loss

L1Loss is used to calculate the mean absolute error between the predicted value and the target value.

Ascend

None

Convolution Layers

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.Fold

Combines an array of sliding local blocks into a large containing tensor.

Ascend

None

mindspore.mint.nn.Unfold

Extracts sliding local blocks from a batched input tensor.

Ascend

None

Normalization Layers

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.GroupNorm

Group Normalization over a mini-batch of inputs.

Ascend

None

Non-linear Activations (weighted sum, nonlinearity)

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.GELU

Activation function GELU (Gaussian Error Linear Unit).

Ascend

None

mindspore.mint.nn.Hardshrink

Applies Hard Shrink activation function element-wise.

Ascend

None

mindspore.mint.nn.Hardsigmoid

Applies Hard Sigmoid activation function element-wise.

Ascend

None

mindspore.mint.nn.Hardswish

Applies Hard Swish activation function element-wise.

Ascend

None

mindspore.mint.nn.LogSoftmax

Applies the Log Softmax function to the input tensor on the specified axis.

Ascend

None

mindspore.mint.nn.Mish

Computes MISH (A Self Regularized Non-Monotonic Neural Activation Function) of input tensors element-wise.

Ascend

None

mindspore.mint.nn.PReLU

Applies PReLU activation function element-wise.

Ascend

None

mindspore.mint.nn.ReLU

Applies ReLU (Rectified Linear Unit activation function) element-wise.

Ascend

None

mindspore.mint.nn.SELU

Activation function SELU (Scaled exponential Linear Unit).

Ascend

None

mindspore.mint.nn.Softmax

Applies the Softmax function to an n-dimensional input Tensor.

Ascend

None

mindspore.mint.nn.Softshrink

Applies the SoftShrink function element-wise.

Ascend

None

Linear Layers

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.Linear

The linear connected layer.

Ascend

In PYNATIVE mode, if bias is False, the input x cannot have more than 6 dimensions.

Dropout Layers

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.Dropout

Dropout layer for the input.

Ascend

None

Pooling Layers

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.AvgPool2d

Applies a 2D average pooling over an input Tensor which can be regarded as a composition of 2D input planes.

Ascend

None

Loss Functions

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.BCEWithLogitsLoss

Adds sigmoid activation function to input as logits, and uses this logits to compute binary cross entropy between the logits and the target.

Ascend

None

mindspore.mint.nn.MSELoss

Calculates the mean squared error between the predicted value and the label value.

Ascend

None
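
The layers above are used like any other MindSpore cells. A minimal, illustrative sketch (input sizes and hyperparameters are arbitrary):

from mindspore import mint

layer = mint.nn.Linear(4, 2)   # fully connected layer: 4 inputs -> 2 outputs
act = mint.nn.ReLU()
drop = mint.nn.Dropout(p=0.1)  # dropout layer (active in training mode)
loss_fn = mint.nn.MSELoss()

x = mint.rand(8, 4)            # a batch of 8 samples
target = mint.rand(8, 2)
out = drop(act(layer(x)))
loss = loss_fn(out, target)    # scalar mean squared error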

mindspore.mint.nn.functional

Convolution functions

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.functional.fold

Combines an array of sliding local blocks into a large containing tensor.

Ascend

Currently, only unbatched (3D) or batched (4D) image-like output tensors are supported.

mindspore.mint.nn.functional.unfold

Extracts sliding local blocks from a batched input tensor.

Ascend

Currently, batched (4D) image-like tensors are supported. For Ascend, it is only supported on Atlas A2 and later platforms.

Pooling functions

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.functional.avg_pool2d

Applies a 2D average pooling over an input Tensor which can be regarded as a composition of 2D input planes.

Ascend

None

mindspore.mint.nn.functional.max_pool2d

Performs a 2D max pooling on the input Tensor.

Ascend

Only supported on Atlas A2 training series.

Non-linear activation functions

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.functional.batch_norm

Batch Normalization for input data and updated parameters.

Ascend

None

mindspore.mint.nn.functional.elu

Exponential Linear Unit activation function.

Ascend

None

mindspore.mint.nn.functional.gelu

Gaussian Error Linear Units activation function.

Ascend

None

mindspore.mint.nn.functional.group_norm

Group Normalization over a mini-batch of inputs.

Ascend

None

mindspore.mint.nn.functional.hardshrink

Hard Shrink activation function.

Ascend

None

mindspore.mint.nn.functional.hardsigmoid

Hard Sigmoid activation function.

Ascend

None

mindspore.mint.nn.functional.hardswish

Hard Swish activation function.

Ascend

None

mindspore.mint.nn.functional.layer_norm

Applies the Layer Normalization on the mini-batch input.

Ascend

None

mindspore.mint.nn.functional.leaky_relu

leaky_relu activation function.

Ascend

None

mindspore.mint.nn.functional.log_softmax

Applies the Log Softmax function to the input tensor on the specified axis.

Ascend

None

mindspore.mint.nn.functional.mish

Computes MISH (A Self Regularized Non-Monotonic Neural Activation Function) of input tensors element-wise.

Ascend

None

mindspore.mint.nn.functional.prelu

Parametric Rectified Linear Unit activation function.

Ascend

None

mindspore.mint.nn.functional.relu

Computes ReLU (Rectified Linear Unit activation function) of input tensors element-wise.

Ascend

None

mindspore.mint.nn.functional.selu

Activation function SELU (Scaled exponential Linear Unit).

Ascend

None

mindspore.mint.nn.functional.sigmoid

Computes Sigmoid of input element-wise.

Ascend

None

mindspore.mint.nn.functional.silu

Computes Sigmoid Linear Unit of input element-wise.

Ascend

None

mindspore.mint.nn.functional.softmax

Applies the Softmax operation to the input tensor on the specified axis.

Ascend

None

mindspore.mint.nn.functional.softplus

Applies softplus function to input element-wise.

Ascend

None

mindspore.mint.nn.functional.softshrink

Soft Shrink activation function.

Ascend

None

mindspore.mint.nn.functional.tanh

Computes hyperbolic tangent of input element-wise.

Ascend

None

Linear functions

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.functional.linear

Applies the dense connected operation to the input.

Ascend

This is an experimental API that is subject to change or deletion. In PYNATIVE mode, if bias is not 1D, the input cannot have more than 6 dimensions.

Dropout functions

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.functional.dropout

During training, randomly zeroes some of the elements of the input tensor with probability p from a Bernoulli distribution.

Ascend

None

Sparse functions

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.functional.embedding

Retrieve the word embeddings in weight using indices specified in input.

Ascend

On Ascend, the behavior is unpredictable when the value of input is invalid.

mindspore.mint.nn.functional.one_hot

Computes a one-hot tensor.

Ascend

None

Loss Functions

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.functional.binary_cross_entropy

Computes the binary cross entropy (a measure of the difference between two probability distributions) between the predicted value input and the target value target.

Ascend

The value of input must range from 0 to 1.

mindspore.mint.nn.functional.binary_cross_entropy_with_logits

Adds sigmoid activation function to input as logits, and uses this logits to compute binary cross entropy between the logits and the target.

Ascend

None

mindspore.mint.nn.functional.l1_loss

Calculate the mean absolute error between the input value and the target value.

Ascend

None

mindspore.mint.nn.functional.mse_loss

Calculates the mean squared error between the predicted value and the label value.

Ascend

None

Vision functions

API Name

Description

Supported Platforms

Warning

mindspore.mint.nn.functional.grid_sample

Given an input and a flow-field grid, computes the output using input values and pixel locations from grid.

Ascend

None

mindspore.mint.nn.functional.pad

Pads the input tensor according to the pad.

Ascend

circular mode has poor performance and is not recommended.
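
A short sketch of the functional interfaces above (Ascend assumed; the dim and num_classes argument conventions follow the PyTorch-style signatures these interfaces mirror).

from mindspore import mint

x = mint.rand(2, 8)
h = mint.nn.functional.relu(x)                             # element-wise ReLU
p = mint.nn.functional.softmax(h, dim=-1)                  # probabilities along the last dim
d = mint.nn.functional.dropout(h, p=0.2)                   # randomly zero about 20% of elements
onehot = mint.nn.functional.one_hot(mint.arange(0, 4), 4)  # 4x4 one-hot matrix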

mindspore.mint.optim

API Name

Description

Supported Platforms

Warning

mindspore.mint.optim.AdamW

Implements Adam Weight Decay algorithm.

Ascend

This is an experimental optimizer API that is subject to change. This module must be used together with the lr scheduler module in the LRScheduler class. For Ascend, it is only supported on Atlas A2 and later platforms.
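
A minimal training-step sketch with AdamW, following the functional value_and_grad pattern commonly used in MindSpore; the network, data, and hyperparameters are illustrative, and in practice the optimizer is typically paired with an LRScheduler as the warning above notes.

import mindspore as ms
from mindspore import mint

net = mint.nn.Linear(4, 1)
optimizer = mint.optim.AdamW(net.trainable_params(), lr=1e-3, weight_decay=1e-2)

def forward_fn(x, y):
    # forward pass and loss computation
    return mint.nn.functional.mse_loss(net(x), y)

grad_fn = ms.value_and_grad(forward_fn, None, optimizer.parameters)

x, y = mint.rand(8, 4), mint.rand(8, 1)
loss, grads = grad_fn(x, y)  # loss value and gradients w.r.t. the optimizer parameters
optimizer(grads)             # apply one optimization step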

mindspore.mint.linalg

Inverses

API Name

Description

Supported Platforms

Warning

mindspore.mint.linalg.inv

Compute the inverse of the input matrix.

Ascend

None
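
For example, a small (assumed non-singular) matrix can be inverted as follows:

import mindspore as ms
from mindspore import mint, Tensor

a = Tensor([[2.0, 0.0],
            [0.0, 4.0]], ms.float32)
a_inv = mint.linalg.inv(a)  # matrix inverse; matmul(a, a_inv) is approximately the identity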

mindspore.mint.special

Pointwise Operations

API Name

Description

Supported Platforms

Warning

mindspore.mint.special.erfc

Computes the complementary error function of input element-wise.

Ascend

None

mindspore.mint.special.expm1

Returns exponential then minus 1 of a tensor element-wise.

Ascend

None

mindspore.mint.special.log1p

Returns the natural logarithm of one plus the input tensor element-wise.

Ascend

None

mindspore.mint.special.log_softmax

Applies the Log Softmax function to the input tensor on the specified axis.

Ascend

None

mindspore.mint.special.round

Rounds the elements of a tensor to the nearest integer element-wise, with ties rounded to the nearest even value (round half to even).

Ascend

None

mindspore.mint.special.sinc

Computes the normalized sinc of input.

Ascend

None
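
These mirror the corresponding pointwise interfaces; a tiny illustrative sketch:

import mindspore as ms
from mindspore import mint, Tensor

x = Tensor([0.1, 0.5, 1.0], ms.float32)
c = mint.special.erfc(x)     # complementary error function, 1 - erf(x)
lp = mint.special.log1p(x)   # log(1 + x), numerically stable for small x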

mindspore.mint.distributed

API Name

Description

Supported Platforms

Warning

mindspore.mint.distributed.init_process_group

Initializes the collective communication library.

Ascend

None

mindspore.mint.distributed.destroy_process_group

Destroy the user collective communication group.

Ascend

None

mindspore.mint.distributed.get_rank

Get the rank ID for the current device in the specified collective communication group.

Ascend

None

mindspore.mint.distributed.get_world_size

Get the rank size of the specified collective communication group.

Ascend

None
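
A minimal sketch of a distributed setup; it assumes the script is launched by a multi-process launcher (for example msrun) on an Ascend cluster, so it is not runnable as a standalone single-process script.

from mindspore import mint

mint.distributed.init_process_group()           # initialize the collective communication library
rank = mint.distributed.get_rank()              # rank ID of the current device
world_size = mint.distributed.get_world_size()  # number of devices in the group
print(f"rank {rank} of {world_size}")
mint.distributed.destroy_process_group()        # release the communication group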