mindspore.scipy.linalg.lu
- mindspore.scipy.linalg.lu(a, permute_l=False, overwrite_a=False, check_finite=True)
Compute pivoted LU decomposition of a general matrix.
The decomposition is:
\[A = P L U\]
where \(P\) is a permutation matrix, \(L\) is lower triangular with unit diagonal elements, and \(U\) is upper triangular.
Note
lu is not supported on the Windows platform yet.
Only float32, float64, int32, int64 are supported Tensor dtypes.
If a Tensor with dtype int32 or int64 is passed, it will be cast to mstype.float64.
- Parameters
a (Tensor) – a \((M, N)\) matrix to decompose. Note that if the input tensor is not a float, then it will be cast to mstype.float32.
permute_l (bool, optional) – Perform the multiplication \(P L\) (Default: do not permute). Default: False.
overwrite_a (bool, optional) – Whether to overwrite data in \(a\) (may improve performance). Default: False.
check_finite (bool, optional) – Whether to check that the input matrix contains only finite numbers. Disabling may give a performance gain, but may result in problems (crashes, non-termination) if the inputs do contain infinities or NaNs. Default: True.
- Returns
If permute_l == False:
Tensor, \((M, M)\) permutation matrix.
Tensor, \((M, K)\) lower triangular or trapezoidal matrix with unit diagonal. \(K = \min(M, N)\).
Tensor, \((K, N)\) upper triangular or trapezoidal matrix.
If permute_l == True:
Tensor, \((M, K)\) permuted L matrix. \(K = \min(M, N)\).
Tensor, \((K, N)\) upper triangular or trapezoidal matrix.
- Supported Platforms:
GPU
CPU
Examples
>>> import numpy as onp
>>> from mindspore import Tensor
>>> from mindspore.scipy.linalg import lu
>>> a = Tensor(onp.array([[2, 5, 8, 7], [5, 2, 2, 8], [7, 5, 6, 6], [5, 4, 4, 8]]).astype(onp.float64))
>>> p, l, u = lu(a)
>>> print(p)
[[0 1 0 0]
 [0 0 0 1]
 [1 0 0 0]
 [0 0 1 0]]
>>> print(l)
[[ 1.          0.          0.          0.        ]
 [ 0.2857143   1.          0.          0.        ]
 [ 0.71428573  0.12        1.          0.        ]
 [ 0.71428573 -0.44       -0.46153846  1.        ]]
>>> print(u)
[[ 7.          5.          6.          6.        ]
 [ 0.          3.57142854  6.28571415  5.28571415]
 [ 0.          0.         -1.03999996  3.07999992]
 [ 0.         -0.         -0.          7.46153831]]
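The sketch below is a supplementary illustration of the permute_l=True path described under Returns; it is not part of the original reference. The variable names pl, u and err are illustrative, and the reconstruction check simply assumes the factors satisfy \(A = P L U\) as stated above (the tolerance may need to be looser for float32 inputs).
>>> import numpy as onp
>>> from mindspore import Tensor
>>> from mindspore.scipy.linalg import lu
>>> a = Tensor(onp.array([[2, 5, 8, 7], [5, 2, 2, 8],
...                       [7, 5, 6, 6], [5, 4, 4, 8]]).astype(onp.float64))
>>> # With permute_l=True only two factors are returned: pl (= P @ L) and u.
>>> pl, u = lu(a, permute_l=True)
>>> # Per A = P L U, the product pl @ u should reconstruct a up to rounding error.
>>> err = onp.abs(onp.matmul(pl.asnumpy(), u.asnumpy()) - a.asnumpy()).max()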