mindspore.ops.LogSoftmax
class mindspore.ops.LogSoftmax(axis=-1)
Log Softmax activation function.
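For reference, log-softmax applied along the chosen axis follows the standard definition (paraphrased here, not quoted from this page):

\text{output}(x_i) = \log\left(\frac{\exp(x_i)}{\sum_{j} \exp(x_j)}\right) = x_i - \log\sum_{j} \exp(x_j)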
Refer to mindspore.ops.log_softmax() for more details.

Supported Platforms:
Ascend GPU CPU
Examples
>>> import numpy as np
>>> import mindspore
>>> from mindspore import Tensor, ops
>>> logits = Tensor(np.array([1, 2, 3, 4, 5]), mindspore.float32)
>>> log_softmax = ops.LogSoftmax()
>>> output = log_softmax(logits)
>>> print(output)
[-4.4519143 -3.4519143 -2.4519143 -1.4519144 -0.4519144]
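As a quick sanity check, the same values can be reproduced with plain NumPy; this is a minimal sketch independent of MindSpore, using the max-shift trick so exp() cannot overflow:

>>> import numpy as np
>>> x = np.array([1, 2, 3, 4, 5], dtype=np.float32)
>>> # log-softmax(x) = x - log(sum(exp(x))); subtracting the max keeps exp() in range
>>> shifted = x - x.max()
>>> reference = shifted - np.log(np.exp(shifted).sum())
>>> expected = np.array([-4.4519143, -3.4519143, -2.4519143, -1.4519144, -0.4519144], dtype=np.float32)
>>> print(np.allclose(reference, expected, atol=1e-6))
True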