mindspore.ops.SparseApplyAdagrad

class mindspore.ops.SparseApplyAgragrad is defined as:

    mindspore.ops.SparseApplyAdagrad(lr, update_slots=True, use_locking=False)
Deprecated
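This operator is deprecated, but for readers migrating old code, the computation it performs can be sketched from the standard sparse Adagrad update rule: for each row selected by `indices`, the accumulator gathers the squared gradient (when `update_slots` is true) and the variable takes a step scaled by the inverse square root of the accumulator. The following NumPy sketch is an illustration of that rule, not the MindSpore kernel itself; the function name and argument layout are assumptions for demonstration.

```python
import numpy as np

def sparse_apply_adagrad(var, accum, grad, indices, lr, update_slots=True):
    """Illustrative sparse Adagrad step (not the MindSpore kernel).

    For each row index i in `indices`:
        accum[i] += grad_row ** 2        (only when update_slots is True)
        var[i]   -= lr * grad_row / sqrt(accum[i])
    Rows of `var` not named in `indices` are left untouched.
    """
    for k, i in enumerate(indices):
        if update_slots:
            accum[i] += grad[k] * grad[k]
        var[i] -= lr * grad[k] / np.sqrt(accum[i])
    return var, accum
```

Because only the rows listed in `indices` are read and written, this update is cheap when the gradient is sparse (e.g. embedding tables), which is the typical use case for the `SparseApply*` family of optimizer primitives.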