MindSpore 2.4 Officially Released

Native Support for Supernodes and Upgraded Distributed Parallel Capabilities

Knowledge Map

Mastering MindSpore: From Beginner to Advanced

Tutorials

From Quick Start to Expert

MindSpore Activities

Community Communication and Sharing

MindSpore

Native Distributed Model Training

Provides multiple parallelism capabilities for foundation model training, along with easy-to-use APIs for configuring distributed strategies, helping developers quickly implement high-performance distributed training of foundation models. A minimal configuration sketch is shown below.
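The sketch below illustrates, under stated assumptions, how a distributed strategy is typically declared in MindSpore outside the model code: it uses the documented `mindspore.communication.init`, `set_context`, and `set_auto_parallel_context` interfaces, and assumes the script is launched with a distributed launcher (e.g., `msrun` or `mpirun`); exact defaults and availability may vary across MindSpore versions.

```python
# Minimal sketch of configuring a distributed strategy in MindSpore.
# Assumes the process is started by a distributed launcher (e.g., msrun),
# so that rank and device information are available to init().
import mindspore as ms
from mindspore import nn
from mindspore.communication import init

# Initialize the communication backend (HCCL on Ascend, NCCL on GPU).
init()

# Run in graph mode and declare the parallel strategy for the whole job;
# the network itself stays unchanged.
ms.set_context(mode=ms.GRAPH_MODE)
ms.set_auto_parallel_context(
    parallel_mode="semi_auto_parallel",  # framework completes the sharding plan
    enable_parallel_optimizer=True,      # shard optimizer states across devices
)

# Define the network as usual; parallelism is configured outside the model code.
net = nn.Dense(1024, 1024)
```

Keeping the parallel configuration separate from the model definition is the design point this tagline refers to: the same network code can be reused across stand-alone, data-parallel, and semi-auto-parallel runs by changing only the context settings.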

AI4S Converged Computing Framework

Fully Unleashing Hardware Potential

Quick Deployment in All Scenarios

Repositories

Ecosystem

Provides open-source AI research projects, case collections, and task-specific SOTA models and their derivatives.

View all projects

Achievements

[Community statistics: Users, Stars, Issues, PRs]

Join MindSpore to contribute

More Details

Community Partners