mindspore
Context
#include <context.h>
The Context class is used to store environment variables during execution. It has two derived classes: GlobalContext and ModelContext.
GlobalContext : Context
GlobalContext is used to store global environment variables during execution.
Static Public Member Functions
GetGlobalContext
static std::shared_ptr<Context> GetGlobalContext();
Obtains the single instance of GlobalContext.
Returns
The single instance of GlobalContext.
SetGlobalDeviceTarget
static void SetGlobalDeviceTarget(const std::string &device_target);
Configures the target device.
Parameters
device_target: the target device to be configured; options are kDeviceTypeAscend310 and kDeviceTypeAscend910.
GetGlobalDeviceTarget
static std::string GetGlobalDeviceTarget();
Obtains the configured target device.
Returns
The configured target device.
SetGlobalDeviceID
static void SetGlobalDeviceID(const uint32_t &device_id);
Configures the device ID.
Parameters
device_id: the device ID to be configured.
GetGlobalDeviceID
static uint32_t GetGlobalDeviceID();
Obtains the configured device ID.
Returns
The configured device ID.
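For reference, a minimal usage sketch of the functions above; the mindspore namespace qualification, the device ID 0, and the use of the kDeviceTypeAscend310 constant are assumptions for illustration:
#include <context.h>

void ConfigureGlobal() {
  // Obtain the singleton, then configure the target device and device ID (assumed values).
  auto global_context = mindspore::GlobalContext::GetGlobalContext();
  mindspore::GlobalContext::SetGlobalDeviceTarget(mindspore::kDeviceTypeAscend310);
  mindspore::GlobalContext::SetGlobalDeviceID(0);
}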
ModelContext : Context
Static Public Member Functions
ModelContext is used to store model options during execution. It provides static setter and getter pairs for the following options, where each getter returns the value configured by the corresponding setter:
- AIPP configuration file path
- format of model inputs
- shape of model inputs
- type of model outputs
- precision mode of the model
- op select implementation mode
Serialization
#include <serialization.h>
The Serialization class is used to summarize methods for reading and writing model files.
Static Public Member Functions
LoadModel
static Graph LoadModel(const std::string &file, ModelType model_type);
Loads a model from a file path.
Parameters
file: the path of the model file.
model_type: the type of the model file; options are ModelType::kMindIR and ModelType::kOM.
Returns
An instance of Graph, used for storing graph data.
static Graph LoadModel(const void *model_data, size_t data_size, ModelType model_type);
Loads a model from a memory buffer.
Parameters
model_data: a buffer holding the model file.
data_size: the size of the buffer.
model_type: the type of the model file; options are ModelType::kMindIR and ModelType::kOM.
Returns
An instance of Graph, used for storing graph data.
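A minimal loading sketch using the file-path overload above; the file name model.mindir and the mindspore namespace qualification are assumptions:
#include <serialization.h>

// Load a MindIR model from disk; the returned Graph stores the graph data.
mindspore::Graph graph = mindspore::Serialization::LoadModel("model.mindir", mindspore::ModelType::kMindIR);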
Model
#include <model.h>
The Model class is used to define a MindSpore model, facilitating computational graph management.
Constructor and Destructor
explicit Model(const GraphCell &graph, const std::shared_ptr<Context> &model_context);
explicit Model(const std::vector<Output> &network, const std::shared_ptr<Context> &model_context);
~Model();
GraphCell is a derivative of Cell. Cell is not available currently. GraphCell can be constructed from Graph, for example, Model model(GraphCell(graph)).
Context is used to store the model options during execution.
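Continuing the loading sketch above, a hypothetical construction of a Model from that graph; passing a default-constructed ModelContext as the model options is an assumption:
#include <memory>
#include <model.h>

// Wrap the loaded Graph in a GraphCell and attach the model options.
auto model_context = std::make_shared<mindspore::ModelContext>();
mindspore::Model model(mindspore::GraphCell(graph), model_context);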
Public Member Functions
Build
Status Build();
Builds a model so that it can run on a device.
Returns
Status code.
Predict
Status Predict(const std::vector<MSTensor> &inputs, std::vector<MSTensor> *outputs);
Runs inference on the model.
Parameters
inputs: a vector where the model inputs are arranged in sequence.
outputs: output parameter, a pointer to a vector into which the model outputs are filled in sequence.
Returns
Status code.
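A minimal inference sketch built on the Model constructed above, using GetInputs and GetOutputs described below; error handling and the filling of input data are elided:
// Build the model for the configured device, then run inference.
mindspore::Status build_ret = model.Build();
std::vector<mindspore::MSTensor> inputs = model.GetInputs();
// ... copy input data into each tensor, for example through MutableData() ...
std::vector<mindspore::MSTensor> outputs;
mindspore::Status predict_ret = model.Predict(inputs, &outputs);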
GetInputs
std::vector<MSTensor> GetInputs();
Obtains all input tensors of the model.
Returns
The vector that includes all input tensors.
GetOutputs
std::vector<MSTensor> GetOutputs();
Obtains all output tensors of the model.
Returns
A vector that includes all output tensors.
Resize
Status Resize(const std::vector<MSTensor> &inputs, const std::vector<std::vector<int64_t>> &dims);
Resizes the shapes of inputs.
Parameters
inputs: a vector that includes all input tensors in order.
dims: defines the new shapes of the inputs; it should be consistent with inputs.
Returns
Status code.
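A minimal resizing sketch; the single-input model and the shapes shown are assumptions for illustration:
// Resize the only input to a batch size of 2; dims must correspond to inputs one-to-one.
std::vector<mindspore::MSTensor> model_inputs = model.GetInputs();
std::vector<std::vector<int64_t>> new_dims = {{2, 3, 224, 224}};
mindspore::Status resize_ret = model.Resize(model_inputs, new_dims);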
CheckModelSupport
static bool CheckModelSupport(const std::string &device_type, ModelType model_type);
Checks whether the device type supports the model type.
Parameters
device_type: the device type; options are Ascend310 and Ascend910.
model_type: the type of the model file; options are ModelType::kMindIR and ModelType::kOM.
Returns
A bool value that indicates whether the device type supports the model type.
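For example, to query whether a MindIR model can run on an Ascend 310 device (a sketch using the option strings listed above):
// true when the Ascend310 device type supports MindIR models.
bool supported = mindspore::Model::CheckModelSupport("Ascend310", mindspore::ModelType::kMindIR);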
MSTensor
#include <types.h>
The MSTensor class defines a tensor in MindSpore.
Constructor and Destructor
MSTensor();
explicit MSTensor(const std::shared_ptr<Impl> &impl);
MSTensor(const std::string &name, DataType type, const std::vector<int64_t> &shape, const void *data, size_t data_len);
~MSTensor();
Static Public Member Functions
CreateTensor
static MSTensor CreateTensor(const std::string &name, DataType type, const std::vector<int64_t> &shape,
const void *data, size_t data_len) noexcept;
Creates an MSTensor object whose data needs to be copied before being accessed by Model.
Parameters
name: the name of the MSTensor.
type: the data type of the MSTensor.
shape: the shape of the MSTensor.
data: the data pointer that points to allocated memory.
data_len: the length of the memory, in bytes.
Returns
An instance of MSTensor.
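A minimal creation sketch; the tensor name, shape, and the DataType::kNumberTypeFloat32 enumerator are assumptions for illustration:
// CreateTensor copies the buffer, so input_data does not need to outlive the tensor.
std::vector<float> input_data = {0.1f, 0.2f, 0.3f};
mindspore::MSTensor tensor = mindspore::MSTensor::CreateTensor(
    "input0", mindspore::DataType::kNumberTypeFloat32, {1, 3}, input_data.data(),
    input_data.size() * sizeof(float));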
CreateRefTensor
static MSTensor CreateRefTensor(const std::string &name, DataType type, const std::vector<int64_t> &shape, void *data,
size_t data_len) noexcept;
Creates an MSTensor object whose data can be directly accessed by Model.
Parameters
name: the name of the MSTensor.
type: the data type of the MSTensor.
shape: the shape of the MSTensor.
data: the data pointer that points to allocated memory.
data_len: the length of the memory, in bytes.
Returns
An instance of MSTensor.
Public Member Functions
Name
const std::string &Name() const;
Obtains the name of the MSTensor.
Returns
The name of the MSTensor.
DataType
enum DataType DataType() const;
Obtains the data type of the MSTensor.
Returns
The data type of the MSTensor.
Shape
const std::vector<int64_t> &Shape() const;
Obtains the shape of the MSTensor.
Returns
A vector that contains the shape of the MSTensor.
ElementNum
int64_t ElementNum() const;
Obtains the number of elements of the MSTensor.
Returns
The number of elements of the MSTensor.
Data
std::shared_ptr<const void> Data() const;
Obtains a shared pointer to a copy of the data of the MSTensor.
Returns
A shared pointer to a copy of the data of the MSTensor.
MutableData
void *MutableData();
Obtains the pointer to the data of the MSTensor.
Returns
The pointer to the data of the MSTensor.
DataSize
size_t DataSize() const;
Obtains the length of the data of the MSTensor, in bytes.
Returns
The length of the data of the MSTensor, in bytes.
IsDevice
bool IsDevice() const;
Gets a boolean value that indicates whether the memory of the MSTensor is on the device.
Returns
A boolean value that indicates whether the memory of the MSTensor is on the device.
Clone
MSTensor Clone() const;
Gets a deep copy of the MSTensor.
Returns
A deep copy of the MSTensor.
operator==(std::nullptr_t)
bool operator==(std::nullptr_t) const;
Gets a boolean value that indicates whether the MSTensor is valid.
Returns
A boolean value that indicates whether the MSTensor is valid.
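A minimal inspection sketch combining the member functions above; interpreting the data as float32 is an assumption:
#include <iostream>

void DumpTensor(const mindspore::MSTensor &tensor) {
  std::cout << tensor.Name() << ": " << tensor.ElementNum() << " elements, "
            << tensor.DataSize() << " bytes" << std::endl;
  // Data() returns a shared pointer to a copy of the tensor data.
  std::shared_ptr<const void> host_data = tensor.Data();
  const float *values = static_cast<const float *>(host_data.get());
  if (tensor.ElementNum() > 0) {
    std::cout << "first value: " << values[0] << std::endl;
  }
}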
CallBack
#include <ms_tensor.h>
The CallBack struct defines the callback function in MindSpore Lite.
KernelCallBack
using KernelCallBack = std::function<bool(std::vector<tensor::MSTensor *> inputs, std::vector<tensor::MSTensor *> outputs, const CallBackParam &opInfo)>;
A function wrapper. KernelCallBack defines the pointer to the callback function.
CallBackParam
A struct. CallBackParam defines the input arguments of the callback function.
Public Attributes
node_name
node_name
A string variable. Node name argument.
node_type
node_type
A string variable. Node type argument.
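A minimal callback sketch that prints the node information carried by CallBackParam; returning true to let execution continue is an assumption:
#include <iostream>

mindspore::KernelCallBack print_node_callback =
    [](std::vector<mindspore::tensor::MSTensor *> inputs,
       std::vector<mindspore::tensor::MSTensor *> outputs,
       const mindspore::CallBackParam &opInfo) {
      // node_name and node_type are the attributes documented above.
      std::cout << opInfo.node_name << " (" << opInfo.node_type << ")" << std::endl;
      return true;
    };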