mindspore_gl.nn.TAGConv
- class mindspore_gl.nn.TAGConv(in_feat_size: int, out_feat_size: int, num_hops: int = 2, bias: bool = True, activation=None)
Topology adaptive graph convolutional layer. From the paper Topology Adaptive Graph Convolutional Networks.
\[H^{K} = {\sum}_{k=0}^K (D^{-1/2} A D^{-1/2})^{k} X {\Theta}_{k}\]
where \(A\) is the adjacency matrix, \(D\) is its degree matrix, \(K\) equals num_hops, and \({\Theta}_{k}\) denotes the linear weights that combine the results of the different hops.
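As a sanity check of the propagation rule, the following is a minimal NumPy sketch, not the layer itself; the toy graph, features, and weight matrices (A, X, theta) are made-up illustrations:
>>> import numpy as np
>>> A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])  # toy 3-node chain graph
>>> d_inv_sqrt = np.diag(A.sum(1) ** -0.5)
>>> a_hat = d_inv_sqrt @ A @ d_inv_sqrt                       # D^{-1/2} A D^{-1/2}
>>> K, X = 2, np.random.rand(3, 4)                            # num_hops and node features (D_in = 4)
>>> theta = [np.random.rand(4, 2) for _ in range(K + 1)]      # one weight matrix per hop (D_out = 2)
>>> H = sum(np.linalg.matrix_power(a_hat, k) @ X @ theta[k] for k in range(K + 1))
>>> print(H.shape)
(3, 2)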
- Parameters
in_feat_size (int) - Input node feature size.
out_feat_size (int) - Output node feature size.
num_hops (int, optional) - Number of hops \(K\) in the propagation rule. Default: 2.
bias (bool, optional) - Whether the layer learns an additive bias. Default: True.
activation (Cell, optional) - Activation function. Default: None.
- Inputs:
x (Tensor) - The input node features. The shape is \((N, D_{in})\) where \(N\) is the number of nodes, and \(D_{in}\) should be equal to in_feat_size in Args.
in_deg (Tensor) - In degree for nodes. The shape is \((N, )\) where \(N\) is the number of nodes.
out_deg (Tensor) - Out degree for nodes. The shape is \((N, )\) where \(N\) is the number of nodes.
g (Graph) - The input graph.
- Outputs:
Tensor, output node features with shape \((N, D_{out})\), where \(D_{out}\) should be the same as out_feat_size in Args.
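The in_deg and out_deg inputs above are plain per-node edge counts, which can be derived from the edge index arrays themselves. A minimal sketch, using the same edges as the Examples below (np.bincount is just one convenient way to do the counting):
>>> import numpy as np
>>> import mindspore as ms
>>> src = np.array([0, 1, 1, 2, 2, 3, 3])
>>> dst = np.array([0, 0, 2, 1, 3, 0, 1])
>>> in_deg = ms.Tensor(np.bincount(dst, minlength=4), ms.int32)   # edges arriving at each node
>>> out_deg = ms.Tensor(np.bincount(src, minlength=4), ms.int32)  # edges leaving each node
>>> print(in_deg)
[3 2 1 1]
>>> print(out_deg)
[1 2 2 2]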
- Raises
- Supported Platforms:
Ascend
GPU
- Examples
>>> import mindspore as ms
>>> from mindspore_gl.nn import TAGConv
>>> from mindspore_gl import GraphField
>>> n_nodes = 4
>>> n_edges = 7
>>> feat_size = 4
>>> src_idx = ms.Tensor([0, 1, 1, 2, 2, 3, 3], ms.int32)
>>> dst_idx = ms.Tensor([0, 0, 2, 1, 3, 0, 1], ms.int32)
>>> ones = ms.ops.Ones()
>>> feat = ones((n_nodes, feat_size), ms.float32)
>>> graph_field = GraphField(src_idx, dst_idx, n_nodes, n_edges)
>>> in_degree = ms.Tensor([3, 2, 1, 1], ms.int32)
>>> out_degree = ms.Tensor([1, 2, 2, 2], ms.int32)
>>> tagconv = TAGConv(in_feat_size=4, out_feat_size=2, activation=None, num_hops=3)
>>> res = tagconv(feat, in_degree, out_degree, *graph_field.get_graph())
>>> print(res.shape)
(4, 2)
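If a non-linearity is wanted on the output, the activation argument accepts a MindSpore Cell. A minimal variation of the example above, assuming a standard activation such as nn.ReLU() (an illustrative choice, not mandated by the layer):
>>> from mindspore import nn
>>> tagconv_relu = TAGConv(in_feat_size=4, out_feat_size=2, activation=nn.ReLU(), num_hops=3)
>>> res = tagconv_relu(feat, in_degree, out_degree, *graph_field.get_graph())
>>> print(res.shape)
(4, 2)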