mindsponge.cell

| API Name | Description | Supported Platforms |
|---|---|---|
|  | This is an implementation of multi-head attention in the paper "Attention Is All You Need". |  |
|  | This is an implementation of global gated self-attention in the paper "Highly accurate protein structure prediction with AlphaFold". |  |
|  | Invariant point attention module. |  |
|  | MSA column-wise gated self-attention. |  |
|  | MSA column global attention. |  |
|  | MSA row attention. |  |
|  | Computes the correlation of the input tensor along its second dimension; the computed correlation can be used to update the correlation features. |  |
|  | This is a 2-layer MLP in which the intermediate layer expands the number of channels of the input by a factor (`num_intermediate_factor`). |  |
|  | Triangle attention. |  |
|  | Triangle multiplication layer. |  |