PyTorch Autograd Wrappers#
Simple Neighborhood Aggregator (SAGEConv)#
PyTorch autograd function for simple aggregation of node features in a node-to-node reduction (n2n), concatenating the original features of the output nodes at the end (agg_concat).
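To make the fused operation concrete, the following is a minimal plain-PyTorch sketch of the same computation pattern, assuming a mean reduction and a `[2, num_edges]` edge-index layout; it is an illustration only, not the cuGraph-ops kernel or its exact signature.

```python
# Minimal plain-PyTorch sketch of a SAGE-style n2n aggregation with
# concatenation (agg_concat): neighbor features are reduced onto each
# destination node, then concatenated with that node's own features.
# The edge-index layout and the mean reduction are illustrative assumptions.
import torch


def agg_concat_reference(x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
    """x: [num_nodes, dim]; edge_index: [2, num_edges] holding (src, dst) rows."""
    src, dst = edge_index
    num_nodes, dim = x.shape

    # Sum neighbor (source) features into each destination node.
    agg = torch.zeros(num_nodes, dim, dtype=x.dtype)
    agg.index_add_(0, dst, x[src])

    # Turn the sum into a mean by dividing by each node's in-degree.
    deg = torch.zeros(num_nodes, dtype=x.dtype).index_add_(
        0, dst, torch.ones_like(dst, dtype=x.dtype)
    )
    agg = agg / deg.clamp(min=1).unsqueeze(-1)

    # agg_concat: append the original features of the output nodes.
    return torch.cat([agg, x], dim=-1)


# Tiny usage example: 3 nodes, 2 directed edges (0 -> 2, 1 -> 2).
x = torch.randn(3, 4, requires_grad=True)
edge_index = torch.tensor([[0, 1], [2, 2]])
out = agg_concat_reference(x, edge_index)   # shape [3, 8]
out.sum().backward()                        # autograd flows through the reduction
```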
Graph Attention (GATConv/GATv2Conv)#
PyTorch autograd function for a multi-head attention layer (GAT-like), without using cudnn (mha_gat), in a node-to-node reduction (n2n); both attention variants are sketched in plain PyTorch below.
PyTorch autograd function for a multi-head attention layer (GAT-like), without using cudnn (mha_gat_v2), in a node-to-node reduction (n2n), with an activation applied prior to the dot product but none afterwards.
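The two formulations differ only in where the activation sits relative to the dot product with the attention vector. The sketch below uses plain PyTorch with a single head and an assumed `edge_softmax` helper for per-destination normalization; it illustrates that difference under those assumptions and is not the cuGraph-ops implementation.

```python
# Illustrative single-head sketch of GAT-style vs. GATv2-style attention in a
# node-to-node (n2n) reduction. Shapes, the edge-index layout, and the
# edge_softmax helper are assumptions for illustration only.
import torch
import torch.nn.functional as F


def edge_softmax(score: torch.Tensor, dst: torch.Tensor, num_nodes: int) -> torch.Tensor:
    """Softmax of per-edge scores, normalized over each destination's incoming edges."""
    score = score - score.max()                                   # numerical stability
    exp = score.exp()
    denom = torch.zeros(num_nodes, dtype=exp.dtype).index_add_(0, dst, exp)
    return exp / denom[dst]


def gat_attention(feat, attn_src, attn_dst, edge_index, negative_slope=0.2):
    """GAT-like (mha_gat): dot products first, LeakyReLU on the combined edge score."""
    src, dst = edge_index
    score = F.leaky_relu(feat[src] @ attn_src + feat[dst] @ attn_dst, negative_slope)
    alpha = edge_softmax(score, dst, feat.shape[0])
    out = torch.zeros_like(feat)
    out.index_add_(0, dst, alpha.unsqueeze(-1) * feat[src])       # weighted n2n reduction
    return out


def gat_v2_attention(feat, attn, edge_index, negative_slope=0.2):
    """GATv2-like (mha_gat_v2): LeakyReLU before the dot product, nothing afterwards."""
    src, dst = edge_index
    score = F.leaky_relu(feat[src] + feat[dst], negative_slope) @ attn
    alpha = edge_softmax(score, dst, feat.shape[0])
    out = torch.zeros_like(feat)
    out.index_add_(0, dst, alpha.unsqueeze(-1) * feat[src])
    return out


# Tiny usage example: 3 nodes, 2 directed edges (0 -> 2, 1 -> 2), feature dim 4.
feat = torch.randn(3, 4, requires_grad=True)
edge_index = torch.tensor([[0, 1], [2, 2]])
out_gat = gat_attention(feat, torch.randn(4), torch.randn(4), edge_index)
out_gat_v2 = gat_v2_attention(feat, torch.randn(4), edge_index)
out_gat.sum().backward()                                          # autograd flows through
```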
Heterogeneous Aggregator using Basis Decomposition (RGCNConv)#
PyTorch autograd function for node-to-node RGCN-like basis-regularized aggregation, with the features transformed after (post) the aggregation.
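To illustrate the "post" ordering, here is a minimal plain-PyTorch sketch, under assumed tensor layouts, of basis-decomposed aggregation: per-edge source features are weighted by their relation's basis coefficients, reduced onto destination nodes, and only then multiplied by the shared basis matrices. It is an illustration of the computation pattern, not the cuGraph-ops API or its signature.

```python
# Illustrative sketch of RGCN-style aggregation with basis decomposition where
# the dense transform happens AFTER the n2n reduction ("post"). Tensor layouts
# and the sum reduction are assumptions for illustration only.
import torch


def rgcn_basis_post_reference(x, edge_index, edge_type, coeff, bases):
    """x: [num_nodes, in_dim]; edge_index: [2, num_edges]; edge_type: [num_edges];
    coeff: [num_relations, num_bases]; bases: [num_bases, in_dim, out_dim]."""
    src, dst = edge_index
    num_nodes, in_dim = x.shape
    num_bases = bases.shape[0]

    # Weight each edge's source features by its relation's basis coefficients
    # and reduce onto destination nodes: z[dst, b] += coeff[r(e), b] * x[src].
    weighted = coeff[edge_type].unsqueeze(-1) * x[src].unsqueeze(1)   # [E, B, in_dim]
    z = torch.zeros(num_nodes, num_bases, in_dim, dtype=x.dtype)
    z.index_add_(0, dst, weighted)

    # "Post" step: apply the shared basis matrices after the aggregation,
    # out[i] = sum_b z[i, b] @ bases[b].
    return torch.einsum("nbi,bio->no", z, bases)


# Tiny usage example: 3 nodes, 2 relations, 2 bases.
x = torch.randn(3, 4, requires_grad=True)
edge_index = torch.tensor([[0, 1], [2, 2]])
edge_type = torch.tensor([0, 1])
coeff = torch.randn(2, 2, requires_grad=True)
bases = torch.randn(2, 4, 8, requires_grad=True)
out = rgcn_basis_post_reference(x, edge_index, edge_type, coeff, bases)  # [3, 8]
out.sum().backward()
```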