Graphormer: A General-Purpose Backbone for Graph Learning
https://github.com/microsoft/Graphormer
TRANSFORMER BECOMES DOMINANT ON SEQUENCE DATA
Sequence Data (1D): Speech, Protein
Grid Data (2D): Image, Video
Graph Data: Molecule, Combinatorial Optimization
KEY INSIGHT: STRUCTURAL ENCODINGS
Self-Attention treats the input as a set: it computes the correlation between every pair of nodes, but by itself is blind to the graph structure.
Graphormer
= Pure Transformer
+ Spatial Encoding
+ Centrality Encoding
+ Edge Encoding
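To make the recipe concrete, here is a minimal sketch (not the official implementation in the repo) of the centrality encoding: learnable embeddings indexed by node degree are added to the node features before they enter the Transformer layers. The module name, the `max_degree` cutoff, and the single undirected degree table are illustrative assumptions; the paper uses separate in-degree and out-degree embeddings for directed graphs.

```python
import torch
import torch.nn as nn

class CentralityEncoding(nn.Module):
    """Add learnable degree embeddings to node features (centrality-encoding sketch)."""

    def __init__(self, hidden_dim: int, max_degree: int = 512):
        super().__init__()
        # One embedding vector per degree value; degrees above max_degree share the last bucket.
        self.degree_embedding = nn.Embedding(max_degree + 1, hidden_dim)
        self.max_degree = max_degree

    def forward(self, node_features: torch.Tensor, degrees: torch.Tensor) -> torch.Tensor:
        # node_features: [num_nodes, hidden_dim]; degrees: [num_nodes] integer node degrees.
        degrees = degrees.clamp(max=self.max_degree).long()
        return node_features + self.degree_embedding(degrees)


# Example usage with random data.
enc = CentralityEncoding(hidden_dim=16)
x = torch.randn(5, 16)               # 5 nodes, 16-dim features
deg = torch.tensor([2, 3, 1, 4, 2])  # node degrees
h0 = enc(x, deg)                     # degree-aware initial node representations
```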
GRAPHORMER: SPATIAL ENCODING
Graphormer adds a learnable bias term, indexed by the spatial relation $\phi(v_i, v_j)$ between each pair of nodes, to the pre-softmax attention score:

$$A_{ij} = \frac{(h_i W_Q)(h_j W_K)^{\top}}{\sqrt{d}} + b_{\phi(v_i, v_j)}$$

Here $\phi(v_i, v_j)$ measures the spatial position of $v_j$ relative to $v_i$; candidates include the 3D Euclidean distance (for molecular geometry) or, as in Graphormer, the unweighted shortest-path distance in the graph.
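Below is a minimal sketch of the spatial-encoding bias under the unweighted shortest-path choice of $\phi$: distances are computed with `networkx`, and a learnable scalar per distance bucket is added to the attention scores. For simplicity the bias is shared across attention heads (the paper learns one per head); the class and parameter names, the distance cutoff, and the 0..n-1 node labelling are assumptions.

```python
import networkx as nx
import torch
import torch.nn as nn

class SpatialEncodingBias(nn.Module):
    """Learnable attention bias b_{phi(v_i, v_j)} indexed by shortest-path distance (sketch)."""

    def __init__(self, max_distance: int = 20):
        super().__init__()
        # One learnable scalar per distance bucket; bucket max_distance + 1 is for unreachable pairs.
        self.bias = nn.Embedding(max_distance + 2, 1)
        self.max_distance = max_distance

    def forward(self, graph: nx.Graph) -> torch.Tensor:
        # Assumes nodes are labelled 0 .. n-1.
        n = graph.number_of_nodes()
        dist = torch.full((n, n), self.max_distance + 1, dtype=torch.long)
        # Unweighted shortest-path distances phi(v_i, v_j) via BFS.
        for src, lengths in nx.all_pairs_shortest_path_length(graph):
            for dst, d in lengths.items():
                dist[src, dst] = min(d, self.max_distance)
        # [n, n] bias to be added to the pre-softmax attention scores A_ij.
        return self.bias(dist).squeeze(-1)


# Example usage on a small path graph.
bias = SpatialEncodingBias()(nx.path_graph(4))  # shape [4, 4]
```

In the full model this matrix is broadcast over the node dimension of $QK^{\top}/\sqrt{d}$ and added before the softmax.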
Edge Encoding: for each node pair $(v_i, v_j)$, the features of the edges on the shortest path between them (passing through intermediate nodes such as $v_k$) are combined with learnable per-position weights and averaged,

$$c_{ij} = \frac{1}{N} \sum_{n=1}^{N} x_{e_n} \left(w_n^{E}\right)^{\top},$$

and this term is added to the attention bias alongside $b_{\phi(v_i, v_j)}$.
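A similar sketch for the edge-encoding term $c_{ij}$, assuming the edge features along the shortest path from $v_i$ to $v_j$ have already been gathered into a tensor; the class name, the `max_path_len` cap, and the random parameter initialization are illustrative assumptions.

```python
import torch
import torch.nn as nn

class EdgeEncoding(nn.Module):
    """Average of (edge feature . learnable weight) along the shortest path (sketch)."""

    def __init__(self, edge_dim: int, max_path_len: int = 20):
        super().__init__()
        # One learnable weight vector w_n^E per position n along the path.
        self.path_weights = nn.Parameter(torch.randn(max_path_len, edge_dim))

    def forward(self, path_edge_features: torch.Tensor) -> torch.Tensor:
        # path_edge_features: [path_len, edge_dim], features x_{e_n} of the edges on SP(v_i, v_j).
        path_edge_features = path_edge_features[: self.path_weights.size(0)]
        n = path_edge_features.size(0)
        if n == 0:
            return path_edge_features.new_zeros(())
        # c_ij = (1/N) * sum_n x_{e_n} . w_n^E, added to A_ij together with b_{phi(v_i, v_j)}.
        return (path_edge_features * self.path_weights[:n]).sum(-1).mean()


# Example usage: a 3-edge path with 8-dim edge features.
c_ij = EdgeEncoding(edge_dim=8)(torch.randn(3, 8))  # scalar bias for one (v_i, v_j) pair
```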
[Figure: special cases, panels (a) and (b).]
KDD CUP 2021 – 1ST PLACE AWARD
BIOASSAY
https://github.com/microsoft/Graphormer