Edge Representation Learning with Hypergraphs
Jaehyeong Jo1*, Jinheon Baek1*, Seul Lee1*,
Dongki Kim1, Minki Kang1, Sung Ju Hwang1,2
(*: equal contribution)
KAIST1, AITRICS2, South Korea
[Figure: a graph with nodes A, B, C, D and edges 1, 2, 3, 4, shown alongside its node feature matrix X, incidence matrix M, and edge feature matrix E.]
Graphs: Nodes, Edges and Incidence
A graph can be represented by the triplet G = (X, M, E): node features X, incidence matrix M, and edge features E.
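The triplet view above can be sketched in a few lines. A minimal example, assuming an illustrative 4-node, 4-edge topology and toy one-hot features (the exact edges and features of the slide's figure are a guess):

```python
import numpy as np

# Toy graph: nodes A-D and edges 1-4. The topology here is illustrative,
# not necessarily the one drawn on the slide.
nodes = ["A", "B", "C", "D"]
edges = [(0, 1), (0, 2), (1, 3), (2, 3)]

X = np.eye(len(nodes))   # node features X: |V| x d_v (one-hot here)
E = np.eye(len(edges))   # edge features E: |E| x d_e (one-hot here)

# Incidence matrix M: |V| x |E|, M[v, e] = 1 iff node v is an endpoint of edge e.
M = np.zeros((len(nodes), len(edges)))
for e, (u, v) in enumerate(edges):
    M[u, e] = 1.0
    M[v, e] = 1.0

# G = (X, M, E) fully describes the attributed graph.
```

Each column of M sums to 2, since every ordinary (non-hyper) edge has exactly two endpoints.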
Importance of Learning Edges
[Figure: (a) Tylenol (beneficial), (b) NAPQI (toxic), (c) a Twitter network.]
Previous works focus on accurately representing the nodes, largely overlooking edges, which are essential components of a graph.
http://allthingsgraphed.com/2014/11/02/twitter-friends-network/
Learning Edge Representation via Nodes
[Figure: node-to-node message passing vs. edge-to-node message passing on a graph with nodes A–D and edges 1–4.]
Previous works have used edge features as auxiliary information to augment node features, implicitly capturing the edge information in the node representations.
Edge HyperGraph Neural Network (EHGNN)
[Figure: framework overview. The input graph is converted to its dual hypergraph via node-to-hyperedge / edge-to-node DHT, message passing is performed on the dual hypergraph, and HyperCluster or HyperDrop produces a global edge representation or the pooled output graph.]
We propose a novel edge representation learning scheme using Dual Hypergraph Transformation (DHT), and two edge pooling methods, namely HyperCluster and HyperDrop.
Dual Hypergraph Transformation
[Figure: the input graph G = (X, M, E) and its dual hypergraph G* = (X*, M*, E*). The transformation maps (a) edge-to-node and (b) node-to-hyperedge; its inverse maps (c) hyperedge-to-node and (d) node-to-edge. Concretely, X* = E, M* = Mᵀ, and E* = X.]
We represent edges as nodes in a hypergraph, which allows us to apply any off-the-shelf message-passing scheme designed for node-level representation learning.
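The transformation itself is just a role swap between nodes and edges and requires no learning. A minimal sketch of the mapping X* = E, M* = Mᵀ, E* = X, with toy feature matrices of our own choosing:

```python
import numpy as np

def dht(X, M, E):
    """Dual Hypergraph Transformation: edges of G become nodes of G*,
    nodes of G become hyperedges of G*. Features carry over unchanged:
    X* = E, M* = M^T, E* = X."""
    return E, M.T, X

# Toy graph: 3 nodes, 2 edges (feature values are arbitrary illustrations).
X = np.array([[1.0], [2.0], [3.0]])   # node features, |V| x 1
E = np.array([[10.0], [20.0]])        # edge features, |E| x 1
M = np.array([[1.0, 0.0],             # incidence, |V| x |E|
              [1.0, 1.0],
              [0.0, 1.0]])

Xs, Ms, Es = dht(X, M, E)     # dual hypergraph: 2 nodes, 3 hyperedges
X2, M2, E2 = dht(Xs, Ms, Es)  # DHT is an involution: applying it twice
assert (X2 == X).all() and (M2 == M).all() and (E2 == E).all()  # recovers G
```

The involution property is what lets the framework map edge representations back to the original graph after message passing.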
Message-Passing on the Dual Hypergraph
[Figure: a graph is transformed by DHT, message passing is performed on the dual hypergraph, and DHT maps the result back to the original graph.]
We can perform message passing between the edges of a graph by performing message passing between the nodes of its dual hypergraph. The message-passing cost on the dual hypergraph equals the message-passing cost on the original graph, O(E).
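As a concrete, simplified instance, one round of mean aggregation between edges can be run on the dual hypergraph: two dual nodes exchange messages iff the corresponding edges share an endpoint. This is an illustrative layer of our own construction, not the exact GNN used in the paper:

```python
import numpy as np

def edge_message_passing(M, E):
    """One mean-aggregation step over edge features E, using the dual
    hypergraph's incidence M* = M^T. Edges sharing an endpoint in the
    original graph (dual nodes sharing a hyperedge) are neighbors."""
    Ms = M.T                           # dual incidence: |E| x |V|
    A = (Ms @ Ms.T > 0).astype(float)  # |E| x |E| edge adjacency
    np.fill_diagonal(A, 1.0)           # include self-loops
    deg = A.sum(axis=1, keepdims=True)
    return (A @ E) / deg               # mean over neighboring edge features

# Path graph A - B - C: edges (A,B) and (B,C) share node B.
M = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
E = np.array([[1.0], [3.0]])
print(edge_message_passing(M, E))  # both edges average to 2.0
```

Since A has one row and one column per edge, the per-step cost scales with the number of edges, matching the O(E) claim above.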
Edge Pooling: HyperCluster & HyperDrop
[Figure: HyperCluster produces global edge representations; HyperDrop produces a pooled output graph.]
Representing each edge well on its own is insufficient for obtaining an accurate representation of the entire graph. We therefore propose two novel graph pooling methods to obtain compact graph-level edge representations, namely HyperCluster and HyperDrop.
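To make the pooling idea concrete, here is a minimal HyperDrop-style sketch: score the dual nodes (i.e., the original edges) and keep only the top fraction. The scores are assumed to be given; in the actual method they are produced by a GNN on the dual hypergraph, so this sketches the selection step only:

```python
import numpy as np

def hyperdrop(X_star, M_star, scores, ratio=0.5):
    """Keep the top-`ratio` fraction of dual nodes (edges of the original
    graph) by score. `scores` are assumed precomputed here; the real
    method learns them with a GNN on the dual hypergraph."""
    k = max(1, int(np.ceil(len(scores) * ratio)))
    keep = np.sort(np.argsort(scores)[-k:])  # indices of retained edges
    return X_star[keep], M_star[keep], keep

# 4 edges with illustrative scores; keep the best half.
X_star = np.arange(8.0).reshape(4, 2)  # dual node (edge) features
M_star = np.eye(4)                     # toy dual incidence, |E| x |V|
scores = np.array([0.1, 0.9, 0.5, 0.3])
Xp, Mp, keep = hyperdrop(X_star, M_star, scores, ratio=0.5)
print(keep)  # [1 2] -- the edges with scores 0.9 and 0.5
```

Dropping rows of the dual node set is exactly dropping edges of the original graph, which is what lets HyperDrop sparsify a graph without touching its nodes.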
Experiments
• Graph Reconstruction: Reconstruct the node and edge features of a given graph from their pooled representations.
• Graph Generation: Generate a valid graph with desired properties.
• Graph Classification: Predict the label of a given graph.
• Node Classification: Predict the labels of the nodes of a given graph.
Graph Reconstruction
Figure: Graph reconstruction results on the ZINC molecule (left) and synthetic (right) datasets: (a) original, (b) MPNN + GMPool, (c) R-GCN + GMPool, (d) HyperCluster (ours).
Accurately representing edges is crucial for graph reconstruction tasks. EHGNN with HyperCluster far surpasses the baselines.
Graph Reconstruction: Compression
Figure: Relative size of the pooled representation compared to the original graph.
We validate the effectiveness of HyperCluster on dense graph compression, where our method obtains highly compact yet accurate representations.
Graph Generation
Figure: Graph generation results with MolGAN (left) and MARS (right).
EHGNN obtains significantly improved generation performance with both the MolGAN and MARS architectures.
Graph Classification
Figure: Graph classification results on test sets.
EHGNN with HyperDrop outperforms all the hierarchical pooling baselines, and
when paired with GMT, obtains the best performance on most of the datasets.
Graph Classification: Examples
Figure: HyperDrop results on the COLLAB dataset.
HyperDrop accurately identifies task-relevant edges, dividing the large graph into connected components for effective message passing.
Node Classification
Figure: Node classification results on Cora (left) and Citeseer (right) datasets.
HyperDrop alleviates the over-smoothing problem of deep GNNs on semi-supervised node classification tasks by identifying task-relevant edges.
Conclusion
• We introduce a novel edge representation learning scheme using Dual Hypergraph Transformation, to which we can apply off-the-shelf message-passing schemes designed for node-level representation learning.
• We propose two novel edge pooling methods for graph-level representation learning, to overcome the limitations of existing node-based pooling methods.
• We validate our methods on graph reconstruction, generation, and classification tasks, on which we largely outperform existing graph representation learning methods.
Thank you.
Contact information:
Jaehyeong Jo
harryjo97@kaist.ac.kr