G5: A Universal GRAPH-BERT for Graph-to-Graph Transfer
Further investigate the graph-to-graph transfer of a universal GRAPH-BERT for graph representation learning across different graph datasets
graph-neural-networks bert g5 graph-bert graphs attention transformers natural-language-processing paper arxiv:2006.06183 representation-learning research

GRAPH-BERT provides an opportunity for transferring pre-trained models and learned graph representations across different tasks within the same graph dataset. In this paper, we further investigate the graph-to-graph transfer of a universal GRAPH-BERT for graph representation learning across different graph datasets; for simplicity, our proposed model is also referred to as G5.


Authors
Jiawei Zhang
Similar projects
DeepRobust
A PyTorch adversarial library for attack and defense methods on images and graphs.
Low-Dimensional Hyperbolic Knowledge Graph Embeddings
Low-dimensional knowledge graph embeddings that simultaneously capture hierarchical relations and logical patterns.
GraphNorm
A Principled Approach to Accelerating Graph Neural Network Training.
Graphein
Protein Graph Library