Transformers are Graph Neural Networks
My engineering friends often ask me: deep learning on graphs sounds great, but are there any real applications?

While Graph Neural Networks are used in recommendation systems at Pinterest, Alibaba and Twitter, a more subtle success story is the Transformer architecture, which has taken the NLP world by storm. Through this post, I want to establish a link between Graph Neural Networks (GNNs) and Transformers. I'll talk about the intuitions behind model architectures in the NLP and GNN communities, make connections using equations and figures, and discuss how we can work together to drive future progress. Let's start by talking about the purpose of model architectures—representation learning.
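To make the link concrete up front, here is a minimal sketch (my own illustration, not code from this article) of single-head self-attention viewed as message passing on a fully-connected graph: each word is a node, and it updates its feature vector by a weighted aggregation over all other words.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(h, Wq, Wk, Wv):
    # h: (n_words, d) node features on a fully-connected word graph.
    # Attention weights act as soft edge weights between every pair of words.
    Q, K, V = h @ Wq, h @ Wk, h @ Wv
    attn = softmax(Q @ K.T / np.sqrt(K.shape[-1]))  # (n, n) edge weights, rows sum to 1
    return attn @ V  # each node aggregates "messages" from all neighbours

rng = np.random.default_rng(0)
n, d = 5, 8  # a toy sentence of 5 words with 8-dim features
h = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(h, Wq, Wk, Wv)
print(out.shape)  # (5, 8): updated feature per word
```

Replace "all words" with "graph neighbours" and this update is exactly the aggregation step of a GNN layer; the rest of the post unpacks that correspondence.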


Author: Research Engineer at A*STAR, working on Graph Neural Networks