An attention mechanism weighs, for each output it produces, the relevance of every input and draws information from the inputs accordingly.
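That weighing can be sketched as scaled dot-product self-attention: project the inputs to queries, keys, and values, score every input against every other, softmax the scores into weights, and take a weighted sum of the values. A minimal numpy sketch (the projection matrices `Wq`, `Wk`, `Wv` and the shapes are illustrative, not from any particular library):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (n, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv           # project inputs to queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])    # relevance of every input to every other
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax: weights sum to 1
    return weights @ V                         # output = weighted sum of values

# Toy example: 4 inputs of dimension 8
rng = np.random.default_rng(0)
n, d = 4, 8
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # one d-dimensional output per input: (4, 8)
```

Each output row mixes information from all four inputs, with mixing proportions given by the softmaxed scores.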

Overview

Illustrated: Self-Attention
Step-by-step guide to self-attention with illustrations and code.
self-attention attention pytorch transformers
Attention? Attention!
In this post, we look into how attention was invented, along with various attention mechanisms and models, such as the Transformer and SNAIL.
attention self-attention pointer-network recurrent-neural-networks

Tutorials

Attention Mechanism
Main concepts behind Attention, including an implementation of a sequence-to-sequence Attention model, followed by the application of Attention in ...
attention self-attention tutorial article

Libraries

BertViz
Tool for visualizing attention in Transformer models (BERT, GPT-2, ALBERT, XLNet, RoBERTa, CTRL, etc.)
interpretability visualization bert attention