The Transformer … “Explained”?
An intuitive explanation of the Transformer by motivating it through the lens of CNNs, RNNs, etc.
transformers natural-language-processing article convolutional-neural-networks
A Visual Guide to Recurrent Layers in Keras
Understand how to use Recurrent Layers like RNN, GRU and LSTM in Keras with diagrams.
recurrent-neural-networks lstm keras tensorflow
Introduction to Neural Network Models of Cognition - Online Book
Online interactive book introducing the history, theory, and math of Neural Network Models with Python, from a cognitive science perspective.
neural-networks convolutional-neural-networks recurrent-neural-networks deep-learning
The Sorcerer’s Apprentice Guide to Training LSTMs
Tricks of the trade for training Long Short-Term Memory networks.
recurrent-neural-networks lstm tips article
From GRU to Transformer
How recurrent units and self-attention are related to each other.
self-attention recurrent-neural-networks gated-recurrent-units transformers
Neural ODE Explained
Explains "Neural Ordinary Differential Equations", a very interesting idea that came out of NIPS 2018.
recurrent-neural-networks differential-equation neural-ode ordinary-differential-equations
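The core idea behind Neural ODEs is that a residual layer's update `h + f(h)` is one Euler step of a continuous trajectory `dh/dt = f(h, t)`, so the "network" can be evaluated by an ODE solver. A minimal sketch of that view, assuming a toy one-dimensional dynamics function with illustrative weights (not the paper's implementation):

```python
import math

def f(h, t, theta):
    # Toy learned dynamics: a single tanh "layer" with weight w and bias b.
    w, b = theta
    return math.tanh(w * h + b)

def odeint_euler(f, h0, t0, t1, theta, steps=100):
    """Integrate dh/dt = f(h, t, theta) from t0 to t1 with fixed-step Euler.

    Each Euler step h <- h + dt * f(h, t) is the continuous analogue of a
    residual block h <- h + f(h); more steps = a "deeper" network.
    """
    h, t = h0, t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * f(h, t, theta)
        t += dt
    return h

# Illustrative parameters (assumed, not trained).
h1 = odeint_euler(f, h0=0.0, t0=0.0, t1=1.0, theta=(1.0, 0.5))
print(h1)
```

In the actual paper a production solver (e.g. adaptive Runge-Kutta) replaces the fixed-step Euler loop, and gradients flow through the solve via the adjoint method.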
C++ Implementation of PyTorch Tutorials for Everyone
This repository provides tutorial code in C++ to learn PyTorch by building CNNs, RNNs, etc. Tutorials are divided into three sections based on complexity.
pytorch c++ torch torchscript
Attention? Attention!
In this post, we look at how attention was invented, along with various attention mechanisms and models, such as the Transformer and SNAIL.
attention self-attention pointer-network recurrent-neural-networks
Recurrent Neural Networks: building GRU cells VS LSTM cells
What are the advantages of RNNs over transformers? When to use GRUs over LSTMs? What do the equations of a GRU really mean? How to build a GRU cell in ...
recurrent-neural-networks deep-learning machine-learning sequence-to-sequence
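The GRU equations the entry alludes to can be made concrete with a scalar cell in plain Python. This is a hedged sketch, not the article's code; the weights are illustrative assumptions, and note that write-ups differ on which of `h_prev` and `h_tilde` gets multiplied by `z`:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_cell(x, h_prev, params):
    """One GRU step for scalar input x and scalar hidden state h_prev.

    z       = sigmoid(Wz*x + Uz*h + bz)        # update gate
    r       = sigmoid(Wr*x + Ur*h + br)        # reset gate
    h_tilde = tanh(Wh*x + Uh*(r*h) + bh)       # candidate state
    h_new   = (1 - z)*h_prev + z*h_tilde       # gated interpolation
    """
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz * x + Uz * h_prev + bz)
    r = sigmoid(Wr * x + Ur * h_prev + br)
    h_tilde = math.tanh(Wh * x + Uh * (r * h_prev) + bh)
    return (1 - z) * h_prev + z * h_tilde

# Illustrative (untrained) weights: all W and U tied to 0.5, biases 0.
params = (0.5, 0.5, 0.0) * 3
h = 0.0
for x in [1.0, -1.0, 0.5]:
    h = gru_cell(x, h, params)
print(h)
```

Since `h_new` is a convex combination of `h_prev` and the tanh-bounded candidate, the hidden state stays in (-1, 1); the LSTM differs mainly in carrying a separate cell state with independent input/forget/output gates.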